When Robots Attack: A Look At 21st Century Warriors

P.W. Singer, author of the new book Wired For War, is concerned about how battlefield robots are changing IT perspectives.
The connection between society at large and war is complicated by video. Robots record video of their actions, and thousands of hours of that footage find their way to YouTube. That has the potential to be a benefit, because it creates an information channel between the military on the battlefield and the civilian population at home that's unmediated by the news media. On the other hand, it has the potential to widen the disconnect, as people at home view the videos as entertainment rather than as images of real people suffering and dying, Singer said.

Singer described one e-mail with the subject line "Watch this" and an attached video of a Predator drone strike, showing an explosion and bodies flying through the air, set to the music of the Sugar Ray song "I Just Want To Fly." "People are turning war into a joke," he said.

But is abuse of combat robots inevitable?

"We can't resist ourselves. We always open Pandora's box; that's the nature of science and technology itself," Singer said. Fortunately, the human race has proven itself able to restrict harmful military technology, like poison gas, chemical and biological warfare, and nuclear weapons. "The darker side is we usually don't get our butts into action until the bad thing happens. We don't have international law until we've had the Thirty Years War, we don't get the Geneva Conventions until we have the Holocaust, we don't get international land mine conventions until we've buried 25 million land mines under the Earth."

And even when the overwhelming majority of the world refrains from using terrible weapons, the lunatic fringe remains a threat.

So what about the question we started this article with? Will robots develop strong AI -- artificial intelligence at the human level or better -- and rise up to kill or enslave us all?

"The people that I spoke with in the field did take very seriously the idea that we would one day achieve 'strong AI' and that it would represent a break point in history, a 'singularity,' akin to the printing press or atomic bomb, where the old rules have to be re-evaluated and new questions about what is possible and proper have to be asked," Singer said in an e-mail.

And it wasn't just visionaries like Raymond Kurzweil who took the possibility of strong AI seriously.

"I even had a special operations officer just back from hunting terrorists in Iraq talk about this in an interview," Singer said.

But we can't predict what intelligent robots' relationship to the human race might be. They might be a threat, like the Terminator, but not necessarily. Some say robots would need a survival instinct to threaten human beings, and researchers are intentionally leaving that instinct out of their machines. Others say artificial intelligences might prove more moral than human beings, more like Mr. Data from Star Trek than like the Terminator, Singer said.

"Others joke that just about the time the AI is ready to take over, their Microsoft Vista will crash," Singer said.

He added, "My sense is that there are certainly questions of ethics and control, but they are already happening now, well before we get to strong AI. That is, there are enough questions that arise with the predators and packbots of today, that we don't need to jump to the future to know that something both interesting and a bit scary is going on."