'Wired For War' Explores Robots On The Battlefield
TERRY GROSS, host:
This is Fresh Air. I'm Terry Gross. We're entering the era of robots at war, robots with names like Predator, Global Hawk, TALON, SWORDS and PackBot. The American military is getting ready for a battlefield where it sends out fewer humans and more robots. Unmanned planes over Afghanistan and Iraq are already being piloted by Air Force personnel located at bases just outside of Las Vegas.
My guest, P.W. Singer, is the author of the new book, "Wired for War: The Robotics Revolution and Conflict in the 21st Century." It's about these new robotic weapons and how they're changing warfare as well as what it means to be a soldier, and what new political, legal and ethical problems they're presenting.
Singer is a senior fellow at the Brookings Institution, where he directs the 21st Century Defense Initiative. His previous books were about child soldiers and how private military contractors have changed warfare. He was the coordinator of the Obama campaign's defense policy task force.
P.W. Singer, welcome back to Fresh Air. What robotics are we using now in Iraq and Afghanistan? Let's start with the Predators and what they do.
Mr. P.W. SINGER (Author, "Wired for War: The Robotics Revolution and Conflict in the 21st Century"; Senior Fellow, Brookings Institution): Well, I think it's interesting to pull back and talk about how many of these there are and then focus on the systems themselves. So the Predator is this large drone. It's about the size of a Cessna plane, and it started out just being used for reconnaissance - that is, almost like a spy plane hunting down folks. And then they said, you know what, why don't we arm it? And now it's being used in hunter-killer roles.
Now, at the start of the Iraq war, we had just a handful of these machines in the air. We now have over 5,300 drones of all sizes in the air, including the Predators. But the same thing is happening on the ground, where we went in with zero ground robotics, and now we're up to 12,000.
GROSS: So what do these other drones do?
MR. SINGER: A wide variety of roles. So for example, on the ground, a particular one is called PackBot. PackBot is interesting because it's made by the same company that makes the Roomba vacuum cleaner, the little robot vacuum cleaner a lot of people have in their houses. The company behind it is called iRobot, after the Asimov book and not-so-great Will Smith movie.
And the PackBot started out, again, just being used for reconnaissance, go out and look and see. And then they figured out, you know what, because of this we can use it for a lot of different roles. So they've been adding different systems on top of it. Its particular role in Iraq, where it's been incredibly valuable, has been the counter-IED role. That is, it's going out there and taking on the jobs that military bomb squads would do. That is, it goes out and helps hunt down and then defuse these homemade bombs that the insurgents have been using against us out there.
GROSS: Then there's something that you describe as being like R2D2 from "Star Wars."
MR. SINGER: R2D2 is the C-RAM. It's basically an automated machine gun that shoots down incoming rounds - it's called the counter rocket, artillery and mortar system. That is, they've mounted this machine gun that looks a little bit like R2D2, a large version of R2D2. And when mortars or artillery rounds are coming into things like the Green Zone in Baghdad, they're coming in too quickly for a human to react to. At best, humans, you know, can get to mid-curse word and it's too late. R2D2 - and that's what the soldiers call it because it looks like that - can shoot down those rockets or mortar rounds that are coming in, shoot them down in the air. And again, it's been incredibly valuable and saved lots of lives.
Now, the funny story about R2D2 is that the first time that they used it, they actually didn't have the software programming right, and it accidentally targeted an American helicopter. And so they've since had to fix it. The helicopter didn't get shot down, everything worked out, but it points to one of these wrinkles that happens as you introduce new technologies into war.
GROSS: We're also using robotics in homeland security now. What are we using in the U.S.?
MR. SINGER: As these systems are taking off and the military is proving the uses for them, the war at home, so to speak, is picking them up too. So for example, the Homeland Security Department saw what they were doing with these Predator drones minding, for example, the border between Afghanistan and Pakistan, and said, hey, why don't we use that here? Now, the original rationale for it was supposedly counterterrorism, but so far it's mainly been used against a different kind of homeland security threat, as some people would put it: it's being used to track down illegal immigrants at border crossings, drug dealers and the like. One of the Predators has been involved in over 30 different drug busts, capturing, you know, several thousand pounds of marijuana, for example.
Another part of this that I find particularly fascinating is that we've got to remember that these systems, while they're incredibly sophisticated, can be bought by almost anyone. And so, for example, along the border between the U.S. and Mexico, you've also got these - you know, some people call them militia groups, some people call them vigilante groups. They're basically groups of folks that have gotten together and are trying to do their own border security. And one of these, for example, went out and bought some drones, and they've been using them to monitor the border. It's a private organization that's doing this, and it, you know, raises a lot of questions.
GROSS: I want to get to some of the questions that your book raises about robotics technology. And one of the really important ones has to do with how it's changing the nature of war to have people who are controlling drones, for example, from computer screens within the United States. So they're halfway around the world from where the planes are actually flying and from where the damage is being done. You write that drones flying over Iraq and Afghanistan are flown by pilots sitting in Nevada, and it's like a video game, and this is bringing new psychological twists to war.
Before we talk about some of those new psychological twists, I want to play a very brief excerpt of an interview I did with Marc Garlasco, who was the chief of high-value targeting during the early phase of the Iraq war, but soon after, he left the Pentagon to join Human Rights Watch and become their senior military expert to help prevent collateral damage from bombs. But at the Pentagon, he led a cell of people looking for Saddam Hussein and other people in the deck of cards so they could be targeted and killed. And he was targeting Chemical Ali, Saddam's cousin, who had been involved in gassing the Kurds in 1988.
The Pentagon got information that Chemical Ali was going to be in a certain house in a residential area, so Garlasco's cell put together the target package and then watched the bomb attack in real time on the computer screen in the targeting cell. And here's his description of what happened and what he saw on his computer screen starting with the two bombs that missed the targeted house.
(Soundbite of interview)
Mr. MARC GARLASCO (Former Pentagon Military Analyst; Senior Military Analyst, Human Rights Watch): You know, there were two 500-pound bombs that went down on it. First one went down about three blocks away, and, oh, we were so angry. You know, how does a laser-guided bomb fall three blocks away? And - and we were very frustrated with that, but moments later, the second weapon came in, and I'll never forget. I was watching this guy. He was walking outside the building, and we were saying buddy, you are in the wrong place at the wrong time.
And moments later, poof, just white - the whole screen goes white because we're watching it in infrared. And then suddenly you can see the pictures start to coalesce. There's this huge explosion of fire, and we can see this rag doll, this dark rag-doll person just coming down to earth. The legs - I'll never forget it, were just flailing in the air, and he came down and hit on the ground and bounced. And you know, I'll be honest with you, we thought that we had killed Chemical Ali, and we cheered and patted ourselves on the back. And we even bet breakfast on how many times that person ended up bouncing.
And two weeks later, I was standing in that crater working for Human Rights Watch, and I was facing this - this old Iraqi with this leathery skin and the - you know, the thousand-mile stare, no tears, and just telling me about how his family had been wiped out and the family next door had been wiped out. Seventeen civilians were killed. We never got Chemical Ali. Eventually, he was captured and today, as we're doing this interview, Iraq is preparing to execute him, and so that was a strike that went very, very badly.
GROSS: Now as I mentioned, Marc Garlasco is now a human rights activist. P.W. Singer, how does that description get to some of the issues that you talk about in your book about how robotics are changing what it means for people fighting the war through video screens and through robotics in the United States?
MR. SINGER: I think it really captures it well, and actually, Marc is one of the people whose stories the book tells - what it's like to be working in a human rights organization at a time when we're bringing machines into war. And one of the things he talks about is the challenge of trying to apply "Star Trek" technology to international law that comes out of the 1940s. And there's all sorts of ripple effects that are happening now when you go out and speak to these folks. They talk about the ripple effects and how they're playing out in everything from the laws of war, as Marc talked about.
And another illustration of that is that on three separate occasions we've thought we got bin Laden with one of these Predator drone strikes, and it turned out not to be the case. And we have to figure out, what are the legal consequences of what you could call unmanned slaughter? And who do you hold responsible? And this is with systems that don't have a lot of autonomy in them yet. The stuff that's coming soon, or is already in the prototype stage, really does push it into that science fiction realm.
But when you think about the ripple effects more broadly, what we found is that there's actually a whole lot of human psychology to the impact of robots on war. And one, for example, is the experience of the soldiers who are truly at war but not physically at war. That is, when we say, go to war, we've got a new twist on that meaning. I term it cubicle warriors. That is, these are folks who are working in office cubicles or something like that, but they're juggling the psychological consequences of being at war but at home at the same time.
There's a great quote from a Predator pilot who I interviewed, and he said it this way: You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants. And then you get in the car and you drive home, and within 20 minutes, you are sitting at the dinner table talking to your kids about their homework.
And one of the things coming out of this is that we're actually finding that the drone pilots, because of this tough psychological juggling they're having to do, actually have higher levels of PTSD - post-traumatic stress disorder - than those who are physically serving in the combat zone.
Another example of the ripple effects is: what does this do when you can watch it play out? In that interview, you heard someone talking about the impact of being able to see a strike. Well, put that in our strange world, and one of the things we're finding is the rise of what I call YouTube War. That is, the Iraq war, because of all these systems, is the first one where you can watch but you don't have to be there. And these machines see all. And we're taking these clips and watching from afar, but we're also emailing them around.
We found over 7,000 different clips of combat footage in Iraq, and the soldiers actually call them war porn. And the worry of it is that while it seems to connect people to war - they get to see what's happening - it actually widens the gap; that is, it creates a further distance. They watch more, but they experience less.
GROSS: My guest is P.W. Singer, author of the new book, "Wired for War." He's a senior fellow at the Brookings Institution. We'll talk more after a break. This is Fresh Air.
(Soundbite of music)
GROSS: If you're just joining us, my guest is P.W. Singer. He's the author of the new book, "Wired for War: The Robotics Revolution and Conflict in the 21st Century," and he's a senior fellow at the Brookings Institution.
Many people have made the observation that with more robotics, war becomes like a video game. But you report about how some of the robotics are intentionally designed like video games to take advantage of skills that young soldiers have as a result of having played a lot of video games. Would you describe how some war robotics are designed intentionally like video games?
MR. SINGER: Well, it's interesting. The military quickly figured out that there were two advantages of doing this. For example, the hand-held controllers that most of the ground robotics systems use are modeled after the Xbox or the PlayStation. And the reason was two-fold. One, they figured out, OK, these game companies have spent millions of dollars designing controllers that are, you know, perfectly suited - where your fingers should go and the like. If they did all the research, why don't we copy that?
The second is they figured out, hold it, video game companies have actually trained up our forces for us already. That is, you know, we're getting kids coming in who've spent the last several years working with these little video game controllers, so why not free-ride off of that as well?
And the result of it is, because of these systems and because they are trained up that way, it's another kind of ripple effect we are seeing: the demographics of war are even being reshaped. That is, one of the people that we interviewed was a 19-year-old high school dropout. He's an army specialist. He's actually, by some accounts, the best drone pilot in the entire force, and it's in part because of video games.
And it's an interesting story because he originally wanted to join the army to be a helicopter mechanic, but because he had failed his English class, he wasn't qualified for that, and instead they said, hey, do you want to be a drone pilot? And he's turned out to be spectacular at it. They sent him off to Iraq, and then he was so good that they brought him back to be an instructor in the training academy. And again, this is someone who's not even an officer yet, and he's in the army.
Now take this ripple effect further. This is not a story that people in the Air Force like to hear, and it's spooking out a lot of people - for example, you know, F-15 pilots who spent years and years training, went to college, are officers - and when they hear, hold it, this 19-year-old video gamer is not just better at these systems than me but is actually out there doing more fighting than me, what's going on here?
GROSS: Well, how much autonomy do you think robotics in warfare will ultimately have? I mean, right now they're controlled by humans, even if the humans are in remote locations - the humans might be in an office in the United States while the Predator is flying in Afghanistan or Iraq. But might that change? Might the robotics or warbots have more autonomy as time goes on?
MR. SINGER: I actually think it's one of the parts that's already starting to change. We just don't talk about it. I joke that this issue is kind of like Lord Voldemort in the Harry Potter series. It's the topic that must not be discussed. And so when you go around asking people about this in the field, be it four-star generals or robotics scientists or, you know, the 19-year-old drone pilots, et cetera, they always tend to speak in absolutes. They always say something along the lines of, people will always want humans in the loop. That's actually a quote from, for example, a Bush administration advisor. Or, you know, an Air Force captain said, we will always want to have humans in there. We need intrepid souls to fling their bodies across the sky. That's their quote.
There are three things that really bring those beliefs into question - that absolute, that always in the loop. The first is, the reality is, humans have already been moving out of the loop well before we got to robotics. We've just kept redefining the loop. That is, a system like the Aegis, which is the air defense system on U.S. Navy ships, already has a series of modes in which the system can take over, for example, if the humans are killed. Even when the humans are involved, the system is working so fast that the human's role in it is one of veto power. That is, they can only shut it down. And when we look at incident after incident, they're afraid to shut it down because they trust the machine's judgment more than their own.
This is, for example, how a B-52 navigator described his experience of bombing Iraq. Quote, "The navigation computer opened the bomb doors and dropped the weapons into the dark." That was what it felt like to him to drop a bomb. That's where we're already at right now, without robotics moving into greater autonomy. But the next part of it is that there are all sorts of demands, whether we want to openly talk about them or not, that are leading us to build more and more autonomy into our systems. For example, if you always have just one human in control, you don't get any personnel savings from it. And that's why, for example, the Army's doing research on how to have multiple robots, as many as 10 at a time, controlled by one human. Well, that means you're giving them more autonomy.
Then, of course, it's war. The enemy gets a vote. So they go, well, what if the enemy cuts the line of communication? Well, we need the robot to be able to do certain parts of the mission on its own. And so, you know, we go a little bit further down the slippery slope there.
GROSS: So as these robotic warriors become more autonomous, what are some of the ethical questions and legal questions that it raises for you?
MR. SINGER: Oh, gosh. It's a whole series of issues. I mean, I think on the ethical side, you have to think about it in terms of what should be the codes of behavior that first guide the people that make the robotics, and then secondly, what it is you want to put in that system itself. On guiding the people that make the robotics: the robotics field is a very new field, and a fear that a lot of people in it had when I went around interviewing these various scientists is that it doesn't have a built-in code of ethics the way, for example, medicine does. And they're scared that they're going to end up almost like the atomic scientists - that they're going to build something remarkable, use their creativity to the fullest, and yet it's going to be like Pandora's box. And it's going to be after the fact, when they start to wrestle with the consequences, that it'll be too late.
That's on the scientists' side of it. There are also questions about who should have control of them, what kinds of things you put onto these systems or not, how much autonomy they should have, and this is where the, you know, the science fiction of Asimov's laws breaks down when it comes to reality. So for example, one of Asimov's codes was that robots can't harm humans. Well, guess what? If you're building robots for war, that kind of falls by the wayside. Or another of Asimov's codes was that, you know, they should be responsive to any human. Well, if you're building military robots, you don't want a robot that any old, you know, al-Qaeda terrorist can walk up to and say, robot, shut down. So there are, you know, some real consequences when you get there.
GROSS: P.W. Singer will be back in the second half of the show. His new book is called "Wired for War: The Robotics Revolution and Conflict in the 21st Century." I'm Terry Gross, and this is Fresh Air.
(Soundbite of music)
GROSS: This is Fresh Air. I'm Terry Gross, back with P.W. Singer, author of the new book, "Wired for War." It's about how robotics are changing the way war is waged, as well as what it means to be a soldier. Singer directs the 21st Century Defense Initiative at the Brookings Institution and was the coordinator of the Obama campaign's defense policy task force.
There are real advantages of robots on the battlefield. For example, you point out, if chemical or biological weapons are being used, they won't affect robots, but they would kill human soldiers. What are some of the other advantages of using robots on the battlefield?
Mr. SINGER: The way the scientists talk about them is the three D's. That is, robots can take on jobs that are either dull, dirty or dangerous. And that's their big advantage from a technical standpoint. Dull, you can't keep your eyes open for 30 hours. A robot can. And so there are certain physical limitations - a pilot of a spy plane, for example, is limited not only by keeping their eyes open but also by just being able to stay in that plane for hours and hours and hours on end. Even when you're doing stressful jobs, like, for example, defusing a bomb, you have to pause and recollect yourself. A robot doesn't have to.
The dirty, you laid it out. There are certain environments that are very tough for humans to operate in but that machines can handle. So, the chemical weapons environment, but also things like, you know, you can't see in the dark. A robot can.
The dangerous part, I think, is the biggest driver. That is, the motivation as far as we've seen it on the Pentagon side is that you can use machines to substitute for human risk. You can send them out for jobs that would otherwise mean sending in a human who might die in that role, and so the cost if you send a machine is a lot less. And that's whether it's something like defusing bombs or going into a house where you don't know if an insurgent's on the other side of the door or not. If you send that machine in first, the belief is that you're avoiding some of the risk.
Now, long term, you pull back and kind of think about this more broadly: what does that do, for example, to your foreign policy if you have a shift where you are able to undertake war without human consequence?
GROSS: The more we use robotics in war, the more, as you point out, it changes the nature of what it means to be a soldier. And we've always equated being a good soldier with courage and with risking your life on the battlefield. But it sounds like a lot of soldiers are becoming basically computer nerds. They're people who really know their way around a computer to remotely operate robots or to program robots. So what are some of the ways you see robotics as changing the nature of what it means to be a soldier?
Mr. SINGER: Well, it's interesting. You can think about this almost in terms of the demographics of war. There are certain physical attributes that may not be so necessary to be a soldier. One of the people we interviewed put it this way: having a strong bladder and a big butt may be more useful physical attributes than being able to do 100 pushups. There's also the integration of robotics into our own bodies. One of the really heartening stories out of robotics research is using robotics to replace limbs that soldiers have lost to these IEDs, and about 40 percent of these wounded troops have actually returned to military service even after they've lost an arm or a leg.
Now, part of this demographic shift, though, means that you're bringing in younger and younger troops, or you've got older troops who are staying active, but it's not happening for everyone. And that's one of the dividing lines that's starting to appear within the military. That is, you've got some soldiers who are definitely fighting from afar, but you've still got a large number who are fighting on the scene, and the tensions between them are huge.
GROSS: What are the tensions?
Mr. SINGER: Well, one of the folks that I met with was a special operations officer, and he was one of the folks that goes out and does the raids on terrorist sites, hunting down terrorists. And he talked about this incident in Afghanistan where his team was on the ground, and there was a Predator drone above him that was being flown by someone 7,000 miles away in Nevada. And the drone had to be pulled out because of - as he described it, a bogus weather call. And basically, he was still upset about it two years later. And what was really driving him was a couple of things. One, that they pulled out the machine even though it didn't have a human inside, and in his mind, they valued the machine more than his men on the ground.
But the second part of it that really angered him was that the people who were making this decision were doing it not in the midst of war but far away. And as he kind of jokingly put it, they probably had to take their kid to soccer practice or something like that. That's almost certainly not true; he was just upset about it. But it points to a tension there, in that he is truly physically at war while others are virtually at war, and even though they are on the same side, they don't feel like they're in the same space. And he actually talked about having greater respect for Zarqawi - the terrorist he was hunting down on another mission - greater respect for Zarqawi, even though he knew he was a terrorist, than for the people who were flying the drones. And it was because Zarqawi was in that space, and it's part of that definition of soldiers: putting yourself at risk. Well, what happens when you have riskless war?
GROSS: My guest is P.W. Singer, author of the new book, "Wired for War." He's a senior fellow at the Brookings Institution. We'll talk more after a break. This is Fresh Air.
(Soundbite of music)
GROSS: My guest is P.W. Singer, and we're talking about his new book, "Wired for War: The Robotics Revolution and Conflict in the 21st Century." And he's a senior fellow at the Brookings Institution.
We've been talking about what it means for the United States to have robotics in war, but if we have robotics, it means our enemies either have or will have robotics soon. So let's look at how, for instance, terrorists are using or can in the future use robotics for their ends. Are they already doing it?
Mr. SINGER: The fact of this revolution in war and technology is that we need to pull back and ask, hold it, do you always have a permanent first-mover advantage? And that's never true in war - look what happened with gunpowder or tanks. The first people to use them weren't the ones that ended up winning in the end. And it's the same thing with technology. I mean, you know, we don't use Wang computers right now. We don't still play video games on Atari. And so when it comes to war and robotics, we've got to be concerned by the fact that, for example, 42 other countries have military robotics programs, including all our, you know, potential adversaries at some point in the future.
But the twist of robotics is that it's not a technology like atomic energy, where it takes a huge structure to put it together. This revolution is going to be open source, and we've already seen all sorts of interest or use by non-state actors. For example, when Israel went to war with Hezbollah, it was a war between a state and a non-state organization, but it was also the first war where both sides used robotics against each other.
But there have been other things. For example, there's a jihadi Web site out there right now where you can remotely detonate an IED in Iraq while sitting at your home computer. Or one of the other folks that I interviewed was a robot scientist who consults for the Pentagon, and he went into a pretty detailed scenario where with just $50,000, as he put it, he could shut down Manhattan.
But the big trend when it comes to terrorism, in my mind, is two things. One, it reinforces the empowerment of individuals versus states and organizations. It's something that's already going on, but with these new technologies, it takes it further. And the second is that robotics takes the self-sacrifice out of suicide attacks. That is, you don't have to promise 72 virgins to a robot to convince it to blow itself up.
GROSS: I think most of us were introduced to the idea of robotics and war with the smart weapons of the Gulf War in 1991. And it just seemed amazing. These, like, weapons were programmed through global positioning systems to find their targets and go right to them, and it just seemed amazing to me and I'm sure to a lot of other people that this kind of thing was even conceivable. And now, so many of us have GPS systems in our cars or even on our cell phones. So that kind of amazing system has been personalized. We're not using it for bombs but we're still using the technology.
And at the same time, I have to say, as amazing as it is, my own GPS system has led me astray so many times. It has me turning onto one-way streets in the wrong direction. It has me turning left where it says No Left Turn Allowed. I once thought I had turned it off, and I got to my hotel room - I have a portable one - and it started telling me to turn right in my hotel room.
(Soundbite of laughter)
GROSS: And it just made me think, are the smart bombs having these problems too?
Mr. SINGER: Almost definitely. I mean, and the scary thing is the consequences are much bigger. One robot company executive described these as "oops moments," when your technology goes awry. And you know, what is an oops moment in the military robots field? Well, funny moments are, for example, when they pulled out a prototype of one that was armed with a machine gun, and during a demonstration for a bunch of VIPs, it actually pointed the machine gun at the VIPs.
(Soundbite of laughter)
Mr. SINGER: You know, it's almost like that scene out of the "RoboCop" movie. Fortunately, it wasn't loaded and no one got hurt. That's a funny one. A sad one is when, in South Africa, they had a training exercise with a system that was actually loaded, and it had a quote, "software glitch." And we've all had software glitches. Well, in this software glitch, the system turned itself on and started firing, and nine people were killed. Oops moments are going to matter.
And there's a great amount of excitement over these technologies, over science fiction coming true. And the technologists talk about things like Moore's Law - you know, the power in our computers is effectively doubling every two years, and isn't that going to create some remarkable things? Well, it is. But there's another law that happens when you mix humans and technology, and that's Murphy's Law - you know, anything that can go wrong will. And the concern here with these systems is that when you take that into war, the consequences are huge, but we've got this compression of time happening right now where we have a lot less time to react. And so these kinds of discussions of things that really seem like science fiction - we've got to be serious and talk about them right now.
We're using thousands of these systems already in Iraq, and when you talk with, for example, Air Force three-star generals, they say, well, you know what? Pretty soon we're going to be using tens of thousands of them. But then when you go out and meet with, you know, people at the Red Cross, and you say, well, what are the laws for these? Where do we stand on international law? And they say, well, I don't know. That's not a good place to be in.
GROSS: Yeah. And as you point out, laws and human rights codes usually lag behind changes that have been made on the battlefield.
Mr. SINGER: That's the thing that scares me. When we look at all the other new technologies and changes, you know, we humans usually wait for the bad thing to happen first before we react. And you know, we don't get the Geneva Conventions unless we have the Holocaust.
GROSS: One of the interviews that you did earlier on Fresh Air was about a book, a really good book that you wrote about private military contractors. And this was before we really learned how big a role private military contractors were going to have in war. This was before Blackwater. So watching the Blackwater story play out, watching the Bush administration kind of dismantling some of the military and relying more and more on outsourcing to private contractors, is there any turning back if we wanted to?
If the Pentagon felt that we just didn't have enough control over private military contractors and that there was too much fraud and it was ultimately more expensive, is there any way of turning back from that?
Mr. SINGER: There's always a way to turn back. It's a question of whether you're willing to pay the cost to do so. And when I say you, it's all of us collectively - the public, the military, the political leadership. And it connects to a bigger theme here. People go, you've worked on private military contractors, you've worked on child soldiers, and now you're doing a book on robots and war - how does all this tie together? And they actually do tie together in an important way, which is that we have an assumption of who fights war. And it's a man wearing a uniform, which means he's fighting on behalf of a military, and he's fighting on behalf of a nation state, and war is about politics.
Well, look at war in the 21st century. That assumption is just simply wrong. Yes, you do have men fighting in militaries. You also have women. But you also have, for example, private contractors fighting alongside them, working not for the state but for private companies. Look at who they're fighting. It's not other militaries. It's everything from terrorist groups and insurgent groups to child soldier groups. Another fundamental change.
And now we have this really big, big change happening, and that is the very assumption of a human in war is fraying, and we're using machines. And that has all sorts of consequences, and when we move down this pathway and we don't think about it, we really get into the briar patch. And we saw that, for example, well, it's very easy to hire private military contractors. Oh, we don't have the legal framework figured out for it? Well, don't worry about that. Oops, got a shooting - don't know what to do with these guys. Same thing with robotics.
But there's a bigger thing going on here, and I think it's one of the driving forces: we're using contractors as a way to avoid some of the public costs of going to war, the consequences of it. And that's actually the same thing driving the move towards using more and more unmanned systems. And so the ending point for me, what I really wonder about, is what are the consequences of this for democracy itself? That is, we're taking certain trends and moving them to their final logical ending point. A way of thinking about this is that if you don't have a draft, if you don't have declarations of war, if you don't have war bonds, for example, and now you have the knowledge that the Americans at risk are more and more just American machines, you're taking the bars to war that are already lowering, and you're taking them to the ground. You're taking them to zero.
GROSS: In thinking about how the law will be lagging behind what's actually happening with robotics on the battlefield, what are some of the consequences that we are likely to face with autonomous machines fighting our wars, and what are the legal accountability questions that we're going to have to deal with?
SINGER: I think this is a huge part that we're just now entering into. There's the issue of war crimes. Does using these machines make them more or less likely? So on one hand, many war crimes are caused by fury. That's often how massacres happen. And robots are emotionless, so you won't have things like My Lai.
But the flip side of it is, again, robots are emotionless. To a robot, that 80-year-old grandmother in a wheelchair is just a different set of zeros and ones than the tank it's targeting, and so the empathy is taken out of it. And that may also be true not just for autonomous systems but for when you've got the humans 7,000 miles away. All the studies have shown that it's a lot easier to kill when you have distance created, when you're looking at people on a screen.
The real issue of this for me is that we haven't yet created the frameworks for how you apportion accountability in this strange new world - a world where, as the guy at Human Rights Watch described it, you're trying to apply 21st-century technology to laws of war from past centuries. And the weird part of this that I found is that there's a quiet minority that says, well, you know what, there's not a problem with this.
One of the scientists I interviewed said, you know what? There's no real legal or ethical problem that I can contemplate with robots unless, you know, the machine kills the wrong people repeatedly. And then - this was such a great quote - he said, then it's just a product recall issue. Like it's just any other machine.
GROSS: Wow. Right.
SINGER: And it is a killing machine. These are systems designed for war. But war is also a place of psychology, and I think that is just a huge aspect of this that we're just now starting to understand. How does this affect the enemy's attitudes? How does it affect the public's attitudes around this? There's a belief, for example - I interviewed a Bush administration official who said, this only plays to our advantage. The thing that scares people is our technology.
But when you go out, for example, and interview people in the Middle East, it's very different. One of the people that appears in the book is an editor from Lebanon who described these technologies as just another example of the cold-hearted Americans, and how it shows that you only have to kill a few of their soldiers to defeat them, that they are cowardly for sending out machines to fight on their behalf. So you have a complete disconnect between the message you think you are sending and the message people are receiving.
GROSS: Well, P.W. Singer, I want to thank you so much for talking with us.
SINGER: Thank you.
GROSS: P.W. Singer is the author of the new book, "Wired for War," and directs the 21st Century Defense Initiative at the Brookings Institution. Coming up, Maureen Corrigan reviews a new book collecting Susan Sontag's journals and notebooks. This is Fresh Air.
Susan Sontag's Ruminations 'Reborn'
TERRY GROSS, host:
When public intellectual, essayist and novelist Susan Sontag died in 2004, she left behind some 100 notebooks that chronicled her life and thoughts since adolescence. Selections from the earliest notebooks have just been published in a collection called "Reborn." It's edited by Sontag's son, David Rieff. Book critic Maureen Corrigan has a review.
MAUREEN CORRIGAN: Maybe the unexamined life is worth living after all. I say that mostly in jest, but I don't know if I've ever even dimly entertained that thought before. The heretical notion that a very light lobotomy might not be such a bad thing came over me as I was reading "Reborn," which is the title given to Susan Sontag's newly published journals and notebooks from the period 1947 to 1963.
Sontag is resolutely hard on herself and resolutely off-putting. She never makes it seem powerful or sexy or, God forbid, fun to be smart. She talks in her journals about the passions of the body, but about the passions of the mind she's mute. Instead, the impression a reader gets from delving into her private journals is that intelligence is as much an austere duty as it is a gift.
From her teenage years onward, Sontag willed herself to carry out the Promethean task of knowing everything. Her journals are filled with lists of the books she's read or feels she needs to read. For instance, in an entry dated December 19th, 1948, when Sontag would have been 15 years old, she writes: There are so many books and plays and stories I have to read. Here are just a few. The list, which includes the works of Andre Gide, Dante, Faulkner, George Meredith, Dostoyevsky, Tasso, Shaw and O'Neill, goes on in the original notebook for more than five pages and includes over 100 titles.
The notebooks are also filled with sterner versions of Ben Franklin-like self-improvement lists. An entry for 1961 begins: One, not to repeat myself. Two, not to try to be amusing. Three, to smile less, talk less. These 16 years' worth of Sontag's journal entries cover her antsy teenage years, her escape to college at age 15, her early lesbian love affairs, her baffling sudden marriage to her college instructor, Philip Rieff, at age 17, her escape to Oxford and Paris, and eventually her divorce and landing in New York City during the late heyday of the reign of the New York intellectuals.
Taken together, these entries compose a rare female account of gargantuan ambition, self-confidence and discipline. As Sontag's son, David Rieff, who's the editor of this collection, says in an astute and melancholy preface: From her early adolescence, my mother had the sense of having special gifts and of having something to contribute. She wanted to be worthy of the writers, painters and musicians she revered.
For better or worse, Sontag never got over herself. She always seems to have believed in her own fabulousness intellectually. Sexually, it was a different matter. There are lots of painful entries here about humiliation at the hands of female lovers who've grown indifferent. There are also many ruthless remarks about marriage. The entry recording her marriage reads: I marry Philip with full consciousness and fear of my will toward self-destructiveness. Several years later, she comments: I am scared, numbed from the marital wars. Lovers fight with knives and whips, husbands and wives with poisoned marshmallows, sleeping pills and wet blankets.
As with so many other subjects in these journals, Sontag conceals even as she reveals, writing in shorthand rather than conducting a full-length assessment of herself. A reader naturally wants to know how she came to terms so early on with her lesbianism, why she married Rieff, how she made the decision to leave him and her young son to flee to Oxford. But somehow, Sontag's journals make readers feel that even to ponder these questions suggests that we're not grasping the more important subjects.
We don't much like public intellectuals in this country. We tend to think of them as elite, mostly humorless, and certainly anti-democratic. Whether the presidency of Barack Obama changes that attitude at all remains to be seen. But most certainly, Sontag's journals will just confirm the reigning prejudices against eggheads.
Sontag's journals chronicle an amazing story, the willed self-creation of an intellect. But it's a story to admire, not a story that inspires.
GROSS: Maureen Corrigan teaches literature at Georgetown University. She reviewed "Reborn: Journals and Notebooks, 1947 to 1963" by Susan Sontag, edited by her son, David Rieff.
You can download podcasts of our show on our Web site, freshair.npr.org. Fresh Air's executive producer is Danny Miller. Our engineer is Audrey Bentham. Dorothy Ferebee is our administrative assistant. Roberta Shorrock directs the show. I'm Terry Gross.
Transcripts are created on a rush deadline, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of Fresh Air interviews and reviews is the audio recording of each segment.