What’s weird about this picture right now is that there is a robot in it. (Specifically the Packbot, manufactured by iRobot of Bedford, Mass., the same company that makes the Roomba robot vacuum cleaner.) In twenty-five years, the weird thing about this picture will be that there’s a human being in it.

Singer’s book is way too long, chatty, repetitive, gung-ho, and philosophically facile, but if you scrape away all the excrescent prose, you get a complete and compelling account of how and why the military will become more and more dependent on robotics. This is certainly good news in some ways: the Packbot is already saving lives in Iraq and Afghanistan. But when you consider that 20-year-olds are sitting in trailers at Nellis AFB in Nevada, controlling Predator drones over Iraq and Afghanistan and occasionally using them to shoot Hellfire missiles at suspected terrorists, you’re right to think that a whole host of ethical, legal, and moral questions comes into play as the act of killing becomes more and more impersonal through the use of technology.

Here’s another one to chew on: DARPA is spending millions upon millions of dollars developing AI for these robots and drones that will allow them to select targets on their own, without a human being in the loop. (Believe me, the Department of Defense is not funding programs like this in order to produce robots capable of reading newspapers to the blind.) I’m generally not a big paranoiac when it comes to questions like this; as with biological and so-called “tactical” nuclear weapons, the human race usually puts the brakes on truly catastrophically stupid ideas before they’re fully realized. But some of this stuff truly scares me, because I know how much pressure politicians are under to win wars without shedding blood.