“Lethal autonomy is inevitable,” said Ronald C. Arkin, the author of “Governing Lethal Behavior in Autonomous Robots,” a study that was funded by the Army Research Office.

Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.

And if the software has a bug in it and the machine kills when it was not supposed to, would the software engineers then be charged with negligent homicide?
This country has become a lot more desensitized to war because combat now affects only a small percentage of the population. I submit that if we still had an army in which a good percentage of the combat soldiers were draftees, there would be more political ramifications to going to war, and our governing classes would not go to war so casually.
Moving toward a day when autonomous robots, robotic mercenaries if you will, do the fighting will make war even more likely. We are becoming a nation of chickenhawks.
Those who think that this is a good idea need to have their heads examined.