Action on a global scale must be taken to curb the development of military killer robots that think for themselves, a leading British expert has said.
‘Terminator’-style machines that decide how, when and whom to kill are just around the corner, warns Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield.
Far from helping to reduce casualties, their use is likely to make conflict and war more common and lead to a major escalation in the number of civilian deaths, he believes.
“I do think there should be some international discussion and arms control on these weapons but there's absolutely none,” said Prof Sharkey.
“The military have a strange view of artificial intelligence based on science fiction. The nub of it is that robots do not have the necessary discriminatory ability. They can't distinguish between combatants and civilians. It's hard enough for soldiers to do that.”
Iraq and Afghanistan have both provided ideal “showcases” for robot weapons, said Prof Sharkey.
The ‘War on Terror’ declared by President George W. Bush spurred on the development of pilotless drone aircraft deployed against insurgents. Initially used for surveillance, drones such as the Predator and the larger Reaper were later armed with bombs and missiles.
At present these weapons are still operated remotely by humans sitting in front of computer screens. RAF pilots on secondment were among the more experienced controllers used by the US military, while others had only six weeks' training, said Prof Sharkey. “If you're good at computer games, you're in,” he added.
“The next thing that's coming, and this is what really scares me, are armed autonomous robots,” Prof Sharkey told journalists in London. “The robot will do the killing itself.
“This will make decision-making faster and allow one person to control many robots. A single soldier could initiate a large-scale attack from the air and the ground.
“It could happen now; the technology's there.”