Robot warfare and autonomous weapons, the next step from unmanned drones, are already in development and will be available within the decade, said Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at the University of Sheffield. He believes the weapons are being developed in an effectively unregulated environment, with little attention paid to their moral implications or to international law.
The Stop the Killer Robots campaign will be launched in April at the House of Commons and includes many of the groups that successfully campaigned for international action against cluster bombs and landmines. The campaigners hope to secure a similar global treaty against autonomous weapons.
"These things are not science fiction; they are well into development," said Sharkey. "The research wing of the Pentagon in the US is working on the X-47B [unmanned plane] which has supersonic twists and turns with a G-force that no human being could manage, a craft which would take autonomous armed combat anywhere in the planet.
"There are a lot of people very excited about this technology, in the US, at BAE Systems, in China, Israel and Russia, very excited at what is set to become a multibillion-dollar industry. This is going to be big, big money. But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage. Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot."
Last November the international campaign group Human Rights Watch produced a 50-page report, Losing Humanity: The Case Against Killer Robots, outlining its concerns about fully autonomous weapons.
"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, arms division director at Human Rights Watch. "Human control of robotic warfare is essential to minimising civilian deaths and injuries."