
UN envoy urges end to plans for battlefield killing machines

Ethical groups join call to halt machine-soldiers that identify and kill without human input

Charlotte McDonald-Gibson
Thursday 30 May 2013 19:51 BST
Rise of the machines: Christof Heyns points to the history of drone-deployment as evidence that countries could escalate to using autonomous robots (Warner Bros.)

It’s a scenario that could have emerged from the imagination of a science fiction writer – killing machines stalking future battlefields with heat-seeking weapons so that human soldiers do not have to risk their lives.

But these machines are not confined to books and blockbuster action films. They are all very real – either already in use in conflict areas, or in development – as governments seek ways of exploiting technology to give them the edge on the battlefield. The existence of such “killer robots” is worrying Christof Heyns, the United Nations envoy on extra-judicial, summary or arbitrary executions. Presenting a report in Geneva, he called for a ban on developing robots which could identify and kill without any human input.

Mr Heyns warned that autonomous killing machines – not yet deployed in any battlefield – could blur the lines of command in war crimes cases and added that action must be taken before the technology overtakes existing legislation. “Time is of the essence. Trying to stop technology is a bit like trying to stop time itself – it moves on,” he said. His report argues that “modern technology allows increasing distance to be put between weapons users and the lethal force they project”.

That report is backed by the Campaign to Stop Killer Robots, a coalition of groups including Human Rights Watch, Amnesty International and Handicap International, which is calling for a halt in development of weapons which take the decision to shoot and kill out of human hands.

There has already been heated debate on the ethical implications of pilotless aircraft such as the Predator and Reaper drones, which are controlled from an air force base in Nevada, thousands of miles away from the mountains where they unload their ordnance.

But critics say this takes modern warfare too close to the realms of a computer game.

Ground robots currently deployed include the SGR-1, a robot fitted with a machine gun, which South Korea has installed along its border with its northern neighbour. While such machines are not quite the dead-eyed androids wandering the dystopian landscapes of Ridley Scott’s Blade Runner, or the metal killing machines of the Terminator films, they are close enough to send a shiver down many a cinemagoer’s spine.

“The biggest problem in robotics is we’ve seen too much science fiction,” said Rich Walker, managing director of the Shadow Robot Company which researches and develops robotics.

He pointed out that robots are already deployed on battlefields, performing vital tasks such as bomb disposal, and argued that some responsive robots are little different from land mines and other booby traps, which are set up by humans and respond to stimuli.

“Autonomous robots should be seen as neither a good thing nor a bad thing,” he told The Independent. “It’s the way they are deployed.”
