
Robots will 'become lethal' and leave us 'absolutely defenceless', leading professor warns

The scientific community must decide whether it supports the creation of robotic killing machines, Professor Stuart Russell has warned

Andrew Griffin
Thursday 28 May 2015 14:59 BST
The Terminator, perhaps the most famous fictional killer robot. Experts have warned autonomous killing machines could be a reality in the next 20-30 years (REX FEATURES)

The US government is developing highly-advanced killer robots and we must decide whether we support or oppose them, a leading computer scientist has said.

Lethal autonomous weapons systems, or LAWS, are being developed that could eventually become extremely powerful yet incapable of making ethical choices about who should live or die, warns Stuart Russell, a professor of computer science at the University of California, Berkeley. Writing in the journal Nature, Russell likened their power to that of nuclear weapons — and said that just as physicists eventually had to take a position on the use of that technology to kill, so should AI specialists and others.

"Autonomous weapons systems select and engage targets without human intervention; they become lethal when those targets include humans," he writes. "LAWS might include, for example, armed quadcopters that can search for and eliminate enemy combatants in a city, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions."

Russell said that all of the artificial intelligence and robotics needed to create such killing machines is already in place. "They just need to be combined," Russell notes, arguing that the same technology used in self-driving cars could easily be adapted for "urban search-and-destroy missions".

If those robots are autonomous, they could kill humans without the normal checks and balances, Russell warns. “LAWS could violate fundamental principles of human dignity by allowing machines to choose whom to kill — for example, they might be tasked to eliminate anyone exhibiting 'threatening behaviour',” he writes.

Eventually such robots are likely to become so small that humans will be unable to defend against them. Flying drones could carry "a one-gram shaped charge to puncture the human cranium", with only the limits of physics constraining what they can do, Russell warns.

“Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenceless. This is not a desirable future.”

The only way to stop such developments is for scientists to take a position, Russell warns. “Doing nothing is a vote in favour of continued development and deployment,” he says.
