AI researchers have warned MPs that the development of “superhuman” artificial intelligence risks human extinction.
The House of Commons Science and Technology Committee heard from researchers at Oxford University, who advised that AI should be regulated in the same way as nuclear weapons.
“With superhuman AI, there is a particular risk that is of a different sort of class, which is... it could kill everyone,” said doctoral student Michael Cohen.
“If you imagine training a dog with treats: it will learn to pick actions that lead to it getting treats, but if the dog finds the treat cupboard, it can get the treats itself without doing what we wanted it to do.”
It is not the first time AI scientists have warned of the existential risks posed by the technology; the latest warning echoes a thought experiment put forward by philosopher Nick Bostrom nearly 20 years ago.
The Paperclip Maximizer problem hypothesises that a super-intelligent AI would ultimately destroy humanity even if its initial goal – producing as many paperclips as possible – was not explicitly malicious.
Recent AI advances have resurfaced fears surrounding advanced artificial intelligence and how it is handled and developed, though it will take broad consensus from governments and institutions around the world to impose effective safeguards and regulation.
The researchers said the AI industry had already become a “literal arms race” as competition mounts to produce both commercial and military applications with the technology.
“I think the bleak scenario is realistic because AI is attempting to bottle what makes humans special, that has led to humans completely changing the face of the Earth,” said Michael Osborne, a professor of machine learning at the University of Oxford.
“Artificial systems could become as good at outfoxing us geopolitically as they are in the simple environments of games.
“There are some reasons for hope in that we have been pretty good at regulating the use of nuclear weapons. AI is a comparable danger to nuclear weapons.”