Mr Hawking has previously said that AI could grow so powerful it would be capable of killing us entirely unintentionally
Stephen Hawking has warned that technology needs to be controlled in order to prevent it from destroying the human race.
The world-renowned physicist, who has spoken out about the dangers of artificial intelligence in the past, believes we need to establish a way of identifying threats quickly, before they have a chance to escalate.
“Since civilisation began, aggression has been useful inasmuch as it has definite survival advantages,” he told The Times.
“It is hard-wired into our genes by Darwinian evolution. Now, however, technology has advanced at such a pace that this aggression may destroy us all by nuclear or biological war. We need to control this inherited instinct by our logic and reason.”
He suggests that “some form of world government” could be ideal for the job, though it could create problems of its own.
“But that might become a tyranny,” he added. “All this may sound a bit doom-laden but I am an optimist. I think the human race will rise to meet these challenges.”
“The real risk with AI isn't malice but competence,” Professor Hawking said. “A super intelligent AI will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble.
“You're probably not an evil ant-hater who steps on ants out of malice, but if you're in charge of a hydroelectric green energy project and there's an anthill in the region to be flooded, too bad for the ants. Let's not place humanity in the position of those ants.”
“Over time I think we will probably see a closer merger of biological intelligence and digital intelligence,” he said, suggesting that people could merge with machines in the future, in order to keep up.