'Godfather of artificial intelligence' worried about human-extinction level threats caused by AI

Geoffrey Hinton, who created a foundation technology for AI systems. (File Photo | AP)

Professor Geoffrey Hinton, regarded as the "godfather of artificial intelligence", says he is "very worried about AI taking lots of mundane jobs".

He told BBC Newsnight that, against this backdrop, a benefits reform giving fixed amounts of cash to every citizen would be needed.

“I was consulted by people in Downing Street and I advised them that universal basic income was a good idea,” he said.

Under a universal basic income, the government pays every individual a set amount regardless of their means.

He said while he felt AI would increase productivity and wealth, the money would go to the rich “and not the people whose jobs get lost and that’s going to be very bad for society”.

Hinton worked at Google until last year, when he left the tech giant so he could speak more freely about the dangers of unregulated AI.

Professor Hinton reiterated his concern that human extinction-level threats were emerging, the BBC reported.

Developments over the last year showed that governments were unwilling to rein in the military use of AI, he said, while the race to develop products quickly meant there was a risk that tech companies wouldn't "put enough effort into safety".

Professor Hinton said: "My guess is in between five and 20 years from now there's a probability of half that we'll have to confront the problem of AI trying to take over."

This would lead to an “extinction-level threat” for humans because we could have “created a form of intelligence that is just better than biological intelligence… That's very worrying for us”.

AI could “evolve”, he said, “to get the motivation to make more of itself” and could autonomously “develop a sub-goal of getting control”.

The New Indian Express
www.newindianexpress.com