Expert says AI 'could lead to literal human extinction' in jaw-dropping court testimony
The tech specialist addressed the Senate on the topic of 'Insider Perspectives on AI'

A former OpenAI board member has discussed the dangers artificial intelligence (AI) poses, warning that the developing technology could ‘lead to human extinction’.

On Tuesday (September 17), Melbourne-born researcher Helen Toner testified before the Senate Judiciary Subcommittee on Technology, Privacy, and the Law.

While testifying, the expert warned that advanced tools could have a seriously negative impact on humanity in the coming years.

An AI expert has recently addressed the Senate (Getty stock image)

Taking to X - formerly known as Twitter - afterwards to explain her thoughts on the Insider Perspectives on AI topic, Toner said: “I focused on a big disconnect I see between East Coast & West Coast conversations about AI: how seriously to take the possibility that very advanced—and possibly very dangerous—AI systems are built quite soon.”

In the hearing, the Center for Security and Emerging Technology (CSET) employee spoke about Artificial General Intelligence (AGI) and how she believes the term isn’t well defined.

“It’s generally used to mean AI systems that are roughly as smart or capable as a human, but in public and policy conversations, talk of human-level AI is either treated as science fiction or marketing hype.”

Toner went on to claim that some engineers and scientists believe they will be able to create AGI systems in the next 10 or 20 years, before saying some believe it could be as close as ‘one to three years away’.

“Many of these same people believe that if they succeed in building computers that are as smart as humans… that technology will be at a minimum extraordinarily destructive,” she continued. “And at a maximum could lead to literal human extinction.”

In follow-up posts, Toner said she didn’t find the concept of AGI ‘very helpful’ and that she believes we need to start preparing now for highly advanced AI systems becoming readily available.

Helen Toner claimed AI could eventually wipe out humans (C-Span)

“Of course, ‘start preparing now’ is not the same as ‘assume AGI by 2027 and go all out to stop it,’” she elaborated.

“Personally, I'm extremely uncertain about how we should expect AI to progress over the next 5-10 years, so hardline policies with big downsides are not appealing.”

Toner has since called on the US government to implement some ‘super basic policy measures’ that could help with existing AI harms.

“What makes me most enthusiastic about these policies is that they would give us a much better shot at being able to notice & respond to changes in the field of AI over time,” she added.

“Maybe things get scarier and we need to massively ramp up oversight—maybe they don't! That would be great.”

Interestingly, Toner isn’t the only field professional who believes AI could eventually wipe out humanity.

According to the journal Nature, at least 5 percent of surveyed AI experts claimed there was a chance that super-intelligent machines will destroy humanity.

After hearing Toner address the Senate with her thoughts, one Reddit user wrote: “I think everyone knew that for a while, and we’re just kinda banking on the fact it won’t.”

A second countered: “Many scientists believe humans will lead to human extinction.”

Someone else commented: “What about oil industry, other greenhouse gas emissions and climate change? I'm way more worried about these.”

“For sure we are going to die some day... but lets all work really hard to ensure we don't all die on the same very bad day,” remarked a fourth.

Featured Image Credit: cspan/X / Andriy Onufriyenko via Getty