James Cameron, the Oscar-winning director of the Terminator films, has long expressed concern about artificial intelligence (AI) being weaponized against humans. It isn’t just a work of fiction, he has claimed. In fact, he has gone so far as to state that the misuse of AI could literally bring about the end of the world.
“The point is that no technology has ever not been weaponized,” James Cameron said in 2023. “And do we really want to be fighting something smarter than us that isn’t us? On our own world? I don’t think so.
“AI could have taken over the world and already be manipulating it, but we just don’t know because it would have total control over all the media and everything. What better explanation is there for how absurd everything is right now? Nothing makes better sense to me.”
He also warned in 2023, “I think the weaponization of AI is the biggest danger. I think that we will get into the equivalent of a nuclear arms race with AI, and if we don’t build it, the other guys are for sure going to build it, and so then it’ll escalate. You could imagine an AI in a combat theatre, the whole thing just being fought by the computers at a speed humans can no longer intercede, and you have no ability to de-escalate.”
A ‘Terminator-style’ apocalypse
Fast forward a couple of years and James Cameron is still issuing warnings about artificial intelligence potentially taking over the world.
“I do think there’s still a danger of a Terminator-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defense counterstrike, all that stuff,” Cameron told Rolling Stone. “Because the theater of operations is so rapid, the decision windows are so fast, it would take a superintelligence to be able to process it, and maybe we’ll be smart and keep a human in the loop. But humans are fallible, and there have been a lot of mistakes made that have put us right on the brink of international incidents that could have led to nuclear war. So I don’t know.
“I feel like we’re at this cusp in human development where you’ve got the three existential threats: climate and our overall degradation of the natural world, nuclear weapons, and superintelligence. They’re all sort of manifesting and peaking at the same time. Maybe the superintelligence is the answer. I don’t know. I’m not predicting that, but it might be.”
Others have made similar warnings
James Cameron is far from the only person with a deep knowledge of AI who has issued such warnings. In 2022, a third of artificial intelligence scientists surveyed said they believe AI could cause a nuclear-level catastrophe. Oxford and Google DeepMind researchers have claimed AI will “likely” eliminate humanity. Bill Gates has expressed major concerns as well.
Let’s just hope RoboCop is around to protect us before any of that happens.