6 Unsettling Similarities Between AI and Nuclear Weapons

You might not want to read this article, but that’s exactly why you should.

Brandon Cornett
6 min read · Oct 17, 2023

--

Back in May, the nonprofit Center for AI Safety released a statement co-signed by hundreds of industry leaders, including OpenAI’s CEO Sam Altman. You’ve probably heard of it already, given the flood of media coverage it generated.

Here’s the part of their statement that sparked all the headlines:

“Mitigating the risk of extinction from A.I. should be a global priority alongside other societal-scale risks, such as pandemics and nuclear war.”

But the parallels between artificial intelligence and nuclear weapons go far beyond their being mentioned in the same press release. The two technologies have a surprising amount in common.

For instance, AI and nuclear weapons both initiated a global race toward dominance and development. They were both created by some of the smartest scientists and engineers on the planet. And they both represent a point of no return, beyond which there is no chance for reversal or dis-invention.

6 Similarities Between AI and Nuclear Weapons

I’ve cited surveys, studies and reports throughout this article. But let’s start with a basic generalization.

It’s probably safe to assume that most people alive today wish that nuclear weapons never existed. After all, the U.S. and Russian arsenals alone could wipe out every city on the map, ushering in a nuclear winter and rendering the planet unlivable. So we’d be better off without them.

My prediction is that, over the coming years, many people will develop similar views about artificial intelligence.

With that in mind, here are six unsettling yet undeniable similarities between AI and nuclear weapons:

1. They were the biggest scientific breakthroughs of their time.

Artificial intelligence and nuclear weapons rank among the most complex technological achievements of all time, especially within their respective eras.

The development of atomic bombs during World War II marked a huge scientific leap, creating a weapon with unprecedented destructive power. The development of artificial intelligence — and especially the most recent advancements — has given computers an extraordinary level of sophistication and capability.

Both technologies also required tremendous investments of time, resources and funding. At its peak, the Manhattan Project employed roughly 130,000 workers, and it had spent about $2.2 billion by the end of World War II (roughly the equivalent of $30 billion in current dollars).

OpenAI reportedly incurred more than $500 million in losses when it was developing ChatGPT. And a single Nvidia chip (the kind used in many AI applications) can cost $10,000. Not to mention the countless work hours and computing costs that go into their development.

2. They both spurred a global race for dominance.

Both artificial intelligence and nuclear weapons spurred intense global competition, especially among the most developed nations. Nobody wanted to be left behind.

Even more concerning, there could eventually be a dangerous new link between artificial intelligence and nuclear weapon systems.

According to a 2018 article from the RAND Corporation, a nonprofit global policy think tank:

“Stunning advances in AI have created machines that can learn and think, provoking a new arms race among the world’s major nuclear powers. It’s not the killer robots of Hollywood blockbusters that we need to worry about; it’s how computers might challenge the basic rules of nuclear deterrence and lead humans into making devastating decisions.”

(Note the publication date. The above-quoted article was published more than four years before ChatGPT and its rivals burst onto the scene, greatly accelerating the timetable for AI development. I would bet the authors of this article have even greater concerns today, in the wake of those advancements.)

3. They both need to be heavily regulated.

  • The Treaty on the Non-Proliferation of Nuclear Weapons (NPT)
  • Comprehensive Nuclear-Test-Ban Treaty (CTBT)
  • Treaty on the Prohibition of Nuclear Weapons (TPNW)
  • Strategic Arms Reduction Treaty (START) and New START
  • International Atomic Energy Agency (IAEA)

These are just a few of the treaties, agreements and organizations involved in regulating nuclear weapons. And there’s an obvious need for them.

As artificial intelligence becomes more and more powerful, it too will warrant oversight and regulation. And no one knows this better than the AI developers themselves. To my knowledge, AI is the only industry in history to ask for government regulations.

4. There’s no way to reverse or dis-invent them.

Nuclear weapons can be eliminated through dismantling. It's a long and complicated process, but it can be done. Unfortunately, preventing nations from developing new nuclear weapons is a tougher nut to crack, as North Korea has made obvious.

Aside from traveling back in time, the only way to eliminate all nuclear weapons and prevent their future development would be to have all nations agree on the subject. And that’s probably something that will never happen.

Can you imagine all nine nuclear nations (the U.S., Russia, China, the U.K., France, Pakistan, India, Israel and North Korea) working together in harmony to dismantle their arsenals?

The genie is out of the bottle and there’s no putting it back.

The same goes for artificial intelligence. AI has dazzled the world with its current capabilities and future potential. It’s like the invention of electricity times a thousand, and now everyone wants to get in on it. AI has created the economic and technological version of FOMO, the fear of missing out.

5. They both pose existential threats to humanity.

Both of these technologies carry existential risks for humanity. Nuclear war could wipe out our species, and AI could surpass human intelligence and threaten our way of life.

We can probably all agree that nuclear weapons pose an existential threat. It's inarguable, really. AI, on the other hand, tends to generate a lot more debate when it comes to its future dangers.

Even so, a growing chorus of AI researchers, developers and experts have expressed serious concerns about the potential risks of these systems, up to and including an extinction-level threat.

In a December 2022 article in Quanta Magazine, Santa Fe Institute professor Melanie Mitchell wrote the following:

“It’s a familiar trope in science fiction — humanity being threatened by out-of-control machines who have misinterpreted human desires. Now a not-insubstantial segment of the AI research community is deeply concerned about this kind of scenario playing out in real life.”

Elon Musk, one of the original board members for ChatGPT maker OpenAI, believes that “AI is a fundamental risk to the existence of human civilization.”

Stephen Hawking, one of the smartest humans ever to walk the planet, once said that “The development of full artificial intelligence could spell the end of the human race.”

At the Yale CEO summit held back in June, 42% of surveyed CEOs said they believed artificial intelligence has the potential to destroy humanity within the next five to 10 years.

Computer scientist Yoshua Bengio, often referred to as one of the "Godfathers of AI," recently told the Bulletin of the Atomic Scientists:

“We really want to avoid making it easy for an AI system to control the launch of nuclear weapons. AI in the military is super dangerous, even existential. We need to accelerate the international effort to ban lethal autonomous weapons.”

(As for those AI-powered autonomous weapons, they currently exist and are rapidly advancing.)

6. They both cause widespread anxiety and worry.

The very existence of nuclear weapons can cause fear, anxiety and worry among some people. Add in geopolitical tensions, war and conflict, and the dread-o-meter shoots even higher.

According to a 2022 survey conducted by the American Psychological Association and Harris Poll, the vast majority of respondents said they were worried about nuclear war. To quote their report: “69% of adults reported they are worried the invasion of Ukraine is going to lead to nuclear war, and that they fear that we are at the beginning stages of World War III.”

More recent polls have shown that a lot of people experience a similar kind of angst relating to artificial intelligence. One such survey found that nearly half of Americans are worried about AI causing the end of the human race.

To understand why so many people have existential concerns about artificial intelligence, you need look no further than this list. Nuclear weaponry and AI have fundamentally altered the world order. (Though in the case of AI, it’s still early days.) Both of these technologies have created existential fears that did not exist before their arrival.
