Who is Outrider?

Outrider believes that the global challenges we face can only be solved by working together.

Among the greatest threats to the future of humankind are nuclear weapons and global climate change. Outrider makes the bold claim that both threats can be overcome — and not just by policy makers but by people with the right tools and inspiration.

Nuclear Weapons

Can Humans Resist the Allure of Machine Speed for Nuclear Weapons?

by Natasha E. Bajema

Integrating artificial intelligence into nuclear systems is risky, and the temptation to do so will only get worse.

The Terminator Conundrum

Imagine you could get a fresh-baked pizza from your favorite Italian joint delivered in less than eight minutes. As enticing as that sounds, depending on where you live, it’s probably impossible, at least without drone delivery.

Now consider this. Experts predict countries will soon be able to deliver nuclear weapons anywhere in the world in that same amount of time. Hypersonic weapons, stealthy cruise missiles, and automated systems that use artificial intelligence (AI) will make this possible.

soldiers and politicians sit at table

MOSCOW, RUSSIA - DECEMBER 26, 2018: Russia's Special Presidential Envoy for Environmental Protection, Ecology and Transport Sergei Ivanov, Russia's Defense Minister Sergei Shoigu, Russia's President Vladimir Putin and Russia's Deputy Defense Minister, Army General Valery Gerasimov (L-R) watch the launch of Russia's Avangard hypersonic missile system via a video link from Russia's National Defense Management Center.

Mikhail Klimentyev/TASS via Getty Images

Nuclear experts warn of the risks of integrating AI-enabled systems into early warning, nuclear command and control, and delivery systems. But the temptation to do so will only increase in the future.

AI-enabled systems would offer countries that adopt them a slight time advantage over others. In a nuclear conflict, having more time to make decisions could mean the difference between a nation’s survival and its annihilation.

If one country integrates AI into its nuclear systems, others might feel they must follow. But such systems carry grave dangers: they would transfer decision-making authority from humans to machines, significantly increasing the risk of accidental use of nuclear weapons, unintended escalation, and nuclear war. What is intended to provide security and guarantee survival might end up killing us all.

 
robot in front of Big Ben

A mock "killer robot" is pictured in central London on April 23, 2013 during the launching of the Campaign to Stop "Killer Robots," which calls for the ban of lethal robot weapons that would be able to select and attack targets without any human intervention. The Campaign to Stop Killer Robots calls for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons.

Carl Court/AFP via Getty Images

Senior U.S. military leaders have referred to the dilemma of autonomous systems as the “Terminator Conundrum”. Even if countries understand the risks, the “logic” of nuclear deterrence may persuade some to go down the AI-enabled path.

The Need for Speed

Time has always been a critical element in shaping nuclear competition. During the Cold War, the U.S. Air Force boasted that Minuteman intercontinental ballistic missiles (ICBMs) could reach their target in thirty minutes or less. This was supposed to make U.S. adversaries too afraid to attack.

 
nuclear blast door with Dominos pizza box painted on it

The Minuteman Intercontinental Ballistic Missile, aptly named for its rapid strike ability, has been famously memorialized with a hand-painted Domino’s pizza box, along with the guarantee of swift delivery, on the eight-ton steel and concrete door protecting the entrance to a launch control center.

National Park Service

Before the U.S. deployed the solid-fuel Minuteman in 1962, extremely volatile liquid fuel had to be stored separately from the missiles and could only be pumped into the ICBMs shortly before launch. That minimum fueling delay of fifteen minutes seemed like an eternity when every minute cut into the U.S. ability to retaliate against a nuclear attack. The Minuteman solved this problem.

The need for speed also underpins the “logic” of nuclear deterrence. To function properly, deterrence requires that adversaries coexist in a state of “mutually assured destruction”. In other words, each side must be capable of annihilating the other in order to prevent nuclear war. Any technology that enables one side to carry out a preemptive disarming attack removes the chance for retaliation and increases the risk of nuclear war. To prevent such a scenario, time is of the essence.

Since the dawn of nuclear weapons, strategic thinkers have worried about the speed of delivery, the speed of detection and warning, the speed of decision, and the speed of launch. Some experts have recently argued for the integration of AI-enabled systems into U.S. early warning, detection, and command and control systems to achieve additional gains in speed. In the future, the President may not have enough time to retaliate prior to the first detonation of a nuclear weapon on U.S. soil. To save time, the decision to retaliate with nuclear weapons would be made by a machine, not a human. It’s reminiscent of Stanley Kubrick’s doomsday device from the classic film, Dr. Strangelove. And most of us know how that movie ended.

Gee, I wish we had one of them doomsday machines.

George C. Scott as Gen. 'Buck' Turgidson in Dr. Strangelove

Time’s Been Short for a While Now

But do these short decision times really matter? If so, why?

It’s hard to fathom that decisions made in thirty minutes are much better than those made in fifteen or eight minutes—that is, when they concern the life and death of millions of innocent civilians.

In reality, we’ve already lived with this for many decades: submarine-launched ballistic missiles (SLBMs) can reach their targets in less than eight minutes when the submarines that launch them are positioned close to an adversary’s coast. Of course, using these second-strike weapons in a first-strike attack defies the logic of nuclear deterrence. The key point is this: timeframes for nuclear decision-making have been short for a long time.

 

If this is true, why do we think there’s a new problem? Stanley Kubrick explained it best in Dr. Strangelove: the problem exists because “deterrence is the art of producing in the mind of the enemy... the fear to attack.” It matters because we fear it.

Is a Doomsday Device Really Necessary?

Today, about four hundred Minuteman III ICBMs stand ready to launch in response to a nuclear attack. Due to the inherent vulnerabilities of ICBMs, the President would face incredible pressure to “use them or lose them”. In other words, the perceived need to launch ICBMs before they are destroyed by an adversary exacerbates the decision-time crunch.

missile in silo

Overhead view of Minuteman missile in launch tube.

Mint Images via Getty Images

Wouldn’t the elimination of ICBMs get rid of this problem? In a world without ICBMs, decision-makers would no longer have to make life-or-death decisions in mere minutes.

Nuclear deterrence experts would remind us that ICBMs also help protect the U.S. from a preemptive decapitation strike. During the Cold War, a preemptive strike on just five targets within the U.S. would have jeopardized its ability to retaliate. In the future, shorter decision times will exacerbate the potential for a loss of command and control in a first strike. In such a scenario, the quick launch of ICBMs would ensure early retaliation before loss of leadership or communication.

However, in extremely compressed situations, there might not even be time for the U.S. President to reach a decision, give the retaliation order, and/or move to a safe location from which to launch a retaliatory strike.


In the past, the U.S. responded to this problem by strengthening communications and early warning systems and ensuring a massive second-strike capability. In the future, under the logic of deterrence, such situations might seem to “require” some type of doomsday device to ensure effective retaliation.

Instead of considering such extreme measures, wouldn’t it be more logical to eliminate nuclear weapons? In a world without nuclear weapons, there would no longer be a need to worry about decapitation strikes and no need to automate decisions to use nuclear weapons.

text "nuclear free now!" written on the ground in candles

Nuclear abolition rally in Hiroshima.

International Campaign to Abolish Nuclear Weapons

Nuclear deterrence experts would warn us that as long as U.S. adversaries retain their nuclear weapons, the U.S. needs a credible and effective nuclear deterrent to support its national security strategy and to prevent nuclear war. According to the logic of deterrence, eliminating nuclear weapons would actually increase the risk of their use if only one country decided to keep them.

When Deterrence Dictates Our Choices

Decades of U.S. nuclear strategy reveal an unbridled optimism in the logic of deterrence. It assumes rationality and the availability of information. It assumes accurate interpretation of signaling and calculations. It demonstrates a belief in the ability to control nuclear escalation. It takes as given the absence of panic or miscalculation on both sides in the midst of a nuclear attack. And it ignores the potential for accidental launches and false alarms.

Indeed, the logic of deterrence has been used to justify the development of Minuteman ICBMs, the production of tens of thousands of strategic nuclear warheads, the design of tactical nuclear weapons for use on the battlefield, provocative advancements such as multiple independently targetable reentry vehicles, and much more. Today, such thinking supports the estimated trillion-dollar modernization of U.S. nuclear weapons systems, the development of a low-yield warhead for submarine-launched ballistic missiles, and nuclear-capable hypersonic missiles. All of it is justified in the name of preventing nuclear war.

The U.S. Nuclear Triad

Someday, the logic of deterrence may also dictate the integration of AI-enabled systems with nuclear weapons. This will not happen overnight. Rather, it will occur gradually, under the radar and largely out of public view. We are entering an age in which AI-enabled systems can do things faster than humans. They can absorb more data, detect nuances in patterns invisible to the human eye, and make complex decisions in fractions of a second. Human decision-makers will increasingly appear to be the slowest cogs in the system. We may even begin to see human involvement in the use of nuclear weapons as a security risk. And the day may come when leaders decide to replace fearful humans with logical AI-enabled systems that operate at high speeds. The logic will be dictated by deterrence.

The Courageous Choice

Can humans resist the allure of machine speed for nuclear weapons?

It depends.

First, we must remember that the so-called “logic” of nuclear deterrence boils down to fear. And since fear is not logical, it will not be well understood by AI-enabled systems. By removing human fear, judgment, and gut instinct from nuclear decision-making, AI-enabled systems would increase the risk of false alarms, escalation, miscalculation, and unintended nuclear wars.

Second, we must work to eliminate these risks in the first place. To do this, we must pursue greater reductions in nuclear weapons, strengthen the nonproliferation norm, and keep humans in the loop of decision-making as long as nuclear weapons exist. At the height of tensions between the Soviet Union and the U.S., even President Ronald Reagan came to the courageous conclusion that eliminating nuclear weapons was the only truly logical solution to confronting their threat. We must have the courage to believe in a world without nuclear weapons and to take tangible steps toward that end. Only then will we be able to resist the allure of speed.

 

Natasha Bajema is the Founder and CEO of Nuclear Spin Cycle and the host of the Authors of Mass Destruction Podcast. Follow her on Twitter @WMDgirl.

This content was developed in collaboration with Nuclear Spin Cycle.
