Bolt Out of the Blue: Nuclear Attack Warning in the Era of Information and Cyber Warfare


Privates George E. Elliott Jr. and Joseph L. Lockard were sitting in a monitoring van as their antenna scanned for airplanes on the morning of Dec. 7, 1941. Given the 23-year state of peace, Elliott and Lockard were using their radar more for practice than true defensive awareness. As they waited for a truck to drive them to lunch, a blip appeared indicating upwards of 50 aircraft 137 miles out. “It was the largest group I had ever seen on the oscilloscope,” Lockard said later. He ran tests for faulty equipment and found nothing. He relayed the information to headquarters and was told it was a flight of friendly B-17 aircraft. Less than an hour later, 353 Imperial Japanese aircraft attacked Pearl Harbor.

The objective of a surprise attack is to stun the opponent long enough to prevent an effective defense and diminish any attempted counterstrike. History has shown that even a fully functional alerting system can be undone by doubt and delay: being caught off guard is later taken as proof the system failed, when in fact its warnings were simply not believed.

Today, a modern nuclear weapon can deliver more destructive energy than all the bombs dropped in World War II. An intercontinental missile can unleash this destruction anywhere in the world in less than 30 minutes. Add simultaneous information and cyber warfare and you’ve got the modern bolt out of the blue. The astonishment of lightning from a clear blue sky is an apt comparison to a surprise attack because it is as much about confusion as it is about surprise. The clouds from which the bolt emanates are visible only with hindsight. So extreme is the surprise nuclear first-strike scenario that it was not acknowledged publicly in past Nuclear Posture Reviews. That changed this year with the following words:

The United States will maintain a portion of its nuclear forces on alert day-to-day, and retain the option of launching those forces promptly. This posture maximizes decision time and preserves the range of U.S. response options. It also makes clear to potential adversaries that they can have no confidence in strategies intended to destroy our nuclear deterrent forces in a surprise first strike.

That’s the good news. The bad news is that contraction of the nuclear enterprise, episodic defense priorities, and the presumption of a slow build-up to war have created lasting disdain for the surprise first strike. Evidence of this disdain includes unabated complacency in operator training, lapses in important acquisition regulations, growing neglect of a major command and control system, and a stubborn unwillingness to prioritize effectiveness over secrecy. Each of these problems needs immediate repair; otherwise, deterrence against a surprise first strike won’t be what the Nuclear Posture Review intends it to be. The way forward is to train operators for actual surprise, prioritize nuclear attack warning R&D, enforce strict adherence to acquisition regulations, and let go of the outmoded reflex to classify as secret anything to do with nuclear detonation detection.

Job One

Modernizing the nation’s nuclear deterrent delivery systems, including nuclear command and control, is the Defense Department’s “top priority.” Nuclear command and control encompasses many elements, including attack warning. Attack warning relies on several sensor types: long-range radars that scan for ballistic missiles, space-based infrared satellites that track rocket plumes, and a space-based “highly survivable capability to detect, locate, characterize and report” nuclear detonations.

Missile warning is not the same as nuclear detonation detection. The U.S. Strategic Command’s requirement for conventional prompt global strike suggests that tracking inbound rockets does not necessarily signal the beginning of nuclear war. What makes a missile strike unambiguously nuclear is what happens at the end of the missile track. Today, two independent information sources using different physical principles, such as infrared and nuclear detonation sensors, are used to clarify an attack warning. A quick, bright flash at the end of an infrared track, composed of a unique mix of optical, electromagnetic pulse, and ionizing radiation signatures, makes clear that a nuclear weapon has detonated and reveals its location and destructive yield. The nuclear detonation detection system is fielded specifically for this purpose and is particularly useful for identifying a first strike in space, one intended to disable space- and land-based cyber networks with x-rays and electromagnetic pulse at the onset of war.
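To make that dual-phenomenology logic concrete, here is a minimal sketch in Python. The data structures, field names, and the 60-second and 50-kilometer thresholds are invented for illustration and have no connection to the fielded system’s actual software; the sketch only shows the idea of requiring an infrared track and a detonation report to agree in time and space before calling a strike nuclear.

```python
import math
from dataclasses import dataclass

# Hypothetical records for the end of an infrared missile track and a
# nuclear detonation report. Field names and units are illustrative only.
@dataclass
class TrackEnd:
    t: float        # seconds since some epoch when the infrared track ended
    lat: float      # degrees
    lon: float      # degrees

@dataclass
class DetonationReport:
    t: float        # seconds since the same epoch as the track data
    lat: float
    lon: float
    yield_kt: float  # estimated yield in kilotons

def surface_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def correlates(track: TrackEnd, det: DetonationReport,
               max_dt_s: float = 60.0, max_dist_km: float = 50.0) -> bool:
    """Return True if the detonation report is close enough in time and space
    to the end of the missile track to treat the strike as nuclear.
    The thresholds are invented for illustration, not operational values."""
    close_in_time = abs(det.t - track.t) <= max_dt_s
    close_in_space = surface_distance_km(track.lat, track.lon,
                                         det.lat, det.lon) <= max_dist_km
    return close_in_time and close_in_space
```

The point is simply that two independent phenomenologies must both speak before the warning becomes unambiguous; the real correlation logic is, of course, far more elaborate and classified.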

Nuclear weapon signatures are transmitted to operations centers for analysis and decision. Because the operating environment will be challenged by nuclear effects, the U.S. is equipped for trans- and post-attack operations with several types of specialized operations centers, including fixed and mobile command and control centers and airborne facilities such as the “Take Charge and Move Out” (TACAMO) airborne communications node, each manned by a specially vetted and trained staff.

Nuclear Pearl Harbor

Risk is the likelihood of an event times its impact, so even if the likelihood is small, the risk may be large. The risk of a surprise nuclear first strike derives from its potential impact: destruction of fixed nuclear counterstrike capabilities, which in theory includes all land-based ICBMs, strategic bombers that are not airborne, and targetable U.S. command and control infrastructure. The United States has sought to lower the likelihood by investing in trans- and post-attack capabilities and recently put a portion of its nuclear forces back on routine alert. These capabilities make a surprise first strike less likely to succeed and, therefore, less likely to be chosen by an enemy.
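As a back-of-the-envelope illustration of that arithmetic (every number below is invented for the example, not an estimate of any real likelihood or consequence):

```python
# Risk as expected loss: likelihood of an event times its impact.
# All numbers are illustrative placeholders, not real estimates.
p_surprise_strike = 1e-4      # assumed yearly likelihood of a surprise first strike
strike_impact = 1_000_000     # assumed impact on an arbitrary loss scale

p_cyber_outage = 0.05         # assumed yearly likelihood of a serious network outage
outage_impact = 100           # assumed impact on the same scale

risk_strike = p_surprise_strike * strike_impact   # 100.0
risk_outage = p_cyber_outage * outage_impact      # 5.0

# The far less likely event still dominates the risk calculation.
print(risk_strike, risk_outage)
```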

But the world is increasingly networked, and information and cyber operations have the potential to amplify the confusion and paralysis that would precede and accompany a surprise nuclear first strike, possibly making it a more attractive option. Information operations would seek to take advantage of a watch officer’s belief that a surprise first strike is unlikely and predilection to seek harmless explanations for warning signs. For example, an adversary might use information operations to convince U.S. operators monitoring early warning that a large, high-altitude explosion was caused by a meteorite rather than a nuclear weapon, reinforcing their inclination to explain away developments considered highly improbable, just as the radar operators did at Pearl Harbor. Operators will also be reluctant to declare an emergency if they have been burned by false alarms that, unknown to them, were contrived by the enemy.

Hostile cyber operations could take many forms as well. The enemy may seek to lock out valid users of U.S. command and control systems simply by making repeated failed attempts to gain access, or to activate embedded Trojan firmware to wreak havoc at the worst possible moment.

Motivated by changes in the threat environment, the 2018 Nuclear Posture Review addresses the surprise nuclear first-strike scenario. It does not dismiss it as a monster under the bed that we’ve outgrown, as many nonproliferation analysts would prefer. The revised stance won’t reverse decades of pervasive incredulity on its own. But the U.S. nuclear enterprise can and should deal with the deleterious effects of this disbelief, which are manifested in training, acquisition, and secrecy.

Training

Today’s nuclear operators train for a scenario that cannot be truly appreciated in advance: a real bolt-out-of-the-blue attack will create fog of war on an immense scale. Inadequate training scenarios, real-world distractions, and unanticipated interruptions will multiply the danger of unpreparedness and inaction.

Nuclear war scenarios and simulations are planned and scripted months in advance, inviting participants to adjust their schedules to facilitate exercises and to prepare for the “big game.” Although operators don’t know the exact scenario, exercises generally follow a pattern: Geopolitical friction and “saber-rattling” lead to a higher alert status, alert crews are augmented by battle staffs and planners, and as the exercise’s political scenario degenerates into turmoil, command and control crews are launched to await the inevitable exchange of exercise-dependent mushroom clouds.

To approximate a real-life attack, strategic command and control assets train using actual no-notice launches. But even these high-fidelity exercises serve more to meet benchmarks or satisfy daily training requirements than to prepare crews for genuine surprise. Many watch-standers do not believe in the bolt-out-of-the-blue scenario, treating the exercises less as practice for a genuinely plausible contingency and more as an obligation of the job. The distractions of daily training and other mission requirements often steer operators’ attention away from the reality that they are a critical link in the nuclear command and control chain. The 2007 “loose nukes” episode, in which six nuclear weapons were mistakenly loaded under a B-52 wing and flown across the country, is evidence of the loss of mission focus that allows such distractions, even in a controlled, peacetime environment. If operators and crews can’t get this sort of activity right in peacetime, imagine how bad it will get in a crisis.

Training in the nuclear command and control enterprise demands perfection and leaves no room for interpretation, as the missileer cheating scandal showed. While nuclear war is not something the U.S. can afford to get wrong, the training rarely deals with the unexpected. The combination of demanding perfection and discouraging inventiveness risks crews focusing more on training goals than on readiness. Furthermore, we have observed that rigorous training rules and the daily flight schedule can blur the line between the training world and the real world. For example, crews may think first about their crew-rest requirements, even in a real nuclear emergency.

The added confusion created by cyber or information operations, combined with natural human doubt, will delay any reaction to a surprise. An operator tasked to respond without hesitation, though versed in countless exercise procedures, cannot avoid the human reaction of asking “is this for real?” The risk is, of course, a nuclear Pearl Harbor because operators misinterpret the data or are fooled by enemy information or cyber operations.

Preparing operators for a surprise nuclear attack should be the highest training priority. But the nuclear enterprise must also fix important cultural defects in training. Give commanders discretion to go off script and actually train for surprise. Rather than focus training on preplanned and widely briefed attack scenarios, time and energy should be devoted to the unexpected. Exercise planners — such as a nuclear command and control Red Team — should enact information operations, cyber attacks, and a simulated surprise nuclear strike, either all at once or in phases, with little time for operators to plan and react. Operators who know they are in a training environment expect and anticipate such attacks; only by simulating an unexpected crisis, with no time to generate forces or plans, can system flaws be identified and corrected.

Of course, operational requirements as well as the massive nature of the command and control enterprise make this proposal easier said than done, but the importance of getting the reaction right in a real-world unexpected attack makes the challenge worthwhile. Tactical- and operational-level units run these sorts of no-notice exercises with spot-checks and lower-level training scenarios; it is time to play them out at the strategic level.

Acquisition

The Defense Department owns the nuclear detonation warning system, but it relies on the Department of Energy and its national laboratories to develop and fabricate the satellite sensors and ground stations that collect and process the multi-phenomena signals. The system is acquired differently from other major weapon systems in that there is no single integrating contractor.

In the nuclear detonation detection program, suppliers of sensors and ground equipment sometimes perform the systems engineering as well, an organizational conflict of interest forbidden by regulation. The predictable result is systems engineering that calls for more of the supplier’s own products, yielding superfluous or overly expensive capabilities that drain resources from other programs. It can also lead, unnecessarily, to requirements that several sensors report an event simultaneously, which lowers the probability of detection and puts an unhealthy emphasis on eliminating false alarms. Insiders call this “the silent sensor syndrome.” How can more sensors lead to a lower detection probability? If each sensor is allowed to report a detection independently, the probability of detection increases with every sensor added. But if ground-station software requires several sensors to agree before issuing a report, the probability of getting a report for the same event is lower. To compensate, still more sensors must be fielded. To prevent these sorts of problems, the system needs a systems engineer who is financially independent of sensor and ground equipment procurement.
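A minimal sketch of that probability argument, assuming for illustration that each sensor detects a given event independently and with the same probability (the per-sensor probability and sensor counts below are invented, not real system parameters):

```python
from math import comb

def p_any_sensor_reports(p: float, n: int) -> float:
    """Probability that at least one of n independent sensors detects the event."""
    return 1.0 - (1.0 - p) ** n

def p_k_of_n_agree(p: float, n: int, k: int) -> float:
    """Probability that at least k of n independent sensors detect the event,
    i.e., the chance a report is issued under k-sensor voting."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

p, n = 0.8, 4  # illustrative per-sensor detection probability and sensor count
print(p_any_sensor_reports(p, n))  # ~0.998 if any single sensor may report
print(p_k_of_n_agree(p, n, 3))     # ~0.819 if three of the four must agree
print(p_k_of_n_agree(p, 8, 3))     # ~0.999 only after doubling the constellation
```

Letting any one sensor trigger a report maximizes the chance of detection; requiring agreement suppresses it, and the only way to buy that probability back is to field more sensors.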

In 2001, responsibility for the acquisition and maintenance of the nuclear detonation detection system was moved from Air Force Materiel Command to Air Force Space Command to consolidate national security space infrastructure and create a “space cadre.” But under Air Force Space Command, development and maintenance of the nuclear detonation detection system has received declining priority. That’s not surprising considering, for example, how much more activity there is in missile warning than in nuclear detonation detection. Shortly after the move, the nuclear detonation detection acquisition program manager position was downgraded. Starting in 2009, Air Force Space Command sought to eliminate radiation sensors that have been part of the system since 1963. Air Force Magazine reported in 2012 that the nuclear detonation detection mission ranked 14th out of 15 space priorities.

The Defense Department should ask Congress for permission to assign responsibility for nuclear detonation warning to Air Force Global Strike Command. The justification would be the same as the one for creating Global Strike in the first place after the loose nukes episode — so that nuclear deterrence capabilities receive the right level of priority. The department should also find and address cases in which systems engineering regulations are not being followed, determine whether problems exist because of previous lapses, and reallocate appropriate funding to independent systems engineering. These reforms aren’t sexy, but they will help ensure that equipment operates properly in a surprise first strike and under the new stresses imposed by simultaneous information and cyber attacks.

Secrecy

The electromagnetic pulse, radiation, and cyber environment accompanying a surprise nuclear first strike will test the survivability of networked communications. The military will continue to operate with hardened systems, those built to withstand nuclear effects, but unhardened networks that are overwhelmed or destroyed will leave the public and private sectors cut off. Without attack and fallout information, people won’t know whether to shelter in place or leave an area to avoid exposure to radioactive fallout. Unlike navigation, weather, and tsunami information, which anyone can receive directly from satellites even when ground networks go down, nuclear detonation transmissions remain classified and encrypted.

Most people don’t know that GPS signals were originally classified. It took an international tragedy, the Soviet downing of Korean Air Lines Flight 007, before it was decided that the benefits of declassifying these signals outweighed the disadvantages. Now, virtually every cell phone is equipped with a GPS receiver because the private sector miniaturized the electronics after GPS was made public. Nuclear detonation detection sensors ride on GPS satellites and transmit in the same frequency band, so the data is likewise global and all-weather, and would be technically available to the public if it were not encrypted. During a war, the detonation data could be received and processed by cell phones and combined with weather data to inform the public about the location and heading of nuclear fallout. The data also has life-saving applications in peacetime. For example, lightning electromagnetic pulses detected from space have been used to pinpoint the location and strength of a hurricane in near real time.

Until the private sector is given the opportunity to miniaturize the electronics for nuclear detonation telemetry, as it did for GPS, the military will be stuck with a relatively small number of expensive, tractor-trailer-sized ground stations to receive the warnings. With miniaturized electronics, nuclear detonation warnings could be provided to virtually anyone, vastly improving the system’s redundancy. And with public availability, other nations would also have access, which could help prevent or mitigate a nuclear attack. For example, if India and Pakistan were on the brink of war, the information could prevent a natural phenomenon, such as a power outage caused by a solar storm, from being mistaken for the effects of a nuclear electromagnetic pulse. In a nuclear war with the United States, an enemy could use the information to assess the damage it has suffered, enabling a decision to stop fighting.

False alarms carry risks, as we were reminded recently by events in Hawaii. Nuclear detonation detection has an extremely low probability of false alarm, but it is not zero. Even so, just as with extreme weather or tsunami warnings, the government should not withhold nuclear detonation warnings for fear of false alarms.

The National Security Council should reverse the policy of classifying space-based nuclear detonation telemetry and make the data directly available to the public. It’s unorthodox, just like it was to make GPS data public, but in both cases the advantages far outweigh the disadvantages. Declassifying the data will transform the system into useful peacetime infrastructure and make it more likely to be useful during a war. More importantly, international access to the data could help prevent or shorten a nuclear conflict.

Conclusion

The presumption of a slow build-up to war, unrealistic expectations that people and hardware will always work perfectly, harmful conflicts of interest in systems engineering, and policies that fail to simultaneously protect security and enhance prosperity are barriers to improving U.S. nuclear attack warning. The escalating use of information and cyber operations amplifies the risks created by these failings. Funding is needed, of course, but first and foremost the nuclear deterrence enterprise needs to take an open and honest approach to addressing those problems that are deeply embedded, currently ignored, and immune to simple fiscal remedy.

 

Lieutenant Commander Frank Nuño is an active-duty military faculty member at the National Defense University. He has previously served at the National Security Agency and the Defense Intelligence Agency and completed multiple aviation tours in nuclear command and control.

Dr. Vaughn Standley is on loan to the National Defense University from the Department of Energy, where he manages space-based nuclear detonation detection R&D. Previously, he was a nuclear inspector with the International Atomic Energy Agency and a nuclear scientist in the U.S. Air Force.

The opinions, conclusions, and recommendations expressed or implied are those of the authors and do not necessarily reflect the views of the Department of Defense or any other agency of the Federal Government, or any other organization.

Image: U.S. Air Force photo by Senior Airman Malia Jenkins
