The world was never the same after the United States leveled Hiroshima and Nagasaki in August 1945 with atomic bombs. Not only had perhaps 180,000 civilians been killed; the nature of warfare was forever changed. The Soviets accelerated their nuclear research, expedited in no small part by “atom spies” such as Klaus Fuchs, who had stolen nuclear secrets from the Americans’ secret Manhattan Project. Soviet scientists successfully tested an atomic bomb on August 29, 1949, years before American officials had estimated they would. This unexpectedly quick Soviet success not only caught the United States off guard but alarmed the Western world and propelled a nuclear arms race between the United States and the USSR.
The United States detonated the first thermonuclear weapon, or hydrogen bomb (using fusion explosives of theoretically limitless power), on November 1, 1952. The blast measured over ten megatons and generated an inferno five miles wide with a mushroom cloud twenty-five miles high and a hundred miles across. The irradiated debris—fallout—from the blast circled the earth, occasioning international alarm about the effects of nuclear testing on human health and the environment. The test only hastened the arms race, with each side developing increasingly advanced warheads and delivery systems. The USSR successfully tested a hydrogen bomb in 1953, and soon thereafter Eisenhower announced a policy of “massive retaliation.” The United States would henceforth respond to threats or acts of aggression with perhaps its entire nuclear might. Both sides, then, would theoretically be deterred from starting a war, through the logic of mutually assured destruction (MAD). J. Robert Oppenheimer, director of the Los Alamos nuclear laboratory that developed the first nuclear bomb, likened the state of “nuclear deterrence” between the United States and the USSR to “two scorpions in a bottle, each capable of killing the other,” but only by risking their own lives.21
Fears of nuclear war produced a veritable atomic culture. Films such as Godzilla, On the Beach, Fail-Safe, and Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb plumbed the depths of American anxieties with plots featuring radioactive monsters, nuclear accidents, and doomsday scenarios. Antinuclear protests in the United States and abroad warned against the perils of nuclear testing and highlighted the likelihood that a thermonuclear war would unleash a global environmental catastrophe. Yet at the same time, peaceful nuclear technologies, such as fission- and fusion-based energy, seemed to herald a utopia of power that would be clean, safe, and “too cheap to meter.” In 1953, Eisenhower proclaimed at the UN that the United States would share the knowledge and means for other countries to use atomic power. Henceforth, “the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to his life.” The “Atoms for Peace” speech brought about the establishment of the International Atomic Energy Agency (IAEA), along with worldwide investment in this new economic sector.22
As Germany fell at the close of World War II, the United States and the Soviet Union each sought to acquire elements of the Nazis’ V-2 superweapon program. A devastating rocket that had terrorized England, the V-2 was capable of delivering its explosive payload up to a distance of nearly six hundred miles, and both nations sought to capture the scientists, designs, and manufacturing equipment to make it work. A former top German rocket scientist, Wernher von Braun, became the leader of the American space program; the Soviet Union’s program was secretly managed by former prisoner Sergei Korolev. After the end of the war, American and Soviet rocket engineering teams worked to adapt German technology in order to create an intercontinental ballistic missile (ICBM). The Soviets achieved success first. They even used the same launch vehicle on October 4, 1957, to send Sputnik 1, the world’s first human-made satellite, into orbit. It was a decisive Soviet propaganda victory.23
In response, the U.S. government rushed to perfect its own ICBM technology and launch its own satellites and astronauts into space. In 1958, the National Aeronautics and Space Administration (NASA) was created as a successor to the National Advisory Committee for Aeronautics (NACA). Initial American attempts to launch a satellite into orbit using the Vanguard rocket suffered spectacular failures, heightening fears of Soviet domination in space. While the American space program floundered, on September 13, 1959, the Soviet Union’s Luna 2 capsule became the first human-made object to touch the moon. The “race for survival,” as it was called by the New York Times, reached a new level.24 The Soviet Union successfully launched a pair of dogs (Belka and Strelka) into orbit and returned them to Earth while the American Mercury program languished behind schedule. Despite countless failures and one massive accident that killed nearly one hundred Soviet military personnel and rocket engineers, Soviet cosmonaut Yuri Gagarin was launched into orbit on April 12, 1961. American astronaut Alan Shepard accomplished a suborbital flight in the Freedom 7 capsule on May 5. The United States had lagged behind, and John Kennedy would use America’s losses in the “space race” to bolster funding for a moon landing.
While outer space captivated the world’s imagination, the Cold War still captured its anxieties. The ever-escalating arms race continued to foster panic. In the early 1950s, the Federal Civil Defense Administration (FCDA) began preparing citizens for the worst. Schoolchildren were instructed, via a film featuring Bert the Turtle, to “duck and cover” beneath their desks in the event of a thermonuclear war.25
Although it took a backseat to space travel and nuclear weapons, the advent of modern computing was yet another major Cold War scientific innovation, the effects of which were only just beginning to be understood. In 1958, following the humiliation of the Sputnik launches, Eisenhower authorized the creation of an Advanced Research Projects Agency (ARPA) housed within the Department of Defense (later renamed DARPA). As a secretive military research and development operation, ARPA was tasked with funding and otherwise overseeing the production of sensitive new technologies. Soon, in cooperation with university-based computer engineers, ARPA would develop the world’s first “packet-switching” networks, and computers would begin connecting to one another.