The Bomb
For many observers, "Living with the Bomb" has become the evocative phrase for life in twentieth-century America. The cultural fallout from this technological innovation has shaped economics, politics, and social life long after the first test in the New Mexico desert in 1945. Americans took the fear of attack so seriously that school policies included provisions for nuclear attack. Global politics became "polarized" around the two nations in possession of nuclear technology. In the 1990s, global relations remained deeply influenced by proliferation and the threat of hostile nations acquiring nuclear capabilities. Clearly, "the bomb," and atomic technology generally, has carved a deep crater of influence.
The technology to manage atomic reactions did not long remain the sole domain of the military. Nuclear weapons and nuclear power generation have defined a great deal of domestic politics since the 1960s, in recent years largely because of nuclear technology's environmental impact. If one considers these broader implications and the related technologies, twentieth-century life has been significantly influenced by "the bomb," even though the weapon itself has been used in war only twice. The broader legacy of the bomb can be seen on the landscape, from Chernobyl to Bikini Atoll, from Hiroshima to Hanford, Washington. Hanford's involvement with the bomb spans more time than possibly any other site's. In fact, it frames consideration of this issue by serving as the source of the raw material for the first nuclear weapons and, consequently, as the single most contaminated site in the United States, now awaiting Superfund cleanup. The site is a symbol of technological accomplishment but also of ethical lessons learned.
In February 1943, the U.S. military, through General Leslie Groves, acquired 500,000 acres of land near Hanford. This would be the third location in the triad that would produce the atomic technology. The coordinated activity of these three sites under the auspices of the U.S. military became a path-breaking illustration of the planning and strategy that would define many modern corporations. Hanford, drawing on the Columbia River for power and reactor cooling, produced and chemically separated weapons-grade plutonium. Oak Ridge, in Tennessee, produced the enriched uranium. These production facilities then fed the heart of the undertaking at Los Alamos, New Mexico, under the direction of J. Robert Oppenheimer.
Oppenheimer, a physicist, supervised the team of nuclear theoreticians who devised the formulas making atomic reactions possible and manageable. Scientists from a variety of fields joined this highly complex theoretical mission. Once the theories were in place and the materials delivered, the project turned to assembling and testing the technology in the form of a bomb. All of this took place on the vast Los Alamos compound under complete secrecy. To wartime planners, this well-orchestrated, corporate-like enterprise remained the best bet to save thousands of American lives.
By 1944, World War II had exacted a terrible price on the world. The European theater would soon close with Germany's surrender. While Germany's pursuit of atomic weapons technology had fueled the efforts of American scientists, the surrender did not end the project: the Pacific front remained active, and Japan did not accept offers to surrender. The project moved forward through the Trinity test in the New Mexico desert in July 1945, and the Japanese cities of Hiroshima and Nagasaki would become the first wartime targets of the new weapon. The Enola Gay released a uranium bomb on Hiroshima on August 6, 1945, and Bockscar released a plutonium bomb on Nagasaki on August 9. Estimates of the combined death toll vary widely, reaching into the hundreds of thousands, and most of the dead were Japanese civilians. The atomic age, and life with the bomb, had begun.
Bomb tests to perfect the technology and to design other types of weapons, including hydrogen bombs, continued throughout the 1950s, particularly after the Soviet Union's successful detonation in 1949. Many of these tests became publicity opportunities. For instance, the 1946 Pacific tests at Bikini Atoll were followed through newsreels and print media by millions worldwide. The technology became so awe-inspiring and ubiquitous that a French designer named his new two-piece women's bathing suit after the test site. The bikini, linked with terms such as "bombshell," became an enduring representation of the impression this new technology made on the world's psyche.
For Oppenheimer and many of the other scientists, the experience of working for the military had brought increasing alarm about the impact of their theoretical accomplishments. Many watched in horror as the weapons were used on Japanese civilians. Oppenheimer eventually felt that the bomb had changed public attitudes toward scientific exploration. "We have made a thing," he said in a 1946 speech, "a most terrible weapon, that has altered abruptly and profoundly the nature of the world … a thing that by all the standards of the world we grew up in is an evil thing." The bomb, he went on, raised the question of whether this technology, and science itself, should be controlled or limited.
Many of the scientists involved believed that atomic technology required controls unlike those on any previous innovation. Shortly after the bombings, a movement arose to establish a global board of scientists who would administer the technology without political affiliation. While such a plan faced many problems in the 1940s, the greater obstacle was that it proved impossible to wrest this new tool for global influence from American military and political leaders. The Atomic Energy Commission (AEC), formed in 1946, placed U.S. military and governmental authority in control of the weapons technology and the other uses to which it might be put. With the "nuclear trump card," the United States catapulted to the top of global leadership.
Such technological supremacy only enhanced Americans' postwar expansion and optimism. The bomb became an important plank in restoking American confidence in the security of the nation's borders and its place in the world. From "atomic" cocktails to cereal-box prizes, atomic imagery crept into many facets of American life. Polls show that few Americans considered the moral implications of the bomb's use in 1945; instead, 85 percent approved, citing the need to end the war and save the American lives that might have been lost in an invasion of Japan. The AEC soon seized on this sensibility and began plans for "domesticating the atom." These ideas led to a barrage of popular articles about a future in which roads would be carved out with atomic bombs and radiation employed to cure cancer.
Atomic dreaming took many additional forms, particularly when the AEC began speculating about power generation. Initially, images of atomic-powered agriculture and automobiles were sketched and speculated about in many popular periodicals. In one book published during this wave of technological optimism, the writer speculates that "No baseball game will be called off on account of rain in the Era of Atomic Energy." After continuing this litany of activities no longer to be influenced by climate or nature, the author sums up the argument: "For the first time in the history of the world man will have at his disposal energy in amounts sufficient to cope with the forces of Mother Nature." For many Americans, this new technology meant control of everyday life. For the Eisenhower Administration, it meant expansion of the nation's economic and commercial capabilities.
The Eisenhower Administration repeatedly sought ways of "domesticating" the atom. Primarily, this effort grew out of a desire to educate the public without creating fear of possible attack; educating the public on actual facts, however, clearly took a subsidiary position to instilling confidence. Most famously, "Project Plowshare" grew out of the Administration's effort to turn the destructive weapon into a domestic power producer. The list of proposals was awe-inspiring: highways blasted through mountains, federally funded nuclear-powered greenhouses in the Midwest to routinize crop production, and irradiated soils to simplify weed and pest management. While domestic power production, with massive federal subsidies, would be the long-term product of these efforts, the atom could never fully escape its military capabilities.
Americans of the 1950s could not at once stake military dominance on a technology's horrific power and accept it comfortably into everyday life; the leap was simply too great. This became particularly difficult in 1949, when the Soviet Union tested its own atomic weapon. The arms race had officially begun; a technology that had brought comfort at the close of World War II now forced an entire culture to confront its volatility and to live in fear of nuclear annihilation.
Eisenhower's efforts sought to manage the fear of nuclear attack, and they wound up creating a unique atomic culture. Civil defense programs constructed bomb shelters in public buildings and required schoolchildren to practice "duck and cover" drills, just as students today have fire drills. Many families purchased plans for personal bomb shelters to be built in their backyards; some followed through, constructing and outfitting the shelter for months of survival should the United States suffer a nuclear attack. Social controls also limited the availability of the film On the Beach, which depicted the aftermath of a nuclear attack, and of David Bradley's book No Place to Hide. The suppression of Bradley, a scientist and physician working for the Navy at the Bikini tests, was the most troubling case. Bradley's account of his work after the tests gave the public its first knowledge of radiation, the realization that there was more to the bomb than its immediate blast. This culture of control was orchestrated informally, but Eisenhower also took strong political action internationally. "Atoms for Peace" comprised a series of international policies during the 1950s under which the Soviets and Americans would each offer the United Nations fissionable material for peaceful uses. While the Cold War still had many chapters through which to pass, Eisenhower stimulated discourse on nuclear weapons from the outset.
Eisenhower's "Atoms for Peace" speech, given at the United Nations in 1953, clearly instructed the world on the technological standoff that confronted it. The "two atomic colossi," he forecast, could continue to "eye each other indefinitely across a trembling world." But eventually their failure to find peace would result in war and "the probability of civilization destroyed," forcing "mankind to begin all over again the age-old struggle upward from savagery toward decency, and right, and justice." To Eisenhower, "no sane member of the human race" could want this; in his estimation, the only way out was discourse and understanding. Along exactly these battle lines, a war unfolded over the coming decades, called cold because it never escalated, or heated, into direct conflict. With ideology, communism versus capitalism, as its point of difference, the conflict was fought through economics, diplomacy, and the stockpiling of a military arsenal. With each side possessing a weapon that could annihilate not just the opponent but the entire world, the bomb defined a new philosophy of warfare.
The Cold War, lasting from 1949 to 1990, may best be viewed as an ongoing chess game involving diplomats and physicists, while the entire world prayed that neither player would make the wrong move. Redefining ideas of attack and confrontation, the Cold War's nuclear arsenal required that each side live on the brink of war, a posture American policymakers called brinksmanship. Each "superpower," or nuclear weapons nation, sought to remain militarily on the brink while diplomatically dueling over economic and political influence throughout the globe. Each nation sought to increase its "sphere of influence," the set of nations signed on as like-minded allies, and to limit the other's. Diplomats came to view the entire globe in such terms, leading to wars in Korea and Vietnam over the "domino" assumption that certain key nations, if allowed to ally with a superpower, could take an entire region with them. These two conflicts defined the term "limited" warfare, meaning that nuclear weapons were not used, although in each conflict their use was hotly debated.
Finally, as the potential impact of the bomb's use became more clearly understood, the technological side of the Cold War escalated into an "arms race" to stockpile weapons more quickly and in greater numbers than the other superpower. Historians may remember this effort as the most absurd outlet of Cold War anxiety, for by 1990 the Soviets and Americans each possessed the capability to destroy the earth hundreds of times over. The arms race grew out of one of the most disturbing aspects of the Cold War, described by policymakers as "MAD": mutually assured destruction. By 1960, each nation had adopted the philosophy that any launch of a nuclear warhead would trigger massive retaliation with its entire arsenal. Even a mistaken launch, of course, could result in retaliation that destroyed all life.
On an individual basis, humans had lived before in a tenuous balance with survival as they struggled for food with little technology; never before, however, had such a tenuous balance derived solely from humanity's own technological innovation. Everyday life changed significantly with the realization that extinction could arrive at any moment. Some Americans applied the lesson by striving to live within limits of technology and resource use: anti-nuclear activists formed some of the earliest portions of the 1960s counterculture and of the modern environmental movement, including the Sea Shepherd Conservation Society and Greenpeace, which grew out of protests against nuclear testing. Other Americans were moved to live with fewer constraints than ever before; some historians have traced the culture of excessive consumption to the realization that an attack could come at any time. Whatever the exact reaction, American everyday life had been significantly altered.
If Americans had managed to remain naive about the atomic possibilities, the crisis of 1962 made the reality perfectly obvious. U.S. intelligence sources located Soviet missiles in Cuba, 90 miles from the American coast. Many options were entertained, including bombing the missile sites; President John F. Kennedy, though, elected to push brinksmanship further than it had ever gone before. He stated that the missiles tipped the nuclear balance to the Soviets' advantage and that they must be removed. Kennedy squared off against Soviet Premier Nikita Khrushchev in a direct confrontation in which the use of nuclear weapons was the only remaining escalation. Thirteen days later, the Soviet premier backed down and removed the missiles. The world breathed a sigh of relief, realizing it had come closer to destruction than ever before. For many observers, there was also an unstated vow that the Cuban Missile Crisis must be the last such threat.
The period of crisis created a new level of anxiety, however, that revealed itself in a number of arenas. The well-known "Doomsday Clock," maintained by a group of atomic scientists, alerted the public to how great the danger of nuclear war had become. The anxiety caused by such potentialities played out in a fascinating array of popular films. An entire genre of science fiction focused on the unknown effects of radiation on subjects ranging from a beautiful woman, to grasshoppers, to plants. Most impressively, the Godzilla films dealt with Japanese feelings about the effects of nuclear technology. All of these films found a terrific following in the United States. Over-sized lizards aside, another genre of film dealt with the possibilities of nuclear war. On the Beach blazed the trail for many films, including the well-known television movie The Day After. Finally, the cult classic of this genre, Dr. Strangelove, starred Peter Sellers in multiple roles as it posed the possibility of a deranged individual initiating a worldwide nuclear holocaust. The appeal of such films reveals the construction of what historian Paul Boyer dubs an American "nuclear consciousness."
Such faith in nationalism, technological supremacy, and authority helped make Americans comfortable watching above-ground testing in the American West through the late 1950s. Since the danger of radiation was not discussed, Americans often sat in cars or on lawn chairs to witness the mushroom clouds from a "safe" distance. Documentary films such as The Atomic Cafe chronicle the effort to delude, or at least not fully inform, the American public about the dangers. Since the testing, "downwinders" in Utah and elsewhere have reported significant rises in rates of leukemia and other cancers. Maps of air patterns show that much of the nation actually experienced some fallout from these tests. The Cold War forced the U.S. military to operate as if it were wartime, when certain risks on the "home front" were deemed necessary. At the time, a population familiar with World War II proved willing to make whatever sacrifices were necessary; later generations would be less accepting.
Ironically, the first target of this shift in public opinion would be not nuclear arms but their relative, nuclear power. Even as groups argued for a freeze on the construction of nuclear arms and pressed the government to discontinue weapons tests, Americans had grown increasingly comfortable with nuclear reactors in their neighborhoods. The "Atoms for Peace" program of the 1950s had aided the development of domestic energy production based on the nuclear reaction, and exuberance for such power rested on its apparent lack of immediate waste. There were other potential problems, but these were not yet clearly known to the American public. In 1979, a nuclear reactor at Three Mile Island, Pennsylvania, located within a working-class neighborhood outside a major population center, nearly experienced a meltdown. As pregnant women and children were evacuated from Pennsylvania's nearby capital, Harrisburg, the American public learned through the media about the dangers of this technology and, most important, about how much was not clearly understood about this power source. As much of the nation waited for the cooling tower to erupt in the mushroom cloud of an atomic blast, a clear connection was finally made between the power source and the weapon. While the danger passed quickly from Three Mile Island, nuclear power would never recover from this momentary connection to potential destruction; films such as The China Syndrome (1979) made certain of that.
When Ronald Reagan took office in 1981, he clearly perceived the Cold War as an ongoing military confrontation with the bomb and its production as the main battlefield. While presidents since Richard Nixon had negotiated arms control agreements with the Soviets, Reagan escalated weapons production in an effort to "win" the Cold War without a shot ever being fired. While the strategy resulted in mammoth debt, it pressed the Soviets to keep pace, which ultimately exacerbated weaknesses in the Soviet economy. By 1990, the leaders of the two superpowers agreed that the Cold War was finished. As the Soviet Union crumbled, its nuclear arsenal became a concern of a new type. Negotiations immediately began to dismantle much of the arsenal. However, the control provided by bipolarity was shattered, and the disintegration of the Soviet Union made its nuclear weapons and knowledge available to other countries for a price. Nuclear proliferation had become a reality.
In the 1990s, the domestic story of the bomb took dramatic turns as blind patriotic faith broke and Americans began to confront the nation's atomic legacy. Vast sections of contaminated land were identified, and lawsuits were brought by many "downwinders." Under the administration of President Bill Clinton, the Department of Energy released classified information documenting the government's knowledge of radiation and its effects on humans, some of it gathered through tests conducted on military and civilian personnel. Leading the list of fallout from the age of the bomb, Hanford, Washington, has been identified as one of the nation's most contaminated sites; buried waste products have left the area uninhabitable.
The illusion of nuclear safety has, ultimately, been abandoned. Massive storage vaults, such as the repository planned at Yucca Mountain, Nevada, are intended to hold spent fuel from nuclear power plants and material from nuclear warheads. The Cold War lasted thirty to forty years; the toxicity of much of the radioactive material will last for nearly 50,000 years. The massive overproduction of such material has created an enormous management burden for contemporary Americans. This has become the next chapter in the story of the bomb and its influence on American life.
—Brian Black
Further Reading:
Boyer, Paul. By the Bomb's Early Light. Chapel Hill: University of North Carolina Press, 1994.
Chafe, William H. The Unfinished Journey. New York: Oxford University Press, 1995.
Hughes, Thomas P. American Genesis. New York: Penguin Books, 1989.
May, Elaine Tyler. Homeward Bound. New York: Basic Books, 1988.
May, Ernest R. American Cold War Strategy. Boston: Bedford Books, 1993.