One significant change that the Cold War brought about in science was the development of a culture of secrecy. Beginning during World War II, the federal government poured an enormous amount of money into scientific research. Because so many of the projects it funded had to do with weapons or other sensitive technology, background checks were required and secrecy enforced. Even during the war, many scientists chafed at these restrictions. Scientists were used to a free and open-ended exchange of research information, and many of them described the security requirements of the atomic age — including travel restrictions and limitations on meeting with some foreign colleagues — as a limitation of scientific freedom. The scale of the change would have been hard to ignore even for those not directly involved in defense-related research. In the early 1950s, for example, the amount of classified research being carried out with federal money at American universities exploded. Secrecy requirements and classified research changed the culture of American universities in ways that even scientists without clearances would have noticed.
What were the practical consequences of all this? When scientists grumbled about limitations on scientific freedom, or about the damage secrecy did to science, what concretely were they referring to? A few specific examples can be taken from a talk given in 1954 by L. V. Berkner, president of the Associated Universities. (The Associated Universities is a non-profit that manages scientific research resources, such as large telescopes and observatories, for the benefit of the scientific community; it managed Brookhaven National Lab from the late 1940s through the late 1980s.) Berkner did not critique security procedures or the need to classify some research, but he did acknowledge the frustrations they brought with them. One of his examples was a hypothetical scientist who, on the basis of classified scientific knowledge, has good reason to criticize some specific government policy. His duty as a citizen leads him to voice his criticism publicly, but because he cannot cite the (classified) information behind the critique, he is subjected to intense public disapprobation — and his criticism isn’t taken seriously. In another hypothetical example, a scientist hears someone else present a paper and realizes that this person’s research has already been done, or the argument already disproved, by classified research that the scientist in the audience knows about but the presenter doesn’t. In this case, valuable research time and resources have been wasted. More generally, there was the problem that keeping various technologies and concepts secret prevented the kind of serendipitous combinations of ideas that drove many scientific discoveries. Scientists differed on this question of secrecy: to what extent government policy was justified, how serious the effects on research actually were, and how to balance the need to protect certain information with the demands of research and the desire to exchange findings freely with colleagues.
But there was no denying that their professional lives had changed dramatically as a result of the Cold War.
Given the prominence of atomic physics and weapons engineering, it would be easy to assume that physics and engineering were the only fields affected by these changes. But even biologists doing work not directly related to any atomic secrets had to be cleared by the FBI if they worked at certain facilities; this was true of radiation geneticists at Brookhaven, for example. Even in contexts where the absence of strict secrecy was the point, security concerns played a role. In 1955, a U.N. conference on the peaceful uses of atomic energy took place in Geneva, Switzerland. The conference was part of the “Atoms for Peace” initiative led by the Eisenhower administration to inform the public about atomic technology and promote peaceful uses of atomic power. Part of the idea was that openness about the technology — up to a point — would provide an alternative to the threatening nuclear brinkmanship of two superpowers armed to the teeth with top-secret atomic weapons. And, of course, it also 1) gave the U.S. the chance at a propaganda coup if the Soviet Union rejected the idea and 2) offered cover for the development of more sophisticated weapons.
The initiative didn’t work out the way Eisenhower intended, as far as limiting proliferation was concerned. But the 1955 conference in Geneva was a heady experience for the participants. The atmosphere was “euphoric,” and significant amounts of previously classified data were made available. Nevertheless, there were still security requirements. Like all the other American attendees, Brookhaven radiation biologist Ralph Singleton received a security briefing from the State Department, which reminded scientists that “security regulations are not road-blocks thrown helter-skelter into your path to impede your freedom of movement or speech” and outlined the practices that had to be followed to ensure that classified data did not fall into the wrong hands.
Awareness of the culture of secrecy was not limited to scientists. The public knew that things were going on behind closed laboratory doors of which they knew nothing — indeed, they often assumed that more classified bomb-related research was going on than was actually the case. Visitors to Brookhaven often assumed that scientists there were making weapons even if they said otherwise. In 1967, a scientist there described to a Newsday reporter how every year, someone touring the lab during visitors’ day would ask the scientists to quit “kidding” and tell them where they were making the bombs.
Finally, the concept of secrecy was related in a very basic way to how scientists and military officials understood the nature of scientific discovery itself. During the late 1940s, when the United States was the only country that had nuclear weapons, many military officials and politicians were focused on keeping “the atomic secret” out of the hands of the Soviets — they believed that with the right precautions, the “U.S. atomic monopoly could be maintained indefinitely.” The thinking, in other words, was that the knowledge and technological know-how required to make a nuclear weapon were like a piece of music. As long as every recording and every score was kept strictly under wraps and no one went around whistling the tune, it would be impossible for anyone else to reconstruct the melody. But many scientists objected that this was a false understanding of science and scientific discoveries. Atomic technology was more like the extraordinarily precise measurement of the height of a mountain. The Americans might have been the first to do it, but the mountain was still out there, and nothing prevented other countries from developing ways to measure it with equal precision. Historian Jessica Wang has described the case of nuclear physicist Edward Condon, who — like many of his colleagues — rejected the idea of the “atomic secret.” For Condon, “scientific progress required open communication, free from military requirements of secrecy. Furthermore, since discoveries made by scientists in one country could be rediscovered by those in another, attempts to keep such discoveries secret were bound to fail over time. Secrecy could only retard research efforts without securing any advantages.” The idea that stringent security would be enough to permanently prevent the USSR from developing nuclear weapons was, in the mind of Condon and many of his colleagues, “a total misunderstanding of how science worked.”
Secrets and secrecy were key factors in how science was done — and how people thought about science — during the first decades of the Cold War. The massive increase in federally funded research, much of it classified, brought a new culture of secrecy to American research institutions, and scientists and politicians wrangled over the nature of scientific discovery itself. Was discovery an open-ended process that necessarily involved communication, the free circulation of information, and a certain amount of luck? Or could it be directed and controlled, with researchers knowing only what they needed to know? The success of the Manhattan Project, which produced the first nuclear weapons, suggested that the second version could work in an emergency. Many scientists, though, argued that heavy-handed security policies and the restriction of information would hamper their work in the long run, and in this sense it was neither in their interest nor in the nation’s to aim for ironclad control of data or ideas.
Photo Credit: Image courtesy of Brookhaven National Laboratory.