Science and Security at Risk

A comprehensive strategy for integrating science and security is urgently needed at DOE’s national labs and throughout the U.S. government.

The marriage between science and security in the United States has at times been turbulent, and never more so than in the fall of 2000, following the darkest hours of controversy over security breaches in the Department of Energy’s (DOE’s) national laboratories. At that time, I was asked by the secretary of energy to head a commission to examine the rekindled issues surrounding science and security at the labs. I knew the problem would be intense. But I thought it would be focused largely on the consequences of the security compromises and DOE’s harsh response, which was partly driven by the partisan politics of Washington. Instead, I found a much more complex and difficult problem. At risk is the vitality of science in some of the best laboratories in the United States. This situation could worsen if the government seizes on wider, poorly designed security measures for the nation in the aftermath of the September 11, 2001, terrorist attacks. If we fail to manage this problem properly, then the risk could spread beyond the government to U.S. universities and private-sector institutes.

I have spent my entire professional career in government, dealing regularly with various national security issues. I think about national and domestic security every day. I worry deeply about terrorism, and especially the consequences if terrorists gain access to chemical, biological, or nuclear weapon materials. And I worry about espionage. I know there are spies within our land.

But I also worry that misplaced and poorly conceived security procedures will provide very little security and could cripple the nation’s scientific vitality, thereby posing a serious threat to our long-term national security. That concern is the subject of this article. I will begin by reviewing the work of the Commission on Science and Security and then turn to the larger concerns I have during these critical times for homeland security.

At its core, DOE is a science agency. Science underpins each of its four missions, which focus on fundamental science, energy resources, environmental quality, and national security. DOE contributes enormously to science in the United States; it accounts for nearly half of all federal support in the physical sciences, including more than 90 percent of the investments in high-energy physics and nuclear physics. But from the beginning of the U.S. nuclear weapons program in the 1940s, science and security have been in tension. The very nature of the scientific enterprise requires open collaboration. The essence of national security is restricted and controlled access to crucial information. We had that tension from the opening days of the Manhattan Project, and we managed it effectively throughout the Cold War.

What causes this tension, I came to realize, is not the incompatibility of scientific openness and security restrictiveness. Instead, the tension arises inside the national security community. Central to our national strategy for more than 50 years have been efforts to harness the nation’s scientific and technical talent to place superior tools in the hands of U.S. soldiers so that we could win any wars and, ideally, deter conflict in the first place. Therefore, national security requirements became a primary impetus for federal spending on science. Defending the United States without the genius of U.S. scientists would be infinitely more difficult.

The tension, then, is internal to the national security community. On the one hand, we need to advance the frontiers of knowledge to stay ahead of our opponents. On the other hand, we need to defeat those who would steal our secrets, keeping them as far behind us as possible in the race to field the weapons of war. Thus, we want to race ahead in one dimension and to slow our opponents’ progress in another. That is the tension we confronted in DOE when the commission began its work. That, too, is the tension we now feel in a post-September 11 United States. How do we preserve our economic and social vitality and still secure our homeland?

Between 1999 and 2000, DOE was hit by two major security crises at Los Alamos National Laboratory, the home of the Manhattan Project and some of the nation’s most classified national security work. The first case involved Wen Ho Lee, a U.S. physicist of Taiwanese descent accused of giving sensitive nuclear information to China. The Lee case was considered explosive because it involved one of DOE’s own employees. Furthermore, the allegation that China had obtained, through a naturalized U.S. citizen, access to some of our most sensitive information was alarming. Less than a year later, on May 7, 2000, a second incident occurred at Los Alamos involving two missing computer hard drives containing classified nuclear weapons information. The hard drives resurfaced more than a month later in a classified area that had been searched twice before by investigators. As the magnitude of the second incident crystallized, accusations flew in Congress, DOE, and the security community.

DOE responded by issuing a series of controversial security measures intended to close the security gaps highlighted by the two crises. Although well intentioned, many of these department-wide measures were simply misguided or misapplied. In fact, the measures only exacerbated departmental tensions and contributed to a decline in employee morale, most notably at the nuclear weapons laboratories (Los Alamos, Lawrence Livermore, and Sandia), where security crackdowns, and the perception that the reforms were arbitrarily imposed, were most severe. Because the security measures were blanketed across the department, unclassified laboratories also were affected, even though their security needs were significantly different from those of the weapons laboratories.

With the high-profile allegations and security violations at Los Alamos as a backdrop, Energy Secretary Bill Richardson authorized the Commission on Science and Security in October 2000 to assess the challenges facing DOE and its newly created National Nuclear Security Administration (NNSA) in conducting science at the laboratories while protecting and enhancing national security. The commission was asked to examine all DOE laboratories (not just the three weapons labs where classified work is most concentrated) in order to address the department’s broad range of classified and unclassified activities and information. The commission included 19 distinguished members from the scientific, defense, intelligence, law enforcement, and academic communities. We presented our findings in May 2002 in a final report to Energy Secretary Spencer Abraham, who had taken office with the change in administrations and had rechartered the commission.

The commission concluded that DOE’s current policies and practices risk undermining its security and compromising its science and technology programs. The central cause of this worrisome situation is that the spirit of shared responsibility between the scientists and the security professionals has broken down. Security professionals feel that scientists either do not understand or fail to appreciate the threats and thus cannot be trusted to protect U.S. secrets without explicit and detailed rules and regulations. Scientists, in turn, believe that security professionals do not understand the nature of science and thus pursue procedures designed more to demonstrate compliance with rules than to secure secrets. These perceptions have hardened into realities that significantly and adversely affect the trust between scientists and security professionals in the department.

The damaging consequences of this collapse of mutual trust cannot be overstated. It is not possible either to pursue creative science or to secure national secrets if scientists and security professionals do not trust each other. Scientists are the first line of defense for national security. If we do not trust a scientist, then we should not give him or her a security clearance. If we grant a scientist clearance, then we should trust that person’s judgment and help him or her do the assigned job. Of course, the natural complement to trust is verification. Once trust has been established, that trust must be periodically verified by the organization in order to reduce insider threats, negligence, or employee incompetence in security matters. Verification that is transparent, unobtrusive, and selective can bolster security without diminishing productivity or demoralizing personnel.

Scientists cannot be expected to be aware of all the risks they face from hostile governments and agents. They depend on security professionals to establish the environment of security so that they can pursue effective science within that framework. They also depend on security professionals to translate uncertain and occasionally ambiguous information gathered by counterintelligence experts into realistic and effective security procedures. At the same time, security professionals can understand what is at stake only by working with scientists. And so the entwined needs of scientists and security experts come full circle. These two communities depend on each other to do their shared job successfully.

Key strategic elements

There are many problems standing in the way of that ideal working environment. After conducting extensive research and discussion, the commission outlined five broad elements of a comprehensive strategy for creating effective science and security in DOE’s labs. All have to do with developing a security architecture that is consistent with an environment in which, during the past two decades, both the conduct of science and the international security landscape have changed considerably.

To begin with, science has become an increasingly international enterprise. Multinational collaborative efforts on large science projects are now common, if not the norm. Within the government, classified science, once an isolated and compartmentalized endeavor, has come to rely on unclassified science as a vital new source of ideas. Information now flows more freely, and U.S. scientists increasingly need to work with colleagues from around the world. As a result, global scientific networks have grown exponentially through the use of modern communications and information technologies. People also are more mobile, and scientists from developing countries have made their way to developed countries in search of better facilities and research environments. In the United States, increasing numbers of scientists, engineers, and mathematicians from other countries are filling slots in doctoral programs, laboratories, and businesses. In DOE’s unclassified laboratories, for example, foreign students as a share of total staff increased from 16 percent to 19 percent between 1996 and 2000.

As science has changed, so has our security environment. Since the end of the Cold War, our security priorities have shifted from a largely bipolar world to an increasingly complex world with asymmetric threats to U.S. interests. September 11 and the anthrax attacks have forced us to redefine and rethink the nature of risk to our national assets. Indeed, we have come to understand that zero risk is impossible; any system based on the presumption of zero risk is bound to fail. We can only minimize risks through careful calculation and analysis of threats.

Given this context, the first and arguably the most difficult element of a new security architecture requires the Energy secretary to confront the longstanding management problems of the department. Many well-intentioned reform efforts, piled on top of an organizational structure that traces back to the earliest days of the Manhattan Project, have created an organization with muddy lines of authority. The fundamental management dysfunction of DOE predated the security scandals at Los Alamos. The security “reforms” imposed administratively and legislatively in the aftermath of the security scandals made the problems dramatically worse. Therefore, the commission’s first recommendation is that the Energy secretary needs to clarify the lines of responsibility and authority in the department. This means creating not only smaller staffs but also clean lines of authority and new procedures that limit the endless bureaucratic wrangling in the department. There will always be tension among headquarters, field offices, and individual laboratories. These tensions need to be channeled into clear and predictable bureaucratic procedures that have a definable start and finish. Today, the losers in one bureaucratic skirmish merely advance to new firing positions and pick up the battle all over again.

Although these organizational issues seem arcane, it is absolutely necessary to have clear lines of authority in order to have sound security. For example, how can a counterintelligence officer be effective if he or she has two supervisors or none at all? How can emergency situations be managed properly when it is unclear who is in charge or, perhaps worse, if too many people think they are in charge? In the commission’s field visits to DOE laboratories, we found numerous instances in which there was profound confusion over the chain of command and responsibility. For example, DOE has two counterintelligence programs: one for the department itself and one for its internal NNSA. Because counterintelligence officers report to separate DOE and NNSA chiefs, information and communication inevitably become fragmented. For a counterintelligence operation, which by its very nature requires informational cohesion to covertly detect spies, this is less than ideal. Among the changes the commission has proposed is assigning a single point of responsibility for counterintelligence in order to create a unified operation within DOE.

Second, DOE needs a philosophy and set of clear procedures that integrate science and security, rather than treat them as separate functions. The commission believes that the Energy secretary should lead the departmental science mission and guide the supporting functions, including security. The directors of individual laboratories hold similar responsibilities at their level, in essence making them the chief scientists and the chief security officers for all laboratory functions. These directors need to have the flexibility to design the science agenda and the security program according to the needs of their laboratories, but they also need to be held strictly accountable for the performance of both. Headquarters organizations need to define policies but stay away from prescriptive formulas for how individual labs and offices should perform those functions. Headquarters organizations also should set standards of accountability and monitor performance.

The issue of integrating science and security at DOE is part of a much larger need for science to be included as a central component of security-related decisions throughout the government. Scientists must be part of the security solution. If they are not included, then we risk eroding the effectiveness of our security approach from the inside out. Without strong participation from scientists, we also risk losing top scientific talent from projects related to national security and even from unclassified laboratories, where the bulk of the nation’s fundamental scientific research is performed. The commission believes that DOE can better include scientists in the security decisionmaking process by adding them to headquarters and laboratory-level advisory boards, establishing rotating policy positions for scientists from the laboratories, and developing new ways to link scientists and security professionals in assessing risks and threats.

Third, DOE must develop and deploy a risk-based security model across the department’s entire complex. The sensitivity of activities is not uniformly distributed through every office and facility, so security rules should not be one size fits all. Instead, there are a small number of very sensitive “islands” of national security-related activity in an “ocean” of otherwise unclassified scientific activity. We need to protect the islands well, rather than trying to protect the whole ocean and, in the process, protecting the islands inadequately.

A risk-based security model for DOE needs to accommodate the complex nature of science today. As mentioned, science is increasingly collaborative, with research teams around the world connected through high-speed, high-capacity data channels. These teams will have U.S. citizens and foreign nationals working side by side. Securing critical secrets in this environment will be extremely challenging. The worst thing we can do, however, is throw a smothering blanket of regulation over the entire enterprise and chase creative scientists away from our labs. I am convinced that scientists will protect secrets if security procedures are clear and if the scientists are included in the policy process. But scientists react like the rest of us when forced to endure security procedures that are arbitrary and easily subverted or skirted: they lose respect for the rules. Security professionals, therefore, need stronger skills and resources to design and convey effective security procedures. In essence, a risk-based model enables security professionals to protect what needs protecting.

Fourth, our security professionals need help in modernizing their security approach, and they need new tools and resources to do it. Like many parts of the intelligence community, DOE tends to inadvertently undercut its own capacity to implement a modern security model by providing inadequate tools to its security and counterintelligence professionals. In some instances, security and counterintelligence professionals are constrained by outdated systems and modes of thinking entrenched in years of shortsighted policy. For example, DOE’s analytic efforts are frequently hampered by a Cold War posture that emphasizes a rigid case-by-case approach, meaning that comprehensive data analyses are performed only when an incident provokes them.

The case-by-case approach is essentially reactive and does not employ continuous analysis as the best preemptive tool against spies and insider threats. The approach also fails to take maximum advantage of data and resources available in the laboratories and from the scientists themselves, who often collect routine data on visiting researchers that may be useful for counterintelligence purposes. These problems in DOE are symptoms of broader analytic disconnects in our intelligence apparatus that have received greater attention since September 11. The commission suggests a number of possible tools and techniques to assist in the development of a risk-based model.

The job of security professionals is to develop modern risk-based security models appropriate to the complex environment of the department and its laboratories. To do so, they need new kinds of training and analytic skills that are currently rare, particularly in the counterintelligence community, as well as new high-technology security tools to design these models and adapt them to an ever-changing work environment. For example, there are now biometric, personnel authentication, and data fusion systems that would be helpful not just to DOE’s security and counterintelligence work but also to broader government efforts to harness intelligence data from disparate sources. DOE could benefit tremendously from these state-of-the-art tools. Yet historically we do not honor our security professionals with the resources and support they need and deserve until we have a disaster, and then we invest too late. We need to invest now, but invest wisely.

Finally, DOE needs to devote special attention to cyber security. Although the department has always been computationally intensive, the digital revolution within it has been sweeping. Like the rest of government and society, DOE has given far more attention to sharing information than to protecting the computing environment from malicious action. This is critical, because it is now dramatically easier to steal U.S. secrets by downloading files electronically than by covertly photographing individual pages of drawings, as spies of an earlier era had to do. DOE has devoted too little attention to cyber security. Although the commission found that the Energy secretary has already initiated steps in the right direction, there is precious little time to waste in this important area.

Microcosm of national problems

DOE, I believe, is a microcosm of the challenges facing the United States in the aftermath of September 11. What so shocked most U.S. citizens was the realization that the suicide terrorists lived among us for months planning their terrible work. We recognized that we were victims, in part, of the very features we cherish most in the nation’s way of life: a dynamic and energetic social environment, freedom of movement and privacy in our personal lives, and ever-closer interconnection with the wider world.

Now, homeland security has risen to the top of the government’s agenda. I strongly agree that it should. The first business of government is to protect its citizens from harm, caused by forces without or within. But I do not want to lose that which I love in this country, and I do not want our collective lives to be impoverished by security procedures that bring inefficiency without providing security. How do we protect ourselves from the various dark forces without becoming a police state? I will accept whatever it takes to protect the United States, but I also want to design those security procedures so that they do not sacrifice the values and the opportunities that make the United States unique.

Today, there are a number of efforts to protect and restrict access to scientific information. These include efforts to restrict the activities of foreign nationals, limit information already in the public domain, expand the use of “sensitive unclassified information,” broaden enforcement of “deemed exports,” and impose new restrictions on fundamental research. I believe we are at risk of duplicating some of the early mistakes we saw at DOE in several ways.

First, as an example of limitations on foreign nationals, the Department of Defense is circulating a draft regulation that would prohibit noncitizens from working as systems administrators for unclassified computer systems that are deemed “sensitive.” The regulation would apply to government employees and contract employees. However, there is no clear definition of what constitutes a sensitive computer system. Without a clear definition, any local security official could designate a system as sensitive. Such a regulation would enormously complicate the task of finding qualified personnel, practically necessitating an equivalent of security clearances for individuals who would not come close to classified work. We cannot afford to alienate noncitizens from unclassified U.S. scientific and technical enterprises, particularly because we are unable to supply enough U.S. scientists and technical experts to support our growing national needs.

Second, within the government there are efforts to narrow the body of publicly available government information, including documents that have been available for years on the Internet. Similarly, since September 11 there have been calls for expanded use of “sensitive unclassified information” and other ambiguous categories of information. I agree that there is a need to make sure that Web sites and other public venues do not include information that might compromise our security. I also believe that certain information falling in the gray area between classified and unclassified should be controlled. But we must be clear: If information truly requires protection, it should be classified or protected by proper administrative controls that are based in statutes and have clear definitions for use.

Within the context of DOE, the commission witnessed how intellectually undisciplined categories, such as sensitive unclassified information, can harm security rather than help it. Sensitive unclassified information has contributed to confusion for scientists and security professionals alike in DOE and has resulted in the proliferation of homegrown classification labels in the laboratories. Indeed, the commission sees it as a category for which there is no usable definition, no common understanding of how to control it, no meaningful way to control it that is consistent with its level of sensitivity, and no agreement on what significance it has for national security.

Third, the “deemed export” problem has also become worse. Under the federal Export Administration Regulations, a release of technical data or source code to a foreign national within the United States is “deemed” to be an export to that person’s home country. The category covers technical information or data conveyed verbally or by mail, telephone, fax, workshops and conferences, email, or other computer transmission. The underlying concept of deemed exports applies to “sensitive” technologies, but there is no good definition of what constitutes a sensitive technology. Unfortunately, such broad criteria have been adopted that they could apply to almost any new and promising technical development. And because no clear definition exists, laboratory personnel are narrowing the scope of international cooperation for fear that they may be violating deemed export regulations.

Fourth, we must be wary that efforts to protect classified activities do not unintentionally compromise fundamental research as well. As mentioned, classified science relies increasingly on unclassified science as a source of innovation and ideas. The commission believes that there is a strong need for clarification in the protection of information that is produced as a result of basic research within DOE and throughout the government. In particular, we call on President Bush to reissue National Security Decision Directive 189 (NSDD-189). First issued in 1985 by President Reagan, NSDD-189 is a solid framework for protecting fundamental (unclassified) research from excessive regulation. The directive states that fundamental research is generally exempt from security regulations, and that any controls can be imposed only through a formal process established by those regulations. Although the directive remains in force today, too few government and security management professionals know about it or use it as a guide. In this time of heightened security, reissuing NSDD-189 would be a small but significant step in providing guidance to government organizations for striking a healthier balance between open science and national security needs.

Although I understand and support the need for stronger security procedures, I see too many instances of inappropriate security procedures adopted in haste by government officials who fear criticism for inaction. In today’s increasingly dynamic society, security demands a disciplined, sophisticated analysis. I fear that without such an approach, heightened security restrictions will narrow the scope of creative interaction among U.S. scientists and technical personnel. We are currently debating the right security approach in many other areas as well, including restrictions on student visas, mandatory biometric tags on passports, restrictions on drivers’ licenses for noncitizens living legally in the country, and access to biological agents, to name just a few.

Careful attention to security after the September 11 attacks is justified and appropriate. But we must not seize on hastily conceived, poorly designed security procedures as an expedient in these urgent times. I believe that is precisely the mistake we saw at DOE. The Commission on Science and Security was assembled to examine the security policies and procedures made in that climate of fear and criticism. I worry that we are on the verge of making comparable mistakes now that would apply more generally in the United States. Now is the time to move to protect the country, but this must be done with prudent, reasoned security measures that provide the right tools and technologies for security professionals and preserve the openness and strength of our scientific institutions. This time, we all have a stake in getting it right.

Recommended Reading

  • Baker/Hamilton Commission, Science and Security in the Service of the Nation: A Review of the Security Incident Involving Classified Hard Drives at Los Alamos National Laboratory (Washington, D.C.: 2000).
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine, Committee on Balancing Scientific Openness and National Security, Balancing Scientific Openness and National Security Controls at the Nation’s Nuclear Weapons Laboratories (Washington, D.C.: National Academy Press, 1999).
  • President’s Foreign Intelligence Advisory Board, Science at Its Best, Security at Its Worst: A Report on Security Problems at the Department of Energy (Washington, D.C.: June 1, 1999).

Cite this Article

Hamre, John J. “Science and Security at Risk.” Issues in Science and Technology 18, no. 4 (Summer 2002).
