Securing U.S. Research Strength
“The more things change, the more they stay the same” applies as much today as it did in the 1980s to the U.S. effort to preserve the nation’s leadership in science and technology. In the mid-1980s, the main requirements for preserving U.S. leadership included the need to reshape the research system to accommodate the growing demand for multidisciplinary studies; the need to attract more students to science and engineering; the need to increase investment, including a doubling of the government’s share of nondefense basic research; and the need to leverage federal support of that research by stimulating funding from industry and from state and local governments.
By some measures, these needs have been met, and the basic research base has improved significantly. Federal support for basic research in universities has increased from $5 billion to $13 billion during the past 15 years. Universities have been able to increase funding for their own research, with 20 percent of university research now being self-funded. And industry has increased both its support for and–more important–its dependence on basic research conducted in universities.
The results are evident: The United States has maintained its place at the forefront of science and technology. The nation is leading in the biological sciences, a field that has undergone a massive expansion in recent years. It is also leading in the computational sciences and the early development of nanotechnology. Universities and federal research agencies have recognized the value of and necessity for multidisciplinary research, as attested by a growth of research centers far beyond what was imaginable earlier. The competitiveness of many strategically important industrial sectors, including communications, semiconductors, manufacturing equipment, and pharmaceuticals, has improved.
More important, industry, government, and universities are increasingly interacting and communicating in a spirit of collaboration and cooperation to find solutions to problems. Universities, especially since the passage of the Bayh-Dole Act, have added technology transfer to their responsibilities, and commercial spinoffs from these institutions are common.
Adding to this positive development is the recent RAND report Federal Investment in R&D, requested by the President’s Council of Advisors on Science and Technology, which reviewed the past 25 years of federal R&D funding. A number of observations emerge from the data.
First, no matter how we look at R&D funding–either in actual dollar terms or as a percentage of gross national product (GNP)–the past decade has seen major increases, from $188 billion to $265 billion, an average increase of almost $8 billion per year. Similarly, the amount invested as a percentage of GNP, 2.7 percent, is at its highest level since 1984. Perhaps an even more remarkable trend is the shift in industry’s share of R&D funding. In the 1980s, funding was split equally between industry and the federal government, but today industry funds 68 percent and the government 24 percent.
There are also causes for concern in the details of the report.
The inverse of the previous observation is that federal support for research as a percentage of the total is declining, which leaves research funding more dependent on the country’s short-term economic fortunes.
The disciplines receiving major research funding increases were the life sciences (a threefold increase in 20 years) and mathematics and computer science (a fivefold increase over the same period). In contrast, funding increases for the other sciences and engineering were in the 20 percent range during this period.
Finally, numerous competitor nations have made greater advances than the United States in terms of developing human resources for science and technology. Many countries in the European Union and Asia have exceeded U.S. degree production in the natural sciences and engineering. Europe overtook the United States in degree production in 1988 and has stayed ahead, and Asia pulled ahead in 1998. During this same period, U.S. degree attainment in these fields has declined.
These last examples point to the need to be alert and face the challenges in front of us. First, although industry is now the predominant funder of the nation’s R&D, most industrial R&D today consists of development. The past two decades have seen the downsizing and, in some cases, the demise of large industrial research laboratories, which historically contributed greatly to this country’s discoveries and inventions, in many cases in partnership with U.S. universities and national laboratories. As a consequence of these changes, we can expect that, over time, basic research will account for a lower percentage of total R&D funding than in the past.
Second, and of increasing concern, is the unequal funding of the life sciences compared with the other sciences and engineering. To make progress, research in all fields needs to advance at about the same rate, because one field depends heavily on insights from others. The life sciences depend on progress in materials, computer science, instrumentation, and electronics. The Human Genome Project was highly dependent not only on automated instrumentation but also on database organization; and the project now depends heavily on computational modeling, simulation, and visualization.
I don’t want to suggest that the focus on the life sciences was misplaced. Quite the contrary: Breakthroughs in understanding the basics of the life sciences justified extraordinary federal support. Nor do I want to suggest that the other disciplines now deserve their turn. The important point is that rapid progress in the life sciences has opened up opportunities in, and especially new insights into, the other disciplines that can now be exploited. The wholly new field of bioinformatics, the development of nanotechnology, and the development of biochips are examples of such opportunities.
No other issue, however, compares in seriousness to that of the deficit in human resources. For the past 30 or 40 years, the United States has substantially depended on the brainpower of people who came here as immigrants–students or faculty, permanently or temporarily. But as other countries continue to build their own bases in science and technology and increase their levels of industrialization, fewer qualified people will come to the United States or stay here. In addition, U.S. companies are sometimes moving operations to foreign countries explicitly to take advantage of the increasing number of highly educated people abroad. The United States could eventually be faced with a reduced science and technology base.
There is no easy answer to this problem. Increasing awareness is an indispensable step. Among practical actions, besides increasing support for K-12 science and math education, federal agencies could develop programs that provide financial help to prospective U.S. students in the sciences and related fields, and Congress could scrutinize the regulatory and tax disincentives that industry faces when considering domestic development activities. And why should the research tax credit accrue to a U.S. company when the work is performed overseas?
More attention should also be paid to international science. The United States needs to be more of a participant than it has been in programs that either require extraordinary funding, such as nuclear fusion, or can be carried out only through international cooperation, such as environmental science. Equally important, as other nations increase their research capacity and capability, the United States needs to be aware of and take advantage of their progress.
The United States faces major challenges in maintaining the competitiveness of its research system. We must focus on human resources, on research funding sources, on balancing support for disciplines, on retaining our industrial development and manufacturing base, and on playing a proper role in international science. We neglect this agenda at our peril.