The new normal in science funding
Daniel Howard and Frank Laird make the important point that history demonstrates the difficulty of increasing federal research support without a major national priority such as the space race, the cold war, or concern about the health of an aging population. Hence, the authors call for important shifts both to accommodate the next generation of scholars and to produce quality research, including throttling back startup funding, stressing collaborative research and team building, and expanding the portfolio of research sponsors (e.g., business and foundations).
While these proposals make sense, it also seems prudent to continue the current effort to push for full funding of the America COMPETES Act, which recommends not only a doubling of basic research funding in physical science and engineering, but also a major investment in STEM education. Furthermore, additional recommendations of the recent National Academies report on research universities seem appropriate. They call for the federal government not only to fund what it has authorized, but also to restructure indirect cost recovery and regulatory burdens to reduce the institutional costs of research; to restructure graduate education to broaden preparation for non-academic and non-research careers; to use federal grants with matching requirements (from states, donors, and institutions) to invest in the next generation of scholars and needed campus research infrastructure; and to commit the nation’s research universities to an aggressive effort to constrain costs and enhance productivity in both research and graduate education.
Put another way, these two federal studies have adopted the philosophy of “advocating for something better while preparing for something worse” in approaching federal research support.
At a time when both the nation and the world are increasingly dependent upon new knowledge and well-prepared graduates for prosperity, social well-being, and security, it seems very important to continue to call for stronger government support of research for tomorrow, albeit accompanied by the modifications of the current research paradigm necessary to address the realities of today.
I found the article “The New Normal in Funding University Science” (Issues, Fall 2013) by Daniel Howard and Frank Laird to be provocative, logical, and pragmatic. The article points out that the supply of scientists in the United States has exceeded the demand. Certainly, if the country elects to underinvest in science, it will inevitably have too many scientists. The problem, of course, is that Americans may not like the outcome of such a strategy.
The solutions proposed in the article are sensible but, in my opinion, insufficient. They include greater collaborative use of research facilities and, especially, greater support of university research by business. One other avenue that might have been mentioned is for our state research universities to significantly intensify their efforts at building endowments through alumni contributions, much as have many of the nation’s private institutions. It is difficult to make the case that graduates of public universities are inherently less loyal than those of private institutions, particularly if one happens to venture into a football stadium on a fall Saturday afternoon.
Unfortunately, it is unlikely that the business cavalry is going to come riding over the hill. Eighty percent of financial executives say they would cut R&D to meet next quarter’s earnings projections. Responding to the short-term demands of the financial markets, industry, rightly or wrongly, has been abandoning research; the fate of the iconic Bell Labs is exhibit one. But even if industry were to increase its investment in university research by 300% and proportionately share the proceeds across the research spectrum, that would, in the case of NIH, for example, barely offset the federal budget reductions it has suffered in the past 10 years. Business does fund two-thirds of the nation’s R&D, but the emphasis is on the “D.”
As the authors properly point out, we should prepare for a generally flat R&D budget. But preparing for the most probable outcome should not crowd out efforts to bring about a more constructive outcome. I believe that the science community can do a far better job of making the case for federal investment in science, and this is an area where the business community can and must help. Other nations seem to be able to find a way to fund R&D. The United States ranks behind Israel, Korea, Finland, Japan, Sweden, Denmark, Taiwan, and Germany in the fraction of GDP devoted to R&D. Further, the United States ranks 29th in the share of its R&D that is funded by the federal government.
The argument, as I see it, goes as follows: (1) The quality of life of the nation’s citizens depends heavily upon their having quality jobs; (2) each percentage point increase in jobs is accompanied by approximately a 1.7 percentage point growth in GDP; (3) about three-fourths of America’s growth in GDP is attributable to advancements in science and technology; and (4) it is important to all citizens that the nation invest in science and technology.
During one of the many occasions I have testified before Congress on behalf of increased funding for university research, I was asked if I were unaware that the nation faces a budget crisis. I answered that I was indeed aware of that circumstance and went on to explain that as an aeronautical engineer I have worked on many airplanes that during their development were too heavy to fly—but never once did we solve the problem by removing an engine. Science and technology are the engines that propel modern economies.
Where’s the evidence?
Jeffrey Liebman (“Advancing Evidence-Based Policymaking to Solve Social Problems,” Issues, Fall 2013), referencing evidence-based-policy (EBP), writes that “most government spending is not allocated based on evidence or with a focus on innovation or performance.” He is right; federal government spending specifically on evidence-based programs accounts for less than a half-percent of nonmilitary discretionary programs—about $1.2 billion of a $670 billion budget. Dissatisfied with this, Liebman urges renewed efforts, claims that “cataloging successful practices can help them to spread,” and believes more can be achieved “if Congress and the president take action to support the adoption of effective approaches.” Although his ambitious recommendations are thoughtfully presented, he is silent on what might increase the odds of their success. We are left with a huge gap between what is and what could be—if only EBP were more widely used.
Maybe, however, policymakers are not as indifferent to research evidence as the experience of EBP suggests. Although EBP practices, especially experiments, generally provide more definitive answers than, say, focus groups or descriptive statistics, it does not follow that definitiveness is what policymakers find most useful. If we ask what kind of research findings policymakers do find useful, EBP practitioners have little to say, though they often repeat the familiar complaint that political interests and ideologies trump science.
This complaint explains the obvious. Anyone expecting policy to be made independently of conflicting political interests and ideologies doesn’t appreciate democracy, a form of government that deliberately puts competing, power-seeking politicians in charge. This does not mean that science and politics are locked in a zero-sum game. To the contrary: democratic policymaking is reason-giving, a continuous parade of cause and effect statements: “X is the better policy choice because it will have Y outcome.” When that “because” can be argued with relevant science, odds are high that it will be in the mix, though not in the apolitical, deterministic manner seemingly expected in the EBP model.
Perhaps we should focus less on EBP than on EIPA, that is, Evidence-Influenced Policy Argument. Policymakers, suggests the National Research Council committee report The Use of Science as Evidence in Public Policy (which I chaired), are “engaged in an interactive, social process that assembles, interprets, and argues over science and whether it is relevant to the policy choice at hand and, if so, using that science as evidence supporting their policy arguments. Policy argument as a form of situated, practical reasoning directly leads to a concern with how evidence, in the specific way now defined, is used rather than how it is produced.”
Studies of knowledge utilization offer insightful typologies of use, but typologies have not helped us understand the what, when, why, and how of actual use. The use of science in policy argument—irrespective of whether the science is biology, engineering, or sociology—is a social activity. Social science needs to investigate what makes for valid and compelling policy arguments as seen by policymakers and those they want to persuade. Endlessly discussing how science is produced is not a guide to whether it will be used.
The value of education
Mark Schneider’s “Does Education Pay?” (Issues, Fall 2013) makes several important points. Like the work of Tony Carnevale at the Georgetown Center on Education and the Workforce, Schneider highlights the critical role fields of study play in students’ subsequent economic life outcomes. Since mathematicians, engineers, and nurses tend to make more than musicians, artists, or journalists, students are well served to understand the relationship between choice of major and future earnings. Going to college, however, should encourage students to understand that there is a difference between making a living and making a life.
In the status-driven world of higher education, Schneider’s work highlighting the power of community college and other sub-baccalaureate credentials is critical. America’s community colleges are the unacknowledged powerhouse for U.S. workforce development. Schneider’s finding that technical/career-oriented associate degrees show higher first-year earnings than baccalaureate degrees, as well as the earning power of certificates, is a testament to the efficacy of this system.
Equally notable is his finding that field of study is more important than place of study, a result bound to displease private not-for-profit and flagship colleges that actively promote their status as a reason for higher tuition. There is silence, however, on the issue of the impact of good teaching by college faculty, whose contribution to student success is rarely computed. Community colleges emphasize good teaching as essential for students receiving degrees that are worth something.
There are three issues I would like to surface. One is that Schneider’s work does not address the cost to the colleges of producing the technical degrees. Technical associate’s degrees are more costly to deliver, with higher equipment, faculty, and curricular costs. Given wholesale decreases in state higher education funding, providing these degrees will continue to be a challenge.
The second is basing college choice on first-year earnings data. Certificates in manufacturing, construction trades, and health fields are great for entry-level jobs and are particularly critical for the adult population who need immediate employment. But let’s be honest. Most manufacturing, construction, and health fields are physically demanding, with few advancement opportunities without further education.
The third is the ultimately false dichotomy that Schneider’s article creates between liberal arts degrees and technical degrees. No technical degree is worth its salt if it doesn’t include exposure to multidisciplinary ways of thinking, understanding of global issues, or development of an ability to communicate and work with people. The Association of American Colleges and Universities’ survey of employers found that 74% of business and nonprofit leaders would recommend a 21st-century liberal arts education to a young person they know in order to prepare for long-term professional success in today’s global economy. We cannot allow an emphasis on first-year earnings to skew students’ choices to short-term economic gain, just as we cannot allow colleges to offer a narrow technical curriculum without a healthy dose of the liberal arts.
After reading “Does Education Pay?” by Mark Schneider, ask yourself, “Are there any surprises?”
Beginning in the late 1990s, many states developed memoranda of understanding for linked data systems, but these were limited in their usage to reporting on the federal Perkins career and technical education program and included only those institutions that receive Perkins funds. Schneider’s work expands on the capacity developed through the linked state data sets by adding four-year colleges and universities with their bachelor’s and graduate degrees.
We live in interesting times during which many college graduates cannot find jobs at the same time that there is a shortage of qualified candidates for middle-and high-skill positions. Schneider’s work is important because it addresses the reason why so many attend college: to qualify for a well-paying job.
The findings are especially insightful as federal and foundation support for science, technology, engineering, and math (STEM) initiatives continues to drive the allocation of faculty and financial resources at our schools, colleges, and universities. It confirms that an engineering degree offers new graduates the best chance of finding a well-paying first job. Schneider’s data support what has long been suspected in academe: Careers in the humanities and liberal arts are limited, and their economic benefits generally pale when compared with STEM and other technical careers. However, he also finds that those graduating with degrees in biology, the most popular field of science, earn no more than sociology or psychology majors.
I caution that we not put our higher education institutions at risk of becoming sterile learning environments devoid of the humanities and the creativity of the performing and fine arts; our free and democratic society needs an actively engaged citizenry informed by the lessons of history.
For those unfamiliar with community colleges, the message is clear. There are well-paying jobs and careers that do not require the four-year degree, and there are no guarantees that a college degree will result in a dream job.
There are some powerful “takeaways” from this article:
• Some short-term higher education credentials command higher salaries than bachelor’s degrees
• Higher education does not necessarily lead to higher salaries
• High college tuition does not necessarily lead to higher salaries
• The S in STEM may be overrated.
A major limitation of Schneider’s study, as he acknowledges, is that it reports only on the wage earnings from the first year after completing a degree or certificate.
There has been much work done in the individual states to develop components of the comprehensive data set described in this report; future efforts to expand the College Measures pilot project would be well advised not to duplicate the state systems but to enable their expansion to include all partners in the educational enterprise.
Prajwal Kulkarni (“Rethinking ‘Science’ Communication,” Issues, Fall 2013) gets a number of important things right. In particular, I appreciate his bold argument that people possess certain misconceptions about science (such as the idea that science proves things, or the idea that all scientists test hypotheses) in no small part because scientists have told them that these things are true. The desire to convince people that science is trustworthy can lead us into murky generalizations, and as Kulkarni points out, “a big-picture view of science will… hide more than it reveals.” Bravo. I also applaud him for encouraging scientists to focus on vivid and specific stories when sharing their work. This strategy prevents troublesome generalizations, and it also makes for more engaging communication that stands a better chance of revealing what is “wonderful and awe-inspiring” about scientific work.
Yet it is too early to claim that “there is no science anymore… only sciences.” Science may not be a very useful epistemic category these days, but it is still an important social category. We put more public money into science than any other sort of scholarship. Scientists are still among the most trusted groups in the country, at least according to the biennial Science and Engineering Indicators survey. Finally, when someone claims that a finding is based on scientific research, that claim still carries weight. The sociologist Thomas Gieryn, among others, has shown that struggles over the boundaries of science are often struggles over resources and credibility, and that regardless of what kind of science one does, the fact that people think of that work as science makes a difference. There are certainly limits to the social power and authority of science, but one need only watch the fireworks that erupt when one scholar calls another’s work “unscientific” to know that the label matters.
Kulkarni has given us half of the puzzle: Many different things bear the label science. The other half of the puzzle resides in the label itself. Without old tropes such as the scientific method to fall back on, how can we talk about what the science means and explain the resources and power that it is assigned within our society? Is there anything that people need to know about the overarching social category of science that will help them when they receive a new diagnosis or confront a science-related controversy? And where does all of this leave science educators, who are still faced with the task of explaining a thing called science to the next generation of scientists and citizens? Kulkarni recommends being specific and humble, telling the human stories of scientific work. I agree that this is crucial. But the story of science is also a story about our society: whom we trust, what we value, and how we pick particular questions to ask and answer. That story is not always simple or inspiring, but one cannot understand science without it.
In an otherwise insightful and thoughtful article, Sebastian Pfotenhauer (“Trade Policy Is Science Policy,” Issues, Fall 2013) might better have entitled his contribution “Trade Policy Needs to Be Reconciled with Science Policy.” The North American Free Trade Agreement (NAFTA) and the agreements administered by the World Trade Organization, particularly the General Agreement on Tariffs and Trade (GATT) and the Technical Barriers to Trade (TBT) agreement, were adopted to promote international trade and increase the economic benefits therefrom. Harmonization of environmental, health, and safety (EHS) standards and practices was generally not the goal of these agreements, except perhaps for the TBT agreement, which was predicated on EHS standards being based on “strong science” that could result in uniformity dictated by rigorous scientific consensus focused on risk assessments.
NAFTA does not pretend to aspire to harmonization of EHS standards and practices, but rather encourages its three member countries to enforce their own laws, which differ in their approach to health, safety, and the environment. In the GATT, exceptions to “non-tariff barriers” are specifically reserved for those measures in national law that are necessary to protect human, animal, or plant life or health; necessary to secure compliance with other laws or regulations that are not inconsistent with the provisions of the GATT; and that pertain to conservation of exhaustible living and nonliving natural resources. Rather than leaving the impression that EHS standards have generally been “declared a barrier to trade,” Pfotenhauer might instead have emphasized that France’s ban of certain asbestos products was actually declared permissible in a case brought by Canada against the European Union, and that the U.S. ban on shrimp caught along with endangered turtles would have been regarded as permissible were it not for the fact that the United States did not treat restrictions on shrimp imports from Asia the same as shrimp imports from other countries. These are very important decisions of the WTO under the GATT that are likely to be controlling legal precedent in future disputes. Particularly relevant is the established principle from these cases that nations are entitled to set the degree and nature of precautionary protection in their own EHS standards and practices without violating WTO trade agreements.
However, most disturbing was the opinion of the WTO appellate board in the asbestos case that the dispute should have been decided under the TBT agreement, rather than under the GATT. Under the TBT agreement, as mentioned above, technical standards are to be based on “science,” although uniformity among nations is not required. To the extent that what comprises strong science might be agreed on by scientific bodies employing traditional scientific standards for compelling evidence, a compromise of the precautionary principle might well be expected, and uniformity could theoretically emerge, although the latter is not guaranteed.
Finally worth mentioning is the political influence that lobbying efforts by U.S. companies have had and continue to have on the adoption of European Union EHS standards and practices. This was most obvious in the case of the European Union’s Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation and the current flurry of activity in Brussels in the energy and global climate area. Of course, one might say that trade policy has always been political, but when commercial and economic interests are allowed to trump legitimate and serious concerns about health, safety, and the environment, including those related to energy production and global climate problems, advocates for the environment and public health should press for adequate protections that are not held hostage to these commercial interests. The devil of the U.S.-EU trade agreement will ultimately be in the details of the agreement still to be worked out.
“The Future of Meat” (Issues, Fall 2013) explores the intersection of food, resources, and an emerging technology with the potential to completely transform our society and economy. The authors appropriately focus on scenarios—the what-ifs that help delineate ignorance—instead of on details that are as yet unknowable. Through those scenarios we can hope to grapple effectively with the interplay of rapidly changing technology, resource constraints, and climate change, all in the context of an increasingly interconnected world. The constraints imposed by population, land use, water, and food are rising in public awareness.
We have escaped such constraints before; perhaps we will do so again. The U.S. Department of Agriculture calculates that, largely because of technology that replaces labor, the total factor productivity of U.S. agriculture is the most improved of any sector in the economy over the past century. Thus the nation continues to fend off the warnings of Malthus and Ehrlich. At some time hence, carneries, as the authors call meat factories, may contribute to a world in which we look back on field-grown meat for consumption as we now do horses for transportation: quaint, slow, and unsuitable for sustaining a planet full of humans. Carneries may create great upheaval in labor markets, supply chains, and the very structure of society.
Given the scope of the potential impact, the authors make a broader call for “systemic evaluation of early-stage technologies with significant potential for societal transformation.” I have great sympathy for this endeavor, as my own research has moved ever further from investigating molecules and cells toward exploring how biotechnologies are developed and deployed in the economy. Just estimating the size of the U.S. bioeconomy is fraught with methodological peril because the government does not adequately measure relevant parameters such as revenues or employment in the biotechnology industry. Yet, once armed with even a basic set of numbers, we can begin to sort out the societal impacts of investment in technologies that transform the way we produce and consume.
For example, in 2010 U.S. revenues from genetically modified systems, including drugs, crops, and industrial products, were about $300 billion, or the equivalent of approximately 2% of GDP. In 2012, revenues rose to $350 billion, growth of more than 15% over the two-year period. Biotech drugs generated revenues of $100 billion, GM crops of $125 billion, and industrial products (fuels, enzymes, and materials) of $125 billion. Of the industrial component, biologically produced chemicals and biofuels generated $66 billion and $30 billion in revenues, respectively, which may contribute to conversations about the larger societal impacts of agricultural subsidies and of the Renewable Fuel Standard. Moreover, biotechnology now generates in the neighborhood of 40 to 60% of total revenues from U.S. agriculture, forestry, and fisheries. Thus, biotechnology, a relatively young commercial technology that may both illustrate and influence the future developmental path of factory meat, is already creating substantial social transformation even though we are not adequately measuring or discussing its effects.
The technologies that will enable commercializing cultured and printed cells are evolving quite quickly. I recently had the opportunity to handle leather samples from Modern Meadow. The swatches of cultured and cured cow skin came in a variety of thicknesses and textures, and to my untrained senses encompassed a range of leathers found in both fine shoes and less refined jackets or bags.
Producing animal-free leather is, in terms of technology, economics, and regulation, a long way from factory-grown meat. But putting that leather on our feet constitutes a first step toward eventually consuming meat grown in vats. As Mattick and Allenby describe, the economic case for this technology is compelling, and we need to get started considering the implications.