Forum – Fall 2015

Reevaluating educational credentials

Mark Schneider’s work on the wide variation in the economic value of postsecondary educational programs, as described in “The Value of Sub-baccalaureate Credentials” (Issues, Summer 2015), is of great importance because it reflects new labor market realities that affect nearly everyone in the United States. Before the 1980s, high school was enough to provide middle-class earnings for most people. In the 1970s, for example, nearly three in four workers had a high school education or less, and the majority of these workers were still in the middle class. But that high school economy is gone and not coming back. Nowadays, you go nowhere after high school unless you get at least some college first. The only career strategy more expensive than paying for college is not going to college.

As the relationship between postsecondary programs and labor markets has become stronger, it has also become more complex. The economic value of postsecondary degrees and other credentials has less and less to do with institutional brands and more and more to do with an expanding array of programs in particular fields of study. Degrees and other postsecondary credentials have multiplied and diversified to include traditional degrees measured in years of seat time; bite-sized credentials that take a few months; boot camps, badges, stackable certificates, and massive open online courses (MOOCs) that take a few weeks; and test-based certifications and licenses based on proven competencies completely unmoored from traditional classroom training.

The new relationship between postsecondary education and the economy comes with new rules that require much more detailed information on the connection between individual postsecondary programs and career pathways:

Rule No. 1. On average, more education still pays. Over a career, high school graduates earn $1.3 million, a B.A. gets $2.3 million, a Ph.D. gets $3.3 million, and a professional degree gets $3.7 million.

Rule No. 2. What you make depends a lot less on where you go for your education and a lot more on what you study. A major in early childhood education pays $3.3 million less over a career compared with a major in petroleum engineering.

Rule No. 3. Sometimes less education is worth more. A worker with a one-year computer certificate can earn up to $72,000 a year, compared with $54,000 for the average B.A.


As program value spawns new credentials and training venues, the market signaling from postsecondary programs to students, workers, and employers becomes a Tower of Babel. Today, there is a need for clear, comprehensive, and actionable information that connects postsecondary education options with labor market demand. The nation has built a vast postsecondary network of institutions and programs with no common operating system that links programs to careers. To get a better handle on the big black box that postsecondary education and training has become and address the inefficient and inequitable use of education and workforce information, we need a new approach.

Anthony P. Carnevale

Research Professor and Director

McCourt School of Public Policy

Georgetown University Center on Education and the Workforce

Revisiting genetic engineering

In “Regulating Genetic Engineering: The Limits and Politics of Knowledge” (Issues, Summer 2015), Erik Millstone, Andy Stirling, and Dominic Glover accurately criticize some of the arguments set forth by Drew L. Kershen and Henry I. Miller in “Give Genetic Engineering Some Breathing Room” (Issues, Winter 2015). Millstone et al. correctly point out that Kershen and Miller oversimplify when they say that genetic engineering (GE) does not need government oversight. However, Millstone et al. also mislead their readers by asserting their own generalities and biases about GE crops. Both articles fail to provide evidence about current GE crops or to acknowledge that the real question is not whether GE technology is safe or unsafe, but whether particular applications are safe and beneficial when assessed on an individual basis.

The Center for Science in the Public Interest (CSPI) is a nongovernmental organization whose Biotechnology Project takes a nuanced, fact-based approach that falls into neither the “for” nor the “against” camp. CSPI has stated that the current GE crops grown in the United States are safe, a position consistent with a growing international consensus. The same conclusion has been reached by the National Academy of Sciences, the U.S. Food and Drug Administration, the European Food Safety Authority, and numerous other scientific organizations and government regulatory bodies. That says nothing, however, about future GE products, the safety of which will need to be assessed on a case-by-case basis.

There is ample evidence that GE crops grown in the United States and around the world provide tremendous benefits to farmers and the environment. For example, Bt cotton has significantly reduced the use of chemical insecticides in the United States, India, and China. Although GE crops are not a panacea for solving food insecurity or world hunger, GE is a powerful tool scientists can use to create crop varieties helpful to farmers in developing countries.

Although current GE crops are safe and beneficial, government oversight is essential, and the current U.S. regulatory system needs improvement. In particular, the Food and Drug Administration has a voluntary consultation process rather than a mandatory premarket oversight system similar to those found in the European Union, Canada, and other countries. Congress should enact a premarket approval process to ensure the safety of GE crops and instill confidence in consumers.

CSPI acknowledges the negative effects on agriculture and the environment from the use of some current GE crops. Glyphosate-resistant weeds and resistant corn rootworms are a direct result of the overuse and misuse of GE seeds with unsustainable farming practices. Resistant weeds and insects force farmers to revert to using pesticides and farming practices, such as tillage, that are more environmentally harmful. The solution, however, is not taking away GE seeds but requiring better industry and farmer stewardship. Farmers using GE crops must incorporate them into integrated weed- and pest-management systems in which rotation of crops and herbicides is required.

As with other technologies, society’s goal should be to reap the benefits and minimize the risks. The future of GE crops should be guided by facts and case-by-case assessments, not by general arguments from proponents or opponents.

Gregory Jaffe

Biotechnology Project Director

Center for Science in the Public Interest

Keeping fusion flexible

Robert L. Hirsch’s article “Fusion Research: Time to Set a New Path” (Issues, Summer 2015) is informative and thought-provoking. The issues he addresses, including economic viability, operational safety, and regulatory concerns, are important and require closer examination. His analysis makes a convincing, fact-based case for the need to examine the merits of current fusion efforts supported by public funds. As of June 2015, the United States has invested $751 million in the International Thermonuclear Experimental Reactor (ITER) tokamak project. Given that investment, the public should be able to access the ITER team’s findings regarding the issues Hirsch points out. Greater disclosure would allow a more meaningful dialogue regarding the merits of the current publicly funded fusion research and development (R&D) path.

Based on my experience managing a private fusion company, I believe the current fusion funding landscape will be an important factor in the education of the next generation of fusion scientists and engineers. Over the past decade, several privately funded startup companies have sprung up in the United States and elsewhere in pursuit of practical fusion power based on approaches radically different from the tokamak. The emergence of these startups is largely due to past technical progress in fusion research stemming from a diverse portfolio of approaches supported by the government. These companies have generated a significant number of jobs despite the fact that their combined budget is only about 10% of that of government-funded fusion programs. However, they face a common challenge in filling critical technical roles, as the talent pool of young scientists and engineers with diverse backgrounds in fusion research is dwindling.

In the federal fusion energy science budget for fiscal year 2015, the lion’s share of funding is directed toward a single fusion concept: the tokamak. Combined with the $150 million allocated to the ITER tokamak program, the total funding for tokamak-specific R&D amounts to $361 million. In comparison, only $10.4 million goes toward R&D on high-pressure compact fusion devices, the type of approach pursued by all but one private fusion company because of its compact size, low-cost development path, and potential for economic viability. In mid-2015, the Advanced Research Projects Agency-Energy announced that it would provide one-time funding of $10 million per year over three years for this work. This will provide some relief to support innovation in fusion, but it is far from sufficient. Such lopsided federal fusion spending creates a huge mismatch between the needs of the nascent but growing private fusion industry and the focus of government-supported fusion R&D. Although the tokamak has produced the best-performing results to date, ITER itself projects that widespread deployment of practical fusion power based on the tokamak will not begin until 2075. This timetable suggests that the nation must continue to support diverse approaches to improve the odds of success.

Over the past couple of years, I have had the opportunity to share our own results and progress with the public. It has been encouraging to me that the public, on balance, views fusion research as a worthy endeavor that can one day address the world’s need for sustainable and economical sources of power. People also understand the challenges of developing practical fusion power, yet by and large the public is willing to remain a key stakeholder in support of fusion research. It is thus imperative for the fusion research community to keep its focus on innovation while being judicious in its spending of public dollars. In that regard, I think Hirsch’s article is very timely and deserves the attention of the fusion research community and the public at large.

Jaeyoung Park

President

Energy Matter Conversion Corporation

Since leaving the federal government’s magnetic confinement fusion program and the field in the mid-1970s, Robert Hirsch has contributed a series of diatribes against the most successful concept being developed worldwide in that program. What is surprising is not the familiar content of this latest installment, but that it was published in Issues, a journal seeking to present knowledgeable opinion in this area.

As for the article, Hirsch complains that the tokamak uses technologies that have been known to fail sometimes in other applications, notes that the ITER tokamak presently under construction is more expensive than a conventional light-water nuclear reactor that can be bought today, and concludes with a clarion call for setting a new path in fusion research (without any specifics except that it lead to an economical reactor).

Components do fail, particularly in the early stages of development of a technology. Hirsch mentions, for example, superconducting magnets failing in accelerators, causing long downtimes for repair, and plasma-disruptive shutdowns in tokamaks. This argument ignores the learning curve of technology improvement. Bridges have collapsed and airplanes have crashed, with much more disastrous consequences than a tokamak shutting down unexpectedly would have, but improvements in technology have now made these events acceptably unlikely. Why can the same technology learning curve not be expected for magnetic fusion technologies?

Hirsch’s economic arguments, based on a comparison of the estimated costs of ITER and of a Westinghouse AP-600 light-water nuclear reactor, are disingenuous (at best) and completely ignore both the learning curve and the difference in purpose between ITER and an AP-600. ITER is an international collaboration entered into by the seven parties (the United States, the European Union, Japan, Russia, China, South Korea, and India) for sharing the expense of gaining the industrial and scientific experience of building and operating an experimental fusion reactor, most of the components of which are first-of-a-kind and therefore require the development of new manufacturing procedures and a large and continuing amount of R&D. Each of the parties wants to share in this experience for as many of the technologies as possible. To achieve the ITER collaboration in the first place, an extremely awkward management arrangement was devised, including in-kind contribution of the components and the requirement of unanimity among all parties on all major decisions. By contrast, the AP-600 benefits from a half-century learning curve in which hundreds of light-water reactors have been built and operated, many of them by the single industrial firm (Westinghouse) that offers the AP-600. A more meaningful comparison would be to cost an AP-600 to be built in the 1950s (escalated to today’s dollars) by the same type of consortium as ITER, involving the same parties with the same purpose, and requiring the development in 1950 of what would be first-of-a-kind components of the present AP-600.

Weston M. Stacey

Regents’ Professor of Nuclear Engineering

Georgia Institute of Technology

Climate clubs and free-riding

“Climate Clubs to Overcome Free-Riding” (Issues, Summer 2015), by William Nordhaus, falls short by several measures, and in the end is unworkable.

First, its author claims that climate change agreements such as the Kyoto Protocol suffer from the free-rider dilemma of collective goods. However, the protocol had problems almost from the beginning. The United States defected before the protocol was fully defined, implemented, and ratified. A cap-and-trade system covering all of the participants was never implemented. And the European trading mechanism, which was supposed to pave the way for such a system, was flawed and never worked properly. It is therefore likely that it was not free-riding that destroyed the Kyoto Protocol, but rather the failure to set up properly functioning institutions.

Second, the modeling framework set up by Nordhaus, even though commendable as a theoretical tool, remains a blunt instrument. To reach workable results, a number of simplifying assumptions are included: Discount rates are the same for all countries. The trade sectors are rudimentary and do not account for exchange rate fluctuations, which are often more important than tariff barriers. Retaliatory trade measures and the institutional rules of the international trade system (under the World Trade Organization) forbidding tariff hikes are ignored. All this would not matter if the parameters representing these aspects did not play a major role. But they do influence the results, as has been widely noted. Moreover, the Dynamic Integrated Climate-Economy (DICE) model uses the standard assumption that countries constitute unitary agents. If we were facing a world of homogeneous countries, this would hardly matter. But the international system is composed of a relatively small set of very big powers that have a disproportionate influence on the evolution of world politics. For them, domestic considerations are as important as, if not more important than, international ones, and internal coalitions strongly constrain their policies.

Third, it appears that, just as domestic lobbies hurt by liberalization try to oppose and defeat it politically in the international trade regime, the same dynamic pertains to environmental agreements. The United States exited the Kyoto Protocol because of the influence of the fossil-fuel lobby and its stranglehold on the Republican Party, a situation that persists today. Similar but more hidden influences exist in other powers (the European Union and China). Is there a way out of this situation? A useful analogy is the Montreal Protocol to eliminate ozone-destroying gases, a successful agreement. The protocol was made possible because a relatively cheap substitution technology existed for refrigeration gases, which made it difficult for manufacturers to coalesce to fight the treaty. If a cheap alternative to fossil fuels were found, a similar outcome could be obtained: the substitution technology would spread, and the financial back of the fossil-fuel lobby could be broken. Is there some hope for this? Yes: renewable energy technologies are getting ever cheaper, and combined with more efficient energy storage, they could threaten the supremacy of fossil fuels.

Urs Luterbacher

Professor emeritus

Centre for Environmental Studies

Centre for Finance and Development

Graduate Institute of International and Development Studies

Geneva, Switzerland

Technology governance alternatives

In “Coordinating Technology Governance” (Issues, Summer 2015), Gary E. Marchant and Wendell Wallach present a compelling argument for the need for a Governance Coordination Council (GCC) to correct a key deficiency in oversight of emerging technologies in the United States. Specifically, the authors say that the GCC would “give particular attention to underscoring the gaps in the existing regulatory regime that pose serious risks. It would search, in concert with the various stakeholders, methods to address those gaps and risks.” In light of the incredible recent advances in technologies that are changing the physical and natural world and life itself, I fully agree with their call to better synchronize funding, regulation, and other policy actions to make more reflective and deliberate decisions about emerging technologies. To date, these decisions have been piecemeal and delayed. Current approaches have left interest groups, academics, practitioners, and product developers frustrated, at best.

Visions for changing governance are as varied as the scholars who have written about them. Coordinating mechanisms such as a GCC have been proposed by others, including me and my colleagues. In particular, in 2011, we reported on a four-year project, funded by the National Science Foundation, that analyzed five case studies of governance and resulted in our calling for “an overall coordinating entity to capture the dimensions of risk and societal issues…as well as provide oversight throughout the life-cycle of the technology or product.” As with Marchant and Wallach’s GCC, we suggested that a coordinating group use foresight to anticipate and prepare for future decision-making by funding risk- and policy-relevant research and elucidating authorities well before regulatory submission of products.

However, there are some key differences between the GCC model and ours: 1) the authority that the coordinating group would have, 2) the role of the public(s), and 3) the overarching institutional structure. In our model, we proposed that the Office of Science and Technology Policy should take the lead on convergent and emerging technological products, and we stressed that it should have the authority to mandate interagency interactions and to ensure that stakeholder and public deliberations are incorporated into agency decision-making. I fail to see how a coordinating group such as the GCC, without access to government resources and legal mechanisms, could add more to what already exists in the form of think tanks, academic centers, and other advisory groups that convene diverse stakeholders and provide input into policies. A coordinating mechanism needs sharp political teeth, as well as independence from undue influence; hence the dilemma.

We also suggested having three groups working together with equal power: an interagency group, an advisory stakeholder committee, and a citizen group that would speak for the results of wide-scale public deliberation. Thus, our model rests on a central role for the public(s) that are often marginalized from discussion and decisions. The three groups would help to focus national resources toward technologies that are most desired by the taxpayers who fund them. The Marchant-Wallach GCC model takes on a more hierarchical structure, with staff of the GCC and the stakeholders they convene holding a significant amount of top-down power. In contrast, our model would be more networked and bottom-up in structure, with information and viewpoints from citizens feeding into a process that has legal authority. It is debatable which approach is best, and like Marchant and Wallach, I believe it is a crucial time to test some options.

Perhaps most important, the nation will be stuck with past models until we acknowledge and challenge the elephant in the room: the lack of political will to make a change. Currently, the vast majority of power lies in the hands of technology developers who can fund political campaigns. Until high-level policymakers are willing to consider alternatives to a largely neoliberal approach in which technological development and progress take precedence above all else, the rest of us will be resigned to watching the world change in ways that we do not necessarily want.

Jennifer Kuzma

Goodnight-NCGSK Foundation Distinguished Professor

School of Public and International Affairs

Co-Director, Genetic Engineering and Society Center

North Carolina State University
