Forum – Winter 2012
“Paying for Perennialism” by Sarah Whelchel and Elizabeth Popp Berman (Issues, Fall 2011) calls attention to an important area of research that, if successful in its goals, will enhance agricultural sustainability in the face of growing world demand for food and the many challenges of a changing climate. Although past research to develop perennial grains has been slow going, today’s genomics-based tools are enabling breeders to work much faster and attempt ambitious projects not previously possible.
The article understates the role that federal research is playing to move perennialism forward. Since the early 1900s, research has provided new tools to identify ideal traits in plants and more efficiently breed them into crops. The U.S. Department of Agriculture’s (USDA’s) Agricultural Research Service (ARS) has not only had a formative role in the recent developments and applications of agricultural genomic technology, but has contributed to a wide range of other advances, helping keep our farms productive and our food system safe and secure. This research complements and supports related research by public, private, and foundation partners.
Although the USDA is facing the same budget challenges felt across the nation, researchers are certainly not “walking away from work on perennials,” as the article suggests. There are formidable challenges to developing crop perennial types with winter hardiness, substantial yields, pest protection traits, and desired end-use qualities. The technical ability to dissect the genetic basis of perennialism and to apply breeding advances that are working so successfully in annual crops is only now becoming possible. At ARS, we are initiating research on the application of new technologies, including whole genome sequencing, genome-wide association studies, and rapid genetic selection methods, to perennial improvement.
ARS plant geneticists such as Ed Buckler (co-located at Cornell University) are leading the effort to make perennial corn production a reality. Among a broad array of crop genetic improvement projects, Buckler and his team are working to dissect perennialism by exploiting new genomic information and inexpensive DNA sequencing. They are also initiating experiments with Tripsacum, a genus closely related to the corn genus, Zea, to clone the genes needed for winter tolerance in the U.S. Midwest.
In Raleigh, North Carolina, ARS maize geneticist James Holland is working to design a breeding scheme to develop perennial corn, exploring the possibilities of intercrossing domesticated corn and its perennial relative wild teosinte.
In North Dakota, Fargo-based ARS scientist Jason Farris is discovering domestication genes in wheat, and Brett Hulke is evaluating perennials in the USDA sunflower germplasm collection for disease resistance and working with crop breeders to introduce these resistance genes into cultivated sunflower.
Work on perennials is also happening at other ARS laboratories across the country, in places such as Lincoln, Nebraska; Griffin, Georgia; and Kearneysville, West Virginia. While improving crops through genomics and breeding techniques is the first step in getting new varieties into the hands of producers, research will also be needed to determine how improved perennials respond to different agronomic practices and natural resource management, so that the potential represented in genetically improved crops is actually realized under sustainable production.
Underlying all crop improvement research are the conserved genetic resources in the National Plant Germplasm System, which includes perennial grain accessions. This extensive USDA collection, which importantly involves our land-grant university partners, is used by plant breeders across the nation and world to enhance the potential of crops, with benefits to our food security, food safety, nutrition, and the environment—far beyond the “well-worn path of agricultural research and production” stereotype of maximizing yields at all cost, as suggested in the article.
USDA research on perennialism is but one part of a much larger portfolio of agricultural science that is taking a multipronged approach to solving critical food, agricultural, natural resource, and sustainability challenges faced by the United States and the world. This broad portfolio, conducted and supported by ARS and its sister USDA agency, the National Institute of Food and Agriculture, together with many partners, is coordinated by USDA Chief Scientist Catherine Woteki.
Within this system, particular roles of ARS are to conduct long-term, high-reward research supporting a diversity of production system approaches and engage in precommercial, foundational research where the private sector is not involved.
There is always more that can and must be done to advance this research. In the face of a growing global population and growing demand for food, sustainability will require the kinds of technological innovation that only result from dedication and coordination. Achieving sustainability also requires sustained investment, and now more than ever agricultural research needs continued and enhanced public support. Smart investments in agricultural research today will pay dividends to our world tomorrow.
Wes Jackson and I have much in common, starting with the fact that we are both enjoying life in our seventh decade. And we both come from Midwestern farm stock, he from Kansas and I from Iowa. Wes, however, grew up in wheat country, and I was in corn country. Maybe this is why I did not catch Wes’s dream of perennialism earlier in a professional career spanning more than five decades in soil science and related fields.
It was only when I moved beyond the halls of peer-reviewed academia to direct the Leopold Center for Sustainable Agriculture that I could see Wes’s forest. Now it is obvious that a perennial agriculture has a place in row crop agriculture, even in Iowa, where almost all the cultivated land is in corn and soybeans, and heaven help the poor farmer who might suggest otherwise at the local coffee shop.
Neither Wes nor I would, I think, advocate a 100% perennial landscape if we are serious about food crop production and maintaining economic viability. But the current barren landscape would benefit greatly from patches of perenniality that also provide food and income. In fact, that was the agriculture of old, the one of Grant Wood paintings. Pasture dotted the hilly land; alfalfa (a legume perennial grown in three- to four-year rotations) supported dairy herds; and beef herds and trees lined the streams. Fence rows that kept the neighbor’s cattle from straying were filled with diverse plants harboring beneficial insects.
Except for a few pockets of sanity, those landscapes are gone and probably will not return. Driven by economics, the big industrial farm stranglehold on agriculture has pushed soil erosion over tolerable limits, loaded the streams with sediment and nitrate, and depopulated the countryside. We have created an agriculture so risky that when things go wrong, as they often do in a world of changing climate, agriculture is too big to fail and must be a major part of the federal farm bill.
Perennials could fill a huge void here. They would add diversity, both financial and biological. If used wisely, proper perennial crops would greatly lower erosion. Carbon sequestration would be greatly enhanced. Benefits would be huge beyond the costs. But it is obvious that big federal research dollars will not go to perennial crops, no matter how much pleading is done and appeals to common sense are made by well-meaning folks. But that does not mean that there are not pockets of excellence out there in the research plots and labs of the land grant universities and the USDA.
Visionaries such as Wes Jackson are needed now more than ever. But we are neither training visionaries nor even allowing them to dream. Instead, they publish or perish, pile up grant dollars to fill an ever-deepening portfolio black hole, and are rewarded with who’s-who plaques. So who will be the dreamers of the future? The National Science Foundation should try to identify them now and get them headed on the road to saving our world, because academia is not doing it very well.
Improving S&T policy
In “Science Policy Tools: Time for an Update” (Issues, Fall 2011), Neal F. Lane is correct in pointing out that our policies for managing science are in need of refurbishment, having largely been designed for a world that existed over a half-century ago. Perhaps the greatest change since that time has been characterized by Frances Cairncross, writing in The Economist, with the words “distance is dead.” Ironically, this phenomenon, also known as globalization, was itself brought about by advances in science and technology.
Nearly all the major problems confronting the world today depend heavily on science for their solution. These range from ensuring quality health care to the provision of energy; from preserving the environment to defending against terrorism; and from building the economy to supplying food and water to all the world’s citizens.
But in the halls of our nation’s capital, one is far more likely to encounter a lobbyist for the poultry producers of America than anyone involved in science. Further, as Lane notes, the media is much too busy providing entertainment to address long-term pursuits such as science. (Of course, it could be noted that no one has told scientists that they cannot enter the political arena.)
In decades past, U.S. industry supported a considerable portion of the nation’s basic research, but today’s marketplace demand for short-term gains has all but eliminated that commitment (exhibit one: Bell Labs). Similarly, our nation’s universities have in the past been strong underwriters of research; however, these institutions increasingly face alternative financial demands. The federal government has thus become the default source for funding any research endeavor characterized by high risk, long duration, and results that may not accrue to the performer or funder but rather to society as a whole. But with an exploding deficit, federal funds for research are increasingly difficult to find.
A future shortage of scientists (and engineers) in the United States thus appears highly unlikely, but only in part because of the competition for funds. More significant is the fact that we have created a self-fulfilling prophecy. U.S. industry, for example, has discovered that it can move its research abroad, just as it did its manufacturing. That is where most of its customers are going to be anyway. Further, following a parallel philosophy, our universities are now expanding abroad. In the case of industry, with its newly created network of research laboratories around the world, when it finds some aspect of operating in the United States, such as export regulations, too onerous, it can simply bypass the U.S. laboratories and perform the work in its own facilities overseas.
The problem, of course, is that for many decades 50 to 85% of the growth in U.S. gross domestic product has been attributable to advancements in science and engineering. Given that only 4% of the U.S. workforce is composed of scientists and engineers, it can be argued that every 1% of that group is supporting some 15% of the growth of the overall economy and on the order of 10% of the increase in jobs.
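The arithmetic behind that claim can be made explicit. As a back-of-the-envelope sketch (the 60% midpoint below is an illustrative choice from the 50 to 85% range cited above, not a figure from the letter):

```latex
% Illustrative back-of-the-envelope check; the 60% midpoint is an
% assumed value drawn from the 50--85% range quoted in the text.
\[
\frac{\text{share of GDP growth attributable to S\&E}}
     {\text{S\&E share of the workforce}}
\;\approx\; \frac{60\%}{4\%} \;=\; 15 ,
\]
\noindent so each percentage point of the workforce employed in science
and engineering corresponds to roughly $15\%$ of overall economic growth.
\]
```

Read this way, “every 1% of that group” means each percentage point of the total workforce that is made up of scientists and engineers, not 1% of the scientists themselves.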
The National Academies’ Rising Above the Gathering Storm study (I chaired the committee that produced the report) concluded that the two highest priorities to preserve America’s competitiveness are to repair our K-12 education system and to increase our investment in basic research. However, as Lane highlights, our system to do the latter is fragmented, and the former seems immune to all attempts at improvement.
A year ago, when testifying before Congress seeking funds to support these two goals, I was asked if I were unaware that our nation faced a budget problem. My answer was that I had been trained as an aeronautical engineer and that during my career I had worked on a number of new aircraft that were too heavy to fly—but never once did we solve the problem by removing an engine.
Research (and engineering) and education are the engines that drive our economy and promise to solve many of the other challenges we face as well. Neal Lane’s proposals would strengthen the hand of those few individuals now in our government who are trying mightily to strengthen the nation’s research endeavors. We should listen to him.
Neal F. Lane puts forth recommendations to the science and technology (S&T) policy community that call for increased integration, innovation, communication, and partnerships. He notes that the model laid out by Vannevar Bush in Science, the Endless Frontier has led to tremendous S&T accomplishments, and he rightly asserts that an update to our traditional S&T policy paradigm is long overdue. It is time to build on Bush’s visionary model in ways that reflect present-day challenges. Lane observes that although the S&T policy community has embraced the realities of increased complexity and rapid change in its discourse, it has not, in any systematic way, taken on the kinds of new thinking and conceptual frameworks that are required to address them.
Lane’s recommendation to more systematically integrate S&T activities across the federal government is a good one. Even within existing frameworks, the importance of this kind of coordination is being recognized. For instance, the National Science and Technology Council’s Committee on Environment, Natural Resources, and Sustainability is looking into mechanisms for increased interagency coordination to advance sustainability goals. The National Academies are also conducting a study entitled “Sustainability Linkages in the Federal Government” to identify linkages across federal S&T domains that are not traditionally incorporated into decision-making. In addition, within the Environmental Protection Agency’s (EPA’s) own research enterprise, steps have been taken to pursue cross-cutting goals, leverage expertise, and break down traditional scientific silos into a small number of integrated, transdisciplinary, sustainability-focused areas.
High-risk, high-reward R&D will also be necessary to reach beyond risk management and incremental improvement toward applicable, sustainable solutions. Transformative, disruptive, leapfrog innovations are critical to advancing scientific progress and competitiveness in the United States. The EPA’s Office of Research and Development has begun to facilitate and incentivize innovative R&D by appointing a chief innovation officer and launching an internal innovation competition, and, along with several other federal agencies, is engaging in open-source innovation challenges that solicit research solutions in exchange for cash awards. In addition, cross-sector partnerships have tremendous potential to bring about positive change through S&T. Lane’s government-university-industry model and other partnership schemes are ideal for incorporating multiple perspectives to increase R&D effectiveness and degrees of freedom in innovative solutions development.
Finally, Lane characterizes the public disconnect with scientific research as perhaps “the greatest threat to the future of the country’s research system.” This is not an overstatement. Members of the public are both the beneficiaries and sponsors of federal R&D. They need and deserve to understand how scientific discoveries affect their quality of life and foster U.S. progress. Although the media play an important role in communicating to the public, the S&T policy community must not sit on the sidelines. It is our job to explain the effects and significance of our research, science, and technology activities. To this end, communication skills should be considered essential for every scientist if the national research endeavor is to continue to thrive.
The globalization of scientific research
Caroline S. Wagner’s “The Shifting Landscape of Science” (Issues, Fall 2011) is to be commended for its recognition of an important and undeniable trend: the globalization of S&T affairs. We have started to shift from a reliance on so-called “national systems of innovation” to an emphasis on a series of new, globally networked systems of knowledge creation and exploitation. The concept of national innovation systems that was developed out of the research of leading scholars such as Richard Nelson, as well as the S&T policy team at the Organisation for Economic Co-operation and Development (OECD), has now become largely obsolete. Yet, as Wagner suggests, despite the fact that the data support the notion of this strategic transformation, the U.S. government refuses to make any significant adjustment in its policy mechanisms to accommodate the new R&D world of the 21st century. In some ways, the situation is even worse than Wagner suggests; as the activities of the world’s leading multinational firms clearly demonstrate, globalization of R&D activity has become a competitive imperative. U.S.-based multinational corporations, in particular, are establishing overseas R&D installations at a rapid pace, and not simply in other OECD nations but in new places such as China and India. According to the latest data from the Chinese government, there are now more than 1,250 foreign R&D centers in operation in the People’s Republic of China. Far from being simply focused on local products and services, many of these R&D units are working on projects tied to the global marketplace.
So what is driving this steadily expanding push to globalize research and build new networked structures across the world? It is the rise of a new, dynamic global talent pool that has shifted the focus of overseas expansion by companies and even universities from a search for cheap labor and lower costs to a desire to harness the growing reservoir of brainpower that is popping up from Jakarta to Mumbai and from Dalian to Singapore. Supported by massive new investments in higher education as well as R&D, many countries that were once S&T laggards are emerging on the international scene as critical partners and collaborators. What gives these efforts at human capital enhancement even more momentum is that they are being complemented by significant financial investments in new facilities and equipment. In addition, many of these nations have benefitted substantially from their ongoing efforts to integrate domestic programs and initiatives with deeper engagement with countries across the international S&T system. The Chinese case is highly illustrative in this regard, as the Chinese government has built a multifaceted array of international S&T cooperation channels and relationships. In spite of lots of verbiage about promoting indigenous innovation, China has been one of the biggest beneficiaries of globalization and shows no signs of closing the door on its highly productive set of foreign S&T relations.
These developments clearly leave the United States in a potentially disadvantaged position vis-à-vis its hungry international counterparts. One of the clearest examples of the U.S. refusal to understand what is occurring in these other countries involves the recent restrictions put on the White House Office of Science and Technology Policy (OSTP) by Congressman Frank Wolf (R-VA), who apparently believes that the United States has gotten too cozy with China’s S&T establishment. Through budgetary legislation engineered by Wolf, OSTP and NASA are currently restricted from fully engaging with China, putting in jeopardy the mutually beneficial S&T cooperation relationship that the two countries have built during the past three decades. Wolf’s actions reflect an antiquated perspective that simply ignores the deep level of interdependence that currently exists in S&T affairs as well as in other aspects of the Sino-American relationship. Simply stated, there is no major global S&T problem today that will not require close cross-border collaboration between the United States and China, whether it is climate change, the search for new alternative energy supplies, or the effort to combat threats to the global health system.
It is time for the U.S. government to wake up to the new realities highlighted by Wagner. If we do not reorient our policies and perspectives to these pressing new realities, we are likely to become an also-ran nation in the ever-intensifying race for sustained scientific leadership and technological competitiveness. This will not happen tomorrow or the next day, but it will no doubt be part of a process of long-term decline in the efficacy of America’s once vibrant, highly productive S&T system.
Caroline Wagner’s article was a thoughtful and forward-looking commentary on the current debate about international competition versus collaboration in S&T. I agree with her thesis that “Science is no longer a national race to the top of the heap: it is a collaborative venture into knowledge creation and diffusion.” Friendly competition is healthy and even desirable for elevating the overall quality standard and advancing the S&T frontiers. An appropriately balanced combination of competition and collaboration will only accelerate the pace of discovery and innovation, especially in the current situation in which total world investment in S&T and the total number of S&T students worldwide are increasing.
Wagner advocates “an explicit U.S. strategy of global knowledge sourcing and collaboration” and suggests creating strategically focused, multilateral government programs. Indeed, the initial step toward that goal is well underway, and the U.S. government is taking a leadership role, as described in an editorial by Subra Suresh, the director of the National Science Foundation (NSF), in the August 12, 2011, issue of Science. (The views in this letter are my own and do not necessarily represent those of my employer, the NSF.)
Recognizing disparate scientific merit review as a fundamental barrier to multilateral collaborations, the NSF, on behalf of the United States, will host a global summit on merit review in May 2012 to “develop a foundation for international scientific collaboration.” Heads of research councils from approximately 45 countries are expected to attend the summit. It is hoped that this summit will lead to a long-term “virtual Global Research Council to promote the sharing of data and best practices for high-quality collaboration.” Success of the summit will lay a foundation for global knowledge sourcing, which will lead to the kind of multilateral collaborations that Wagner promotes in her article.
As other countries around the world have increased their S&T investments, U.S. dominance appears to be waning in comparison. However, this does not mean that U.S. excellence in S&T is declining in absolute terms. The United States is still the destination of choice for the best and brightest students from countries such as China, India, and South Korea. One of our challenges is to encourage U.S. students to go abroad to acquire global perspectives. To this end, NSF has partnered with funding agencies in Australia, China, Japan, New Zealand, Singapore, South Korea, and Taiwan, and annually supports 200-plus U.S. graduate students in study abroad. These students spend 8 to 10 weeks in one of the seven countries conducting research and building a lifelong network with students and researchers in the host laboratory. Expanding the program to other countries is under consideration.
While strategically participating in international S&T collaboration to leverage intellectual and infrastructural resources across the globe, the United States must continue to invest in fundamental research in order to stay competitive in multilateral collaborations, and more importantly, to ensure new discoveries that will lead to totally new technologies that we cannot even imagine today.
Caroline S. Wagner’s article makes useful points about the benefits to the United States of tapping burgeoning sources of foreign scientific knowledge by fostering and participating in more international research collaboration. But some important flaws in her assumptions indicate that Washington will need to pursue these goals more carefully and discriminatingly than her essay appears to recognize.
Although no one could reasonably object to the author’s goal of augmenting U.S. scientific wherewithal with knowhow generated abroad, the payoffs to the United States from this cooperative strategy will surely be more modest than Wagner suggests and the risks significantly greater.
Wagner’s first dubious assumption concerns the role of scientific knowledge in the world of nation-states and their interactions. Economists may view knowledge, in Wagner’s words, as “non-rivalrous because its consumption or use does not reduce its availability or usefulness to others.” But history teaches unquestionably that knowledge is also power. Any collaborative policies must be subordinated to U.S. security and closely related economic interests. Therefore, and especially given the continuing U.S. science edge, collaboration must be tightly limited with mercantilist countries that simply don’t view international commerce as a positive-sum game (such as virtually all of East Asia and much of Europe), as well as with likely geopolitical rivals (such as China and Russia). This means not only that much critical U.S. knowhow must remain securely under U.S. control, but also that Washington cannot allow the temptation to develop an economically rational global division of scientific labor to prevent or stunt the development even of certain national capabilities that duplicate foreign expertise.
Wagner’s assumptions about the United States’ relative global science position and its future also seem vulnerable. If current trends continue, the nation’s East Asian competitors could remain flush with resources to finance expanded science and technology development. But given the still-considerable linkage between these economies and their best final customer—the financially challenged United States—their own continued rapid progress is far from assured. Moreover, the foreseeable future of both private- and public-sector science funding in crisis-ridden Europe—the third big global pool of scientific expertise—appears to be even grimmer than it is in the United States.
Finally, Wagner apparently accepts an assumption about the worldwide proliferation of scientific knowledge that is as widespread as it is incomplete. Obviously, as the author writes, considerable and inevitable foreign catch-up with U.S. capabilities has characterized the post–World War II period. But the scientific rise of China and India in particular has been fueled largely by the policy-induced off-shoring activities of U.S.- and other foreign-owned multinational companies. Eliminating these firms’ incentives to arbitrage foreign subsidies and regulatory vacuums and ultralow costs for even skilled labor, especially in tandem with a raft of better domestic policies, holds much greater potential to bolster the domestic S&T base than collaborative programs that meet the prudence test.
Catastrophe insurance
Howard Kunreuther and Erwann Michel-Kerjan’s “People Get Ready” (Issues, Fall 2011) reviews a number of critical challenges that undermine progress in making the United States more disaster-resilient, even as the nation and the rest of the world seem to be entering an era of increasingly frequent and increasingly devastating catastrophes. Climate change is seen as a key factor in provoking weather-related disasters, particularly hurricanes and severe coastal storms. Just this year, many New England states experienced severe flooding from, of all things, a tropical storm that roared up the Atlantic coast, first manifesting itself as Hurricane Irene.
As noted in the article, recent studies conducted by my own center at Columbia University confirm some of the unrealistic citizen perspectives on disasters that impede the public’s motivation to “get prepared.” For instance, we found that 62% of Americans believe that, in the event of a major disaster, first responders would be on the scene to help them “within a few hours,” and nearly one in three feels that it would take less than an hour.
Dramatic increases in population density in disaster-prone areas, along with fragile, deteriorating infrastructure, have been an inevitable consequence of a rapidly growing population looking for natural beauty and/or economic opportunity. It is perfectly understandable that people are drawn to the normally calm climate and natural appeal of communities such as south Florida, the Carolinas, and the Gulf Coast, or the spectacular vistas of northern California’s rocky coastline. The calculus driving where-to-live decision-making, however, is rarely much affected by an objective assessment of disaster risk.
Some 80 million Americans live in communities at significant risk from earthquakes. This past May, to test local and regional response capabilities, the federal government conducted a National Level Exercise hypothesizing major seismic activity along the 150-mile New Madrid fault, which runs through the middle of the country, putting some five states at considerable risk. Although the final report is not yet complete, my observations revealed substantial challenges in readiness for a large-scale catastrophe.
And besides potential weather and geological calamities, of course, there are a myriad of risks related to the built environment. Many of the nation’s 104 nuclear power plants lack sufficient safeguards to significantly reduce the risk of Fukushima-like catastrophes. What about trains loaded with dangerous chemicals rolling through unsuspecting communities? And the nation’s infrastructure, from electrical grids to levees and bridges, is increasingly being recognized as disconcertingly fragile, putting many communities at considerable risk. Fixable? Yes, but at a cost estimated by the American Society of Civil Engineers to be in the range of $2.7 trillion.
Still, the fact remains that 310-plus million Americans have to live somewhere, and because it is virtually unavoidable, a substantial percentage of us live in or near an area of definable risk. Kunreuther and Michel-Kerjan’s call to rethink how we approach preparedness, risk mitigation, and insurance coverage is relevant and important. But making the United States substantially more disaster-resilient will require more than innovative approaches to insurance reform, mitigation strategies, and building codes. One way or another, citizen engagement across all socioeconomic strata and in every cultural and ethnic community will remain a high priority if disaster resiliency is a central goal.
The article by Howard Kunreuther and Erwann Michel-Kerjan provides evidence from psychology and behavioral economics to help explain why individuals underinsure against catastrophic risk and fail to mitigate against disasters. In response, they recommend multiyear insurance tied to property and multiyear loans for mitigation. Options to encourage or fund more mitigation can be very cost-effective in reducing losses; a Congressional Budget Office (CBO) study found that the Federal Emergency Management Agency’s mitigation program delivered about $3 in benefits per $1 spent. (I am a CBO employee; the views in this letter are mine alone and not necessarily those of the CBO.) The authors’ proposals could also reduce the need for federal disaster assistance. Barriers to implementing their recommendations may exist, however, including state regulation of rates and policies, and the details of the insurance contracts will matter; policymakers may therefore continue to consider additional options for expanding private coverage.
In addition to the psychological factors discussed by the authors, government policies may also contribute to underinsurance and too little mitigation against catastrophic losses. At the state level, the regulation of premiums and insurance coverage leads to high-risk policyholders not paying the full cost of their risk, which both reduces incentives to mitigate losses and leads to subsidies from taxpayers and lower-risk policyholders in state residual pools for catastrophic risks. Those policies contribute to overdevelopment in high-risk areas.
At the federal level, the government provides not only flood insurance but also various forms of implicit catastrophic insurance. After a disaster, Congress generally provides extensive federal assistance to individuals, businesses, and state and local governments to help cover uninsured losses and assist in economic recovery. After Hurricane Katrina, the CBO estimated that additional federal spending for hurricane-related assistance, together with various forms of disaster-related tax relief, would cost about $125 billion from 2006 to 2010. These types of assistance reduce financial hardship and help stimulate the economy after a disaster, but they also discourage individuals and businesses from taking steps to mitigate future losses and from seeking private market solutions for financing those losses.
Although they are not directed at natural disasters, federal mortgage guarantees, which covered about 95% of new residential mortgages in the first half of 2011, expose taxpayers to natural disaster risk. Enforcing the existing insurance requirements (including flood coverage) on those mortgages could reduce costs, and policymakers could also weigh the benefits and costs of requiring earthquake insurance for some policyholders.
Christopher Lewis and others have proposed that the federal government auction reinsurance to insurers and state-sponsored programs with the goal of improving their ability to provide coverage. Auctions might reduce the problem of underpricing federal insurance and crowding out private supply, particularly if the contracts were limited to the highest layers of risk. But a reinsurance program would also probably impose costs on taxpayers; the federal government generally has difficulties in efficiently managing insurance risk.
The article by Howard Kunreuther and Erwann Michel-Kerjan was of great interest to me and our 14,000 members. Not only are disaster costs from natural hazards in the nation increasing, but the risk of those hazards is increasing even faster. It is essential that we help those living at risk to understand the risk, take responsibility for it, and take actions to reduce their risk.
Everyone involved in fire risk, including banks, insurance companies, realtors, and property owners, works together to ensure that nearly every home or structure in the United States is insured against fire. Yet in a majority of cases, those exposed to natural hazards such as floods and earthquakes, who are much more apt to experience a loss from those events than from fire, do not perceive this risk and do not insure against it or take steps to reduce it. In the meantime, as the article points out, we continue to allow development in areas at risk from natural hazards, so the consequences of flood and earthquake events are building rapidly. Risk is not only the probability of an event happening, but also the consequences (or costs) if it does. But the banking, insurance, realty, and other development industries do not promote insurance for natural hazards.
A significant factor is pointed out in the article—that the federal taxpayers are picking up more and more of the costs of natural disasters, which means that communities, developers, and other decisionmakers can gain the benefits of at-risk development while letting federal taxpayers pay for the consequences through disaster relief. Disaster relief is not just funding from the Federal Emergency Management Agency, but also from the Department of Transportation to rebuild roads, bridges, etc., and from Housing and Urban Development, the Environmental Protection Agency, U.S. Department of Agriculture, and many other federal agencies that provide funding or grants after a disaster.
The article provides many good suggestions that should be considered to reverse the rising costs and human suffering from natural hazards. The most effective measures to reduce future risk rest with local and state governments through land-use regulation and building codes. All of us need to support such actions and support a sliding cost-share system for disaster relief that rewards communities and states that do more to reduce or prevent the problem, instead of continuing to throw more federal taxpayer money at communities and states that continue to allow massive at-risk development.
New approach to cybersecurity
The message in “Cybersecurity in the Private Sector” (Issues, Fall 2011) by Amitai Etzioni is clear: The private sector is evil, and if the federal government would only do its job and regulate, our cyber systems would be secure.
But Etzioni acknowledges that the Department of Homeland Security is not up to the job, although he suggests that’s because they use private-sector equipment and contractors. But one only need recall the WikiLeaks fiasco, when an unsupervised government employee with a Lady Gaga CD accessed masses of classified data and released it on the Internet, to realize that being a government employee doesn’t confer extra morality or security expertise.
The name, blame, shame campaign both misunderstands what we are dealing with and misdirects what we need to do about it. We are well beyond hackers, breaches, and perimeter defense. The serious attacks we face come from highly organized, well-funded, sophisticated, state-supported professionals (mostly from China), who successfully compromise any system they target.
Most cyber systems are substantially overmatched by their modern attackers. The solution is not to blame the victims. Currently, all the incentives favor the attackers. Attacks are relatively cheap, profitable, and difficult to detect. Defense is a generation behind the attackers, it’s hard to show quantified returns on investment from prevention, and successful criminal prosecution is almost nonexistent. This doesn’t mean that we have no defense, but we need to create a new system of defense.
The traditional regulatory model, constructed to deal with the hot technology of two centuries ago—railroads—is a bad fit for this problem. Regulations can’t keep up with rapidly changing technology and attack methods. U.S. regulations only reach U.S. companies, whereas the problems are international. And regulating technology impedes innovation and investment, which we cannot afford.
The Internet Security Alliance (ISA) has suggested an alternate model, the Cyber Security Social Contract, based on the public utilities model, wherein policymakers achieved a social goal (universal service) by providing economic incentives to the private sector (guaranteeing the rate of return on investment).
The ISA model suggests retaining existing regulation for industries where the economics are baked into the regulation (such as utilities). For the non-regulated sectors (information technology, manufacturing, etc.), we create a menu of market incentives (insurance, liability, procurement, etc.) to encourage greater security investment.
This modern, pragmatic, and sustainable approach, which Etzioni ignores, has received broad support. The Executive Summary of President Obama's Cyberspace Policy Review both begins and concludes by citing the Social Contract. ISA white papers filling out the idea are cited four times more often than any other source in the president's document. These principles were also the primary basis of the House Republican task force report on cybersecurity released in October 2011.
A broad array of private-sector trade associations representing software providers, hardware providers, corporate consumers, and the civil liberties community published a detailed white paper in March 2010 that also endorsed this approach.
It is backward-looking policymakers and think-tankers who are holding progress in cybersecurity hostage to a 19th-century regulatory model that can’t address this 21st-century problem.
More focus on occupational certificates
Brian Bosworth’s “Expanding Certificate Programs” (Issues, Fall 2011) shines light on an important and often neglected area of labor market preparation. According to the Survey on Income and Program Participation (SIPP), fully 18% of workers have a certificate from a business, vocational, trade, or technical postsecondary program, and a third of these people also have a two- or four-year degree. Of the 20% of associate degree graduates with a certificate, 65% got their certificate first, 7% got it at the same time they got their degree, and 28% got their certificate after getting their associate degree.
As Bosworth shows, certificates are particularly useful for hard-to-serve populations, such as minorities, low-income adults, and young people who didn't do well in high school. The advantages of these programs include shorter duration, more-focused learning, and flexible scheduling.
The programs can also adapt more quickly to changing market demand for specific skills and fields.
As with any education or training program, economic returns vary depending on the field of study. We feel that there needs to be constant monitoring of graduates' earnings to ensure that students have the best information to align their interests and talents with occupations that are growing and that pay well. Another crucial factor is placement. In our analysis of SIPP data, certificate holders who are in occupations related to their training earn 35% more than those not working in their field.
Bosworth’s presentation of strategies for success gives clear guidelines on how to structure programs to maximize student completion and transition to successful labor market outcomes. There is a lot of talk about the need for more postsecondary educational attainment. All too frequently, people view this as increasing our rate of bachelor’s degree graduates. Although this is a reasonable goal, four-year degrees are not for everyone. The subbaccalaureate programs that result in two-year degrees and/or certificates are an important option for many students and need to be promoted just as much as bachelor’s programs.
Brian Bosworth’s excellent article is an important contribution to the growing conversation about college completion and the labor market value of postsecondary credentials. He correctly points out that we have failed to recognize the value of certificate programs, particularly in high-value career fields with strong wages, which allow students to gain the credential and enter the workforce in a shorter period of time. This is a timely article as many states grapple with increasing the number of individuals holding some type of postsecondary credential.
Bosworth correctly argues that a certificate with good labor market value is the only ticket for certain populations to a good job and opportunity for a quality life. In Tennessee, as in many states, students come to institutions of higher education underprepared for collegiate work and also often have demands on their lives that prohibit full-time attendance in pursuit of the degree. Many adult students are unable to commit to four to six years of collegiate work in order to complete the degree.
A recent study by the Georgetown University Center on Education and the Workforce underlines the urgency for Tennessee. Between 2008 and 2018, new jobs in Tennessee requiring postsecondary education and training will grow by 194,000, while jobs for high-school graduates and dropouts will grow by 145,000. Between 2008 and 2018, Tennessee will create 967,000 job vacancies representing both new jobs and openings due to retirement; 516,000 of these job vacancies will be for those with postsecondary credentials. Fifty-four percent of all jobs in Tennessee (1.8 million) will require some postsecondary training beyond high school in 2018. The need for Tennesseans with postsecondary credentials is great. Certificates offer a tremendous opportunity.
But, as Bosworth states, not just any certificate will suffice, and certainly not only those delivered in the traditional structure. He argues that how we deliver such certificate programming has an even greater chance of ensuring completion for those adults who are busy with life and have many demands on their time and resources. His recommendations of the use of block scheduling, embedded student support and remediation, and cohort-based models are a major step forward in understanding successful structures for the types of students in our postsecondary institutions today.
His call for action to make this happen at all levels is important. In Tennessee, recent legislation requires the use of block-scheduled, cohort programs in our community colleges as a means to increase the number of those credentialed to obtain employment. We are taking this a step further and focusing some of our work on increasing the number of certificates of a year or longer that are delivered via this strategy. We believe that the data over the next couple of years will support the success of this effort. Of course, students already are telling us that this approach provides the only way that they could ever attend college. That speaks volumes to my mind.
Certificates that demonstrate significant occupation-related competencies and that are valued in the labor market are clearly an underdeveloped aspect of the college completion strategy. As Brian Bosworth points out, postsecondary certificate programs that are a year or longer in duration generally have good labor market payoff, and these longer-term certificates may be an important route to better employment and earnings for many Americans, particularly working adults and low-income and minority youth. Greater attention should certainly be paid by policymakers and opinion leaders to occupational certificates that can be completed fairly efficiently and that respond effectively to local employer needs.
But, as Bosworth notes, several pitfalls must be avoided. First, the goal cannot simply be more certificates: If states generate more short-term certificates requiring less than a year of training, few completers are likely to see any earnings gains. And the trends are troubling. According to the American Association of Community Colleges, in the past 20 years, community college awards of certificates of less than a year’s duration rose by 459%, while awards of certificates of a year or more rose 121%.
Needless to say, if minority and low-income students disproportionately choose or are steered to certificates with less economic payoff, the result may be more completers but little economic value for the graduates or society. Again, the trends give reason for concern. From 1990 through 2010, short-term credentials earned by minority community college students grew far faster than those earned by whites: nearly two times faster for blacks (770% versus 440%) and three times faster for Hispanics (1,337% versus 440%).
One important policy implication is that states need to track certificate students more carefully, so they have a better idea of who is enrolling in and earning what certificates, and so the labor market outcomes for recipients of different occupational certificates are well documented.
Bosworth ends his article with recommendations for how community colleges can implement evidence-based career programs. The federal government's commitment of $2 billion in Trade Adjustment Assistance Community College and Career Training grants can give these programs a big boost.