Forum – Summer 2000

Ecosystem assessment

In “Ecosystem Data to Guide Hard Choices” (Issues, Spring 2000) Walter V. Reid makes a compelling case that improved information is needed to enable decisionmakers to cope with the increasingly complex ecological decisions they will face in the 21st century. He points out that we no longer live in a world in which decisions made by one group have minimal effect on others. Rather, every increment of land or water newly devoted to one purpose is an increment taken from another, often equally valued, purpose. The debates that rage over policy decisions of the U.S. Forest Service and Bureau of Land Management, as well as those affecting private landowners, such as farm policy and wetlands regulation, are examples that mirror the global trend toward an increasingly integrated world.

The Millennium Ecosystem Assessment (MA) that Reid describes is an ambitious and important effort to make available information that will highlight the tradeoffs involved in decisionmaking. The principles that guide the assessment are important, since they will determine whether the results are accepted and can make a difference. We have some experience with such principles and their impact, because over the past several years we have been involved in creating a domestic analogue of several aspects of the planned MA: the Heinz Center’s State of the Nation’s Ecosystems Project (www.us-ecosystems.org), a national effort to provide ecological information to decisionmakers.

Based on our experience with the ecosystems report, we strongly support the notion that such an assessment must not be the creation of one sector of society; it must reflect a diversity of views about what goods, services, and ecosystem properties are viewed as important and thus monitored and reported. The Heinz project strongly embodies this multisectoral approach: We have involved close to 200 experts from business, environmental, academic, and government institutions in all aspects of the project. This has served us well in establishing the legitimacy of the ecosystems report as a nonpartisan effort that produces value for a wide array of interests and stakeholders.

We also agree with Reid that scientific credibility is the foundation on which all such assessments must be based. In fact, scientific credibility and multisectoral governance of such an enterprise go hand in hand. With multiple parties at the table, the likelihood of assumptions going unchallenged or of bias creeping into the selection and presentation of data is greatly reduced.

Finally, we agree with Reid that additional information is needed before any such assessment can be fully effective. Even in the United States, with a long history of environmental concern and substantial monitoring and data-gathering investments by federal, state, and local governments and private-sector groups, there are major and systematic gaps. Our knowledge of some very basic aspects of ecosystems, including their size and the nature and condition of the plants and animals that make them up, is in many cases dismal. The situation faced by the MA at the global level will almost certainly be far worse in this regard.

We have found the task of shaping a scientifically credible, politically nonpartisan, and practically useful report on the state of this nation’s ecosystems to be among the most exciting and challenging efforts in which we have ever been engaged. We can only hope that the MA will be successful in marshaling the resources, ingenuity, and patience that its vastly more ambitious effort will require.

ROBIN O’MALLEY

Project Manager

State of the Nation’s Ecosystems Project

The H. John Heinz III Center for Science, Economics and the Environment

Washington, D.C.

WILLIAM C. CLARK

Harvey Brooks Professor of International Science, Public Policy and Human Development

Kennedy School of Government

Harvard University

Cambridge, Massachusetts


“Lack of ecosystem data” rarely tops the list of causes of environmental degradation. On Easter Island, to take an extreme example, early inhabitants surely knew that chopping down the entire palm forest would not be a good thing. They depended heavily on porpoise meat, hunted from palm canoes. Nonetheless, they cut down every last tree. They then wiped out coastal avian and marine food resources, resorted in desperation to eating rats, and finally turned on each other. Possession of good ecosystem data–the importance and declining availability of palms would have been evident to all–did not prevent disaster.

One might therefore wonder whether the ecosystem data to be generated by the Millennium Ecosystem Assessment (MA) can do much to improve the global environmental situation. The answer is yes; the MA process is absolutely essential. Clues to why can be seen in the collective history of the Pacific islands.

Easter was not an isolated, freak case; the outcome on some other islands was equally grim. Yet on some islands with very similar initial conditions (people, culture, and environment), truly sustainable economies emerged. What accounted for the difference in trajectories? The answer is speculative and somewhat counterintuitive: size. It appears that small islands were more likely to attain sustainability. Tikopia, a model of success, is only about 1.7 square miles. Patrick Kirch proposes that where everyone knew everyone else, ecological limits to human activities were more likely to be accepted and major “policy changes” (such as giving up pork) and new institutions (such as regulating fertility) were more likely to be adopted. Conversely, the Easter-scale (64 square miles) islands were prone to dividing into “them” and “us” in a race to the bottom.

Two island lessons are particularly relevant today. First, the initial action must come on the social side. Like the islanders, we know enough scientifically to recognize trouble and start moving in the right direction. Second, human beings evolved as small-group animals. Our future prospects seem to depend on whether, in a population of over 6 billion, we can foster enough of a small-group feel to forge cooperative solutions to our predicament.

The time is ripe for the MA. Leaders in all parts of the world and in all sectors of society are starting to move in the right direction: recognizing ecosystems as valuable capital assets. This is apparent in developed and developing countries alike (Australia and Costa Rica stand out especially); in the integration of ecology, economics, and law; and, most critically, in private enterprise. We are witnessing a renaissance in the way people think about the environment. We must now create a formal process by which to generate much broader mutual understanding of the global environmental situation and of ways to address it from local to global scales. This will require top science and rapid development and deployment of innovative approaches to managing ecosystem assets profitably and sustainably. The MA is the best shot at achieving the small-group kind of communication necessary to do this. With luck, it might keep us from eating each other.

GRETCHEN C. DAILY

Department of Biological Sciences

Stanford University

Stanford, California


Implicit in Walter V. Reid’s call for support of a Millennium Ecosystem Assessment (MA) is the requirement for a comprehensive, versatile information infrastructure to enable the confluence of data, information, images, tools, and dialogue necessary to inform policy debate and decisionmaking. Fortunately, current and developing information and communication technologies allow the construction of this essential capability. The National Biological Information Infrastructure (NBII), located on the Web, is an electronic “federation” of biological data and information sources that is dedicated to building knowledge through partnerships. The NBII provides swift access to biological databases, information products, directories, and guides maintained by a wide variety of organizations from all sectors of society, public and private.

In its March 1998 report Teaming with Life: Investing in Science to Understand and Use America’s Living Capital, the President’s Committee of Advisors on Science and Technology (PCAST) recognized that scientific information–both that currently available and that to be generated to fill in the gaps of our understanding–must be organized electronically, interlinked, and provided to all parties who need it. PCAST acknowledged the value of the NBII and recommended that a next-generation NBII (NBII-2) be built.

The NBII has been developed through collaboration among federal, state, and international governments; academic institutions; nongovernmental organizations; interagency groups; and commercial enterprises to provide increased access to the nation’s biological resources. BioBot, a biological search engine for the Internet, is an example of a tool resulting from NBII collaborative activities. NBII customers use BioBot to search NBII information as well as other biological information available on the Internet through sources such as SnapBiology, AltaVista, and Yahoo. An example of an NBII-accessible standardized reference to support discovery and retrieval of pertinent biological data is the Integrated Taxonomic Information System, which provides easy access to reliable information on species names and their hierarchical classification.

Information and expertise worldwide can be brought to bear in support of activities such as the proposed MA through international initiatives such as the North American Biodiversity Information Network, the Inter-American Biodiversity Information Network (http://www.iabin.org), and the Clearing-House Mechanism of the Convention on Biological Diversity (http://www.biodiv.org/chm). The proposed Global Biodiversity Information Facility, envisioned as improving access to and interoperability of biodiversity databases, will become an important research resource for such efforts.

As with the MA itself, the creation of the NBII-2 has substantial momentum and support from the collaborators on the NBII and from a growing number of other constituencies as the benefits that can accrue become more apparent through their experiences with the NBII. It is critical that the interested parties in the MA and similar activities work collaboratively with NBII partners to ensure that the required salient, credible, and legitimate scientific information can be discovered, retrieved, and used appropriately to meet the objectives of the assessment as well as to enable better ecosystem planning and management in general.

DENNIS B. FENN

Chief Biologist

U.S. Geological Survey

Reston, Virginia


Walter V. Reid clearly and forcefully sets forth the need for better ecosystem management worldwide. The Millennium Ecosystem Assessment (MA) he proposes can make a major contribution to meeting this challenge. As Reid notes, data gaps, lack of capacity, and narrow mindsets too often plague policymakers whose decisions shape ecosystems from day to day. However, rapid advances in satellite, information, and communications technologies allow decisionmakers to pursue integrated resource planning in ways unknown a generation ago.

An initiative such as this can be fully successful only if assessment products are capable of being used by policymakers. The work of the Intergovernmental Panel on Climate Change is a model in this regard. Furthermore, a global ecosystem assessment can be successful only if it is more than a single report or snapshot. It must be a continuous process, with careful attention paid to the use of common data standards. Reid addresses these concerns directly and pragmatically, in part by proposing that a board of users govern the MA. This board would identify the questions to be answered by scientists, involve key stakeholders, and set policies for peer review.

The United States is building considerable experience in understanding ecosystem trends through such means as the U.S. Geological Survey’s recent Status and Trends of the Nation’s Biological Resources and the development of the National Biological Information Infrastructure, a distributed database for biological information from a variety of sources. Our experience with multiple partners in the development of a report on the state of the nation’s ecosystems, coordinated by the H. John Heinz Center, is also a model for building the MA. As a nation, we have much to contribute to a global initiative that is intended to pull together this type of information in a form usable to stakeholders and decisionmakers.

The U.S. government supports the MA. We were pleased to support a recent application to the Global Environment Facility for funding of an initial phase of this project. The work of the MA can help contribute to sustainable development around the globe.

DAVID B. SANDALOW

Assistant Secretary of State for Oceans, Environment and Science

BROOKS YEAGER

Deputy Assistant Secretary for Environment

U.S. Department of State

Washington, D.C.


I am pleased to be able to comment on Walter V. Reid’s article. First, although science and technology can contribute to our ability to deal with environmental predicaments, traditional science alone will not save the world from environmental degradation, because problems and solutions involve other areas such as economics, demography, ethology (behavior), education, and religion. Real solutions must involve cross-disciplinary efforts between scientific and societal disciplines.

Second, when it comes to our basic life support system (the ecosystem), it is not a matter of choices because there is only one choice: Preserve the quality of life for humans. When life support systems begin to fail, there is only one mission–survival.

Third, the most immediate need is to reconstruct or extend economics to include nonmarket goods and services (that is, ecosystem services). Currently, only human-made goods and services have value in market economics. Life support services are considered to be externalities with no value until they have become scarce (when it may be too late!). My brother and I have written about this market failure since the 1970s, and a host of economists have picked up on this theme. A major point of agreement is that from now on, economic development must be based on qualitative rather than quantitative growth–that is, better, not just bigger. Also, the economic growth required for poverty reduction must be balanced by negative growth for the rich, which will increase the quality of life everywhere. And many business leaders are now writing about the “greening of business.” Incredibly, none of these trends are cited in Reid’s paper. I believe the time has come for serious consideration of reforms that have been suggested and documented over the past 50 years.

Finally, in my opinion the proposal for a Millennium Ecosystem Assessment is a waste of time and money. We don’t need any more litanies of problems and disasters. What we now must do is convince the public, politicians, and business leaders of the need for a change in the way we think, behave, and do business. The “need more study” response is a cop-out for “we don’t have any ideas for what to do about the situation.”

EUGENE P. ODUM

Professor Emeritus and Director Emeritus

Institute of Ecology

University of Georgia

Athens, Georgia


Preparing for disaster

“Paying for the Next Big One” by Mary C. Comerio (Issues, Spring 2000) provides an authoritative and convincing discussion of a serious problem in the way in which our society responds to natural disasters. As Comerio points out, economic losses from these disasters are increasing rapidly, spurred by population growth, massive development, and a drastic rise in the cost of all the services needed to promote human and physical recovery. Well-intentioned legislation has resulted in a situation in which improved construction and less vulnerable locations for development have been subverted by a public perception–and reality–that the federal government will step in after a disaster and largely take care of the damage. Although attempts have been made, notably by the Federal Emergency Management Agency, to encourage and even regulate mitigation measures that will reduce these losses, they have been largely ineffective because of the political reality that no politician, from the president down, can afford to insist on invoking bureaucratic rules that might be perceived as imposing hardship on disaster victims.

Much real improvement in the federal disaster response has occurred in the past decade or so, but the fundamental problems remain. Governmental support cannot, and should not, continue to expand indefinitely. Private insurance companies have convincingly shown that natural disasters, particularly low-probability, high-loss events such as earthquakes, are not only bad business for insurers but are intrinsically difficult and risky to handle. The actuarial basis provided by frequent fires and automobile accidents does not exist for the large earthquakes that occur once a decade or quarter-century.

Comerio is right in proposing that some mix of regulation and incentives is necessary to promote predisaster mitigation, which is the only effective long-term way out of the dilemma. The catastrophic urban fires that frequently occurred before the 20th century were only controlled by a rigorous combination of mitigation (regulated fireproof construction); building and management regulation (limits on the occupancy of large public spaces, for example); public education; and the development of sophisticated alarm and response systems. This was achieved largely through insurance company and governmental partnership. Though fires and property loss were not eliminated, their magnitude was brought under control so that insurance could cover the losses without bankrupting building owners and lenders.

The post-disaster problem has many dimensions that make a parallel solution very difficult, but bringing the secondary mortgage market into the picture makes a lot of sense. Ultimately, the only solution will be the use of insurance combined with improvements in our building stock. We need to find the right kinds of political policies and economic mechanisms to achieve these ends.

CHRISTOPHER ARNOLD

Palo Alto, California


Mary C. Comerio’s message is clear: America’s big cities and major suburban regions are a long way from being adequately prepared for the kinds of natural disasters that have recently been experienced in other parts of the world. Although the article raises important concerns for policymakers, Comerio also offers us some hope by describing actions that can be taken today to alleviate the potential losses and suffering in future disasters.

Ever since the publication of her book Disaster Hits Home: New Policy for Urban Housing Recovery (University of California Press, 1998), Comerio has been recognized as one of the world’s leading authorities on the human toll of natural disasters and on the government’s and private sector’s responses to these all-too-common occurrences. Her knowledge of the inadequacies of current disaster planning efforts in the United States comes from extensive analyses of Hurricanes Hugo and Andrew and of the Loma Prieta and Northridge earthquakes, as well as a sobering assessment of the impacts of even more powerful natural devastation on urban centers in other parts of the world: in Kobe, Japan; Taiwan; and Mexico City.

Although the United States has made some important strides in predisaster mitigation, particularly in the retrofitting of bridges, roads, and housing for the next “big one,” there has been much less progress to date in determining the best plans for dealing with the potentially devastating fiscal outcomes of the next inevitable major earthquake, flood, fire, or hurricane to strike a heavily populated center.

“Who will pay?” is the critical question of our times. We live in an era in which disaster recovery policy is severely constrained by the shrinking role of private insurance in providing coverage for property owners. Moreover, the financial consequences of a major urban disaster may be beyond the means of local and state governments, while federal agencies have spending limits and taxpayers are unwilling to raise their taxes to pay for large government projects.

The public, private, and nonprofit sectors will need to work together to create new institutional structures to cope with the next big earthquake, flood, or hurricane. Comerio offers practical suggestions for improving disaster recovery policy that will require a realignment of responsibilities and a more realistic determination of risks. Now it’s up to the politicians in Washington and in state and local government to have the political will to take the bold action that is needed before it’s too late.

MARK BALDASSARE

Senior Fellow

Public Policy Institute of California

San Francisco, California


Mary C. Comerio provides an excellent summary of where we are and what some of the options are if we want to move in new directions in earthquake preparedness that are “safe, fair, and cost-effective.”

We may want all three but must recognize that there are inevitable and complex tradeoffs. This is made even more difficult by the fact that just as individuals’ rational calculus becomes blurred in the face of very-low-probability events, policymaking is similarly distorted: There are flurries of activity in the window immediately after major events but very little at other times.

Comerio is right that the most bang for the buck will be gained if incentive-based policies move individuals toward the sorts of calculations that they normally entertain when buying, say, auto insurance, where the odds are much easier to grasp. Politicians are most likely to think about incentives when manipulating the tax codes. Reforms that involve these approaches should be our first research priority.

PETER GORDON

Professor

Director, Master of Real Estate Development Program

School of Policy, Planning and Development and Department of Economics

University of Southern California

Los Angeles, California


Managing agricultural pests

In “The Illusion of Integrated Pest Management” (Issues, Spring 2000), Lester E. Ehler and Dale G. Bottrell argue that there is precious little integration in the design and practice of integrated pest management (IPM) systems. They argue that recent efforts by the U.S. Department of Agriculture (USDA) and the Environmental Protection Agency to define, measure, and promote IPM have been based on simplistic and flawed approaches. What is missing, they say, is a degree of “ecological finesse” in the integration of multiple pest management practices, with a heavy emphasis on prevention. I agree with their diagnosis, but their prescription for change falls short of the patient’s needs.

Consumers Union (CU) undertook a broad-based assessment of pest management challenges, systems, and shortcomings in 1994-1996, leading to the publication of the book Pest Management at the Crossroads (PMAC) (C. Benbrook, E. Groth, M. Hansen, and S. Marquardt, Consumers Union, 1996). In it we recommended that policymakers focus on promoting the adoption of biointensive IPM: systems heavily weighted toward prevention through management of biological interactions and cycles. We advanced the concept of an IPM continuum ranging from chemical-intensive treatment-oriented systems (“No” and “Low” zones along the IPM continuum) to “Medium” and “Biointensive” zones where multitactic systems successfully lessen reliance on broad-spectrum pesticides.

In PMAC, we estimated the distribution of U.S. cropland along the four zones of the IPM continuum in the early 1990s and concluded that almost 70 percent fell in the “No” and “Low” zones along the continuum, whereas only 6 percent was managed with biointensive IPM. We set an ambitious goal: “By the year 2010, 75 percent of cropland should be under ‘Medium’ or ‘High’ (biointensive) IPM, including nearly 100 percent of fruit and vegetable acreage.”

The bar was raised for fruits and vegetables because we felt that these crops account for the majority of risk from pesticides in the food supply. Recent USDA data on residues in food proves that we were right and supports the need for priority attention to fruit and vegetable IPM (for details on the distribution of risk across foods, see our 1998 report “Do You Know What You’re Eating?”).

How are the nation’s farmers doing now? Some very well, but collectively the pace of change is way behind schedule. Without a real IPM initiative in the next administration, backed by new dollars and decisive, consistent policy changes, farmers are likely to fall far short of the year 2010 goal that CU set in 1996.

Citing slippage and excessive reliance on pesticides rather than pest management, Ehler and Bottrell argue that “the time has come for a major policy change at the federal level…” However, they miss a more universal and formidable constraint: infrastructure. Biointensive IPM relies on knowledge and human skills and on the collection and timely use of field-based data on pests, their natural enemies, and pest-beneficial interactions relative to the stage of plant development. Biointensive IPM is not about how many practices a farmer uses. It is about what farmers do, when, and why.

The tools and infrastructure supporting the essential ingredients of biointensive IPM are working well where the high cost and spotty performance of chemical-based IPM have forced farmers to look for more robust management systems. But across most of the agricultural landscape, spraying pesticides remains easy, effective, and affordable. So why push for change? Because these attributes of pesticides reflect 50 years of supportive public policy and billions in public R&D investment rather than intrinsic technical superiority.

What about genetic engineering and genetically modified organisms? Transgenic herbicide-tolerant varieties and Bt-producing plants enhance the efficiency and ease of chemical-based systems. No doubt, they have been short-run commercial successes. However, these technologies are fundamentally incompatible with the core principles of biointensive IPM and are not likely to last because they are almost custom-made to accelerate the emergence of pest resistance.

Pest management is evolving and will continue to evolve toward biointensive IPM, and some applications of genetic engineering will help pave the way. The fact that it is moving slowly is a failure of policy and the market, not the concept of IPM.

CHUCK BENBROOK

Benbrook Consulting Services

Sandpoint, Idaho


I found most of what Lester E. Ehler and Dale G. Bottrell wrote to be true in my experience in working with growers. It was refreshing to see that at least two academics have a good grasp of the real world of agricultural pest management. Their observation that “IPM as envisioned by its initial proponents is not being practiced to any significant extent in U.S. agriculture” is very true in my opinion, and their conclusion that we should dispense with the “IPM illusion” and shift the focus to pesticide reduction is a solid and practical one. I am a big supporter of academic institutions and the work they do, but I am continually frustrated by the huge disconnect between these institutions and what really goes on in agricultural pest management. I think it is great to try to define IPM and to refine the term with concepts such as biointensive IPM, ecological pest management, and biologically based pest management. However, definitions tend to get in the way and even muddy the water when it comes to pest management as practiced by growers and pest control advisors. We get hung up on the theory and forget what is happening in the field.

Ehler and Bottrell are right to comment that monitoring schemes developed for pest and natural enemy populations may be too sophisticated and expensive to be a practical tool for growers and pest control consultants. Monitoring is indeed the foundation of any IPM program, but practitioners tend to take what has been developed by researchers and make it fit their situation and time constraints, without worrying about violating statistical requirements. Moreover, if they do monitor in some systematic way, many growers and consultants do not keep written records of this monitoring. If we can get growers and pest control advisors to monitor all fields on a regular and timely basis, I am convinced that significant pesticide reduction can be achieved. Realize, though, that the focus here is simply on getting people to monitor, not obsessing about the methods used.

Ehler and Bottrell may have thrown in the towel too soon by stating that the training of pest management practitioners is not adequate for dealing with the ecological complexity and challenge of IPM. I still have hope that with field experience, many practitioners who are truly interested in pesticide use reduction will be able to grasp these concepts and apply them.

I really like the goals for pest management in U.S. agriculture that are set forth in Ehler and Bottrell’s concluding paragraphs. I think they are all connected to the real world of agriculture, particularly the goal of shifting the debate to pesticide reduction, because that is where progress can be made. I agree with their conclusion that the IPM acronym should not be dropped–it is an extremely useful concept and framework for working with growers and pest management practitioners.

CLIFFORD P. OHMART

Research/IPM Director

Lodi-Woodbridge Winegrape Commission

Lodi, California


The perspective on IPM put forward by Lester E. Ehler and Dale G. Bottrell, though perhaps somewhat exaggerated in its account of a virtually total lack of horizontal and vertical integration of pest management tactics on U.S. farms, essentially rings true to the experience of many of us who have earnestly promoted IPM as a philosophy and set of practices to farmers. Nearly 30 years after initial government sponsorship of research and demonstration studies of IPM, only a very small percentage of farms receive anything beyond “integrated pesticide management.”

Recognizing this truth, we may ask why this is so. In our judgment, most of the answer lies in the structure of contemporary U.S. agriculture. As pointed out by Fred Magdoff and coauthors in the July 1998 issue of Monthly Review in “Hungry for Profit” and by Steven Blank in his 1999 book The End of Agriculture in the American Portfolio, the average U.S. farmer receives only 9 percent of the income arising from agricultural product sales to consumers. Farming has become marginally to highly unrewarding financially, especially for those who market their produce wholesale to firms and vertically integrated megacorporations that reap large profits from selling inputs (seed, fertilizer, and pesticides) to farmers and from processing and marketing outputs. Globalization of corporate access to cheap food and cheap oil for transporting food allows corporate buyers to beat down the price offered to U.S. farmers for their produce. This has put many U.S. farmers at great risk.

Farmers at risk do everything possible to lower the cost of inputs, which is one reason why integrated pesticide management has succeeded. Its practice has given rise to considerable savings in dollar outlay for pesticides. Lowering the cost of inputs also means lowering the amount of hired labor. Advancement to higher levels of IPM that embrace true horizontal and vertical integration involves substantial investments in time and labor to carry out practices that emphasize alternatives to pesticides, such as biological, cultural, and behavioral controls. Farmers at risk are usually unable to make such investments. Moreover, what marketplace advantage is to be gained by growing a crop using more advanced IPM practices, only to see the end product mixed with other produce grown under conventional (or integrated pesticide management) practices? The farmer receives no recognition for his or her efforts and probably incurs higher costs.

In our judgment, true integration of pest management practices has the greatest chance of succeeding among farmers who sell their produce locally or under their own brand name. Such produce can be identified with a particular farmer who is then able to build a clientele of consumers that appreciate the extra mental and physical effort that goes into using a higher level of IPM. Clients may even be willing to pay a premium price for this produce. Only a small percentage of U.S. consumers (probably less than 5 percent) might overtly welcome agricultural products grown under advanced IPM (possibly the same consumers who welcome organic products).

Until the corporations that control mainstream agriculture in the United States decide that it is in their financial or image interest to promote or demand vertically and horizontally integrated pest management, it is doubtful that the “I” in IPM will be much more than a symbol of hope.

RON PROKOPY

TRACY LESKEY

JAIME PINERO

JUAN RULL

STARKER WRIGHT

Tree Fruit IPM Laboratory

University of Massachusetts

Amherst, Massachusetts


Conserving wildlife

Congratulations to Issues for printing “Conservation in a Human-Dominated World,” by David Western, in the Spring 2000 issue. He makes the case very well that conservation practices work best when they have the enthusiastic cooperation of those who must immediately live with their consequences. The “command and control” approach has not only often failed to include the input of indigenous peoples, it has discouraged our input. Worse, our opinions and knowledge have often been dismissed, even condemned, as being inherently wrong and unqualified to be considered as part of any solution.

My experience with the collaboration efforts of the Malpai Borderlands Group has taught me that my urban counterparts are equally frustrated by their prescribed roles in conservation. Although they are able to exert majority will through the ballot box and legislation, the ultimate results are often not to their liking.

Here in Arizona, we now have more than a couple of generations on the land, in government, and in the cities that have grown up with “delivered” conservation. Whether we like it or not, it’s what we know. The “transitional vortex” that Western speaks of is real. Sudden top-down change that empowers local decisionmaking when the institutions, the laws, and, most important, the mindsets and life experience are not prepared for it will only exacerbate our current dilemma.

As Western points out, there are real examples where true grassroots, collaborative, inclusive conservation efforts are underway and functioning. Every effort should be made to build support systems around these efforts. Their successes will lead to their multiplication at a faster rate than we can imagine, but efforts to speed them along through government mimicry or appropriation will mean failure.

The frustration with the current way of doing business is palpable. The road to the future, although not yet clear, is becoming so. Western’s article describes the course succinctly. Are we up to the challenge?

BILL McDONALD

Executive Director

Malpai Borderlands Group

Douglas, Arizona


David Western’s article is a refreshing review of the issues that have a direct impact on meeting today’s conservation challenges. The interesting aspect of his paper is that there are no surprises; the points he raises are practical and full of common sense.

Most conservation organizations have been grappling with many of these issues for a number of years. Starting in the early 1980s, efforts at understanding what works (and perhaps more important, what does not) in integrating conservation and development have been the focus of a number of initiatives. In 1996, the World Wildlife Fund (WWF) and the Royal Netherlands Government (DGIS) joined forces to support seven integrated conservation and development projects spanning the tropics, with the specific goal of better understanding the factors that contribute significantly to successfully linking conservation and development. Through an iterative process of testing, monitoring, and reviewing project experiences, as well as through review of experiences from other integrated project approaches, four issues were identified as critical to successful integration.

1. Learn from doing. Plan, monitor, learn, and adapt. Early in the implementation process, know the questions to which you want to discover the answers. Know who needs what information to be able to make decisions beyond the life of the initiative. Practice adaptive management.

2. Policy environment and natural environment. Supportive laws, policies, and regulations must be in place for conservation and development to ultimately be successful and sustainable. Conservation initiatives cannot simply address field-based issues. They must take a vertically integrated view toward implementation, which means that advocating policy action and understanding change are as critical to their success as is infrastructure on the ground.

3. Leave something behind. Ensure that the capacity and confidence to make decisions and respond to change are in place. This is an important sustainability indicator. Build institutional capacity to train and develop skills and devolve management to those–be they communities, nongovernmental organizations, governments, or others–who will be ultimately responsible.

4. Tell the story. Communicate messages in an interesting and visual way. If efforts are to have an impact well beyond their area of immediate operations, then they must be able to capture the attention of those who do not have a direct or technical interest in the activities being undertaken. Use information and knowledge to influence others.

What is most notable about these “significant findings” is that they are little different from those presented by Western. Although the natural sciences are critical to better understanding how and why ecosystems function, addressing many of the root causes of biodiversity loss requires skill sets and experience–such as negotiation, facilitation, sociology, anthropology, public policy, human rights, food security, and so on–that are different from those traditionally associated with conservation organizations. Practitioners must either add such skills to their repertoire or look to form partnerships with others who have such experience. Hopefully, change is in the air.

THOMAS O. McSHANE

Coordinator

DGIS-WWF Tropical Forest Portfolio

WWF International

Gland, Switzerland


I lived in Kenya from 1958 to 1982 and have returned for frequent visits since then. I have watched conservation being transformed conceptually, operationally, and in many ways in between. All this is admirably described in David Western’s article. Much of the transformation, indeed, has stemmed from his visionary insights. What we have now is conservation allied to development, justice, and human welfare writ large. How different from the colonial hangover, when parks were run by retired colonels who saw conservation as a battle between “us,” the animals, and “them,” the local communities. Result: a war of attrition. Talk of cooperation between the two sides was viewed as treasonous parleying with the enemy. Park staff were trained as military personnel, their prime form of communicating with local people being a rifle. Some of this spirit existed as recently as a few years ago, with the policy of “shoot poachers on sight” leading to still more of an adversarial stance all round.

The eventual winner in conservation efforts has to be local communities. In 1971, a drought hit Tsavo National Park in Kenya. Large numbers of elephants died. The same drought afflicted the park’s hinterland and its human communities. Starving people were obliged to look over the park boundary and see thousands of elephant carcasses left to bloated hyenas and vultures. They were not permitted a single chunk of elephant meat on the grounds that any sort of wildlife use would betray the purist policies underpinning the park. In any case, a national park’s animals belonged to the national community, so local communities did not count. The aggrieved local people became opposed to all the park stood for and did nothing to help counter the subsequent grand-scale poaching of elephants (though they did not kill many themselves).

These people have told me over the years that they want to see an end to the park, and they have periodically succeeded in having portions excised. As elsewhere, the role of local people is pivotal: They are in a position to make or break a park. Much of the problem stems from the traditional concept of a park with static boundaries. It is one thing to draw a line on a map or in a warden’s mind. It is another thing to have the boundary recognized by migrating herbivores, birds, locusts, diseases, rivers, rainfall regimes, and wildfires, let alone pasture-short pastoralists and poachers. What is required, as Western graphically demonstrates time and again, is a more flexible arrangement for ecosystem conservation within a regional land use plan. What is certainly not required is the response of fencing off the parks in Leakeyesque style–a denial of basic ecology and socioeconomics. The fluid and perpetually adaptive approach will become all the more imperative as global warming starts to bite.

In an even longer-term perspective, there is need for conservation to safeguard the scope for evolution. A park protects the present pattern of species, which is altogether different from the future process of evolution, extending over millions of years and requiring much larger tracts of wild lands than the most expansive network of traditional parks.

I favor the prognosis of Jeffrey McNeely at the World Conservation Union. He proposes that in 50 years’ time, we may have no more parks, for one of two reasons. Either parks will be overtaken by outside pressures from, for example, land-hungry farmers and global warming, or we shall have learned to manage all our landscapes in such a rational fashion that we automatically make provision for wildlife and all other conservation needs. The time is coming when we can save biodiversity only by saving the biosphere.

NORMAN MYERS

Honorary Visiting Fellow

Green College, Oxford University


David Western’s cogent and eloquent discussion of the challenge of forging the connection among environment, development, and welfare, given an ever-increasing rate of change of principles and of environmental conditions, poses major questions for scientists who are interested and involved in conservation biology and biodiversity. His splendid example of the work in Africa, especially that of the Kenya Wildlife Service and the development of the Amboseli reserve and its extensions, illustrates the need for the combination of current ideas about conservation biology with the traditions of the local peoples. Of great concern to many of us recently embarked on attempted syntheses of science, policy, and management is the relative lack of communication among the several generators of such syntheses, despite declared good intentions to do so. The question of how to achieve that communication remains. Western’s examples of success are diverse. Some reflect grassroots initiatives; others are top-down government agency-driven efforts. All are relatively local and appropriately deal with the specific situation at hand. But do general principles exist that would facilitate the kind of communication necessary to make “conservation in a human-dominated world” understood and practiced? The rapid advance of the science is a problem: Western correctly notes that the recent shift in scientific paradigm to a more holistic and dynamic perspective, although welcome and essential, requires constant adaptation to new knowledge by people who are not familiar with the details of the science, so application is often far behind intent.

Western identifies several factors that are necessary to a general process. He doesn’t tackle what to me is the crucial one–the leadership and initiative that drive any process. Western is an example of an individual who has masterfully accepted the leadership role, and his success is laudable. Individual effort, with intellectual and pragmatic commitment, remains the key to driving change. Individual effort, however, is subject to the vagaries of government and other support and to the kind of application that any individual can sustain.

As Western comments, the involvement of a great range of institutions is necessary. But many gatherings of principals have formulated useful approaches on paper, with little or no application. In my opinion, it is essential that scientists, managers, policymakers, and the local and national governments that support them become knowledgeable about and involved in the centralized international efforts that currently exist (such as the DIVERSITAS program, co-sponsored by the United Nations Educational, Scientific and Cultural Organization and several international scientific unions), in order to produce science and policy, and, more important, to communicate them. These programs flounder without such support, and their efforts remain little known and of limited effect. They would foster leadership, communication, and coordination, and the effort to integrate conservation and sustainability would come closer to fruition–helping us to stop reinventing the wheel and losing critical time.

MARVALEE H. WAKE

Department of Integrative Biology

University of California

Berkeley, California


Next-generation fighter planes

Steven M. Kosiak’s “U.S. Fighter Modernization: An Alternative Approach” (Issues, Spring 2000) is a salutary entry in a policy debate that has too often featured surprisingly simplistic arguments from aircraft proponents and critics alike. Kosiak raises many key points that are frequently overlooked or glossed over by commentators on the major current U.S. fighter programs, especially the Air Force’s new F-22 Raptor.

For example, he recognizes that in most cases the capabilities of an aircraft have relatively little to do with when its airframe was originally designed. Although F-22 advocates tend to imply that having been conceived in the 1970s makes the F-15 and F-16 virtual antiques, current versions of these aircraft with modern engines, avionics, and weapons are extremely capable indeed. In fact, the recent decision to export a new variant of the F-16C to the United Arab Emirates has caused concern in some quarters at the prospect of the UAE possessing a fighter more capable than any in the U.S. inventory (but seemingly not enough to make the U.S. Air Force consider buying the relatively affordable plane itself, lest this reduce support for the F-22 or Joint Strike Fighter).

Then why buy the F-22 at all? The Air Force has been surprisingly ineffective in communicating the answer in its efforts to win funding for the plane, although it has tried up to a point: U.S. fighter requirements are not determined simply by the capabilities of the combat aircraft flown by potential enemies, but also by the need to be able to operate in the presence of very dangerous, modern surface-to-air missiles, such as the Russian S-300 (SA-10/12) series. Though Kosiak downplays them too much–the quality of the missiles being exported to the Third World matters more than their numbers–surface-to-air threats, not air-to-air opponents, are the best reason to acquire the F-22 instead of relying entirely on improved F-15s or F-16s that can never equal the Raptor’s stealth and sustained speed and thus survivability. (Indeed, because suppression of enemy air defenses has become so important in achieving air supremacy, one might expect this mission to figure more prominently in the F-22’s multi-role repertoire or elsewhere in the Air Force’s current acquisition plans, especially given the recently demonstrated shortfalls in U.S. defense suppression and jamming capabilities.)

The Raptor has other powerful selling points. It does have utility for strike missions, unlike the single-role F-15C, which had nothing to contribute to the Gulf War once the Iraqi air force had been swept from the skies and was irrelevant over Kosovo. Moreover, its speed, range, and data fusion capabilities mean that a wing of F-22s will be able to do the air superiority work of a much larger, and ultimately more expensive, force of F-15s. But if sound policy choices are to be made about building this aircraft and the other systems competing with it for limited defense resources, more analyses such as Kosiak’s will be required. Neither vapid and contrived sloganeering about air dominance nor facile assumptions that U.S. military capabilities can be neglected without eroding will suffice.

KARL MUELLER

School of Advanced Airpower Studies

Maxwell Air Force Base

Montgomery, Alabama


I was impressed with Steven M. Kosiak’s thoughtful analysis of fighter modernization options. He is probably right in saying that current programs will cost more than planned, resulting in revised production schedules. That is particularly true of the Joint Strike Fighter (JSF), which is by far the biggest and most complex of the three programs (not to mention the least advanced, in terms of its developmental state).

However, the answer is not to cut back all three programs and continue producing the 30-year-old F-15. The Air Force’s F-22 and Navy’s F/A-18 E/F are well along in their development, are top priorities for their respective services, and have already expended a large portion of their intended acquisition budgets. The cheapest, most prudent course of action would be to complete their purchase as planned during the current decade while delaying production of the less popular JSF for at least five years (as Kosiak recommends).

There are three reasons for sticking with the two smaller programs. First, we have no way of knowing precisely what threats the nation will face 20 years hence, and it is quite possible that the new technologies incorporated into F-22 and F/A-18 E/F will be necessary to prevail. Second, the post-production cost of operating the two planes is significantly less than that of their predecessors, saving money over the long run (most of the life-cycle cost of fighters is incurred after they are manufactured). Third, the kinds of cutbacks envisioned by Kosiak would cripple what is left of the domestic combat aircraft industry.

JSF is the biggest program in the Pentagon’s acquisition plans through 2020, and it is far from clear that the military services want or need the 2,852 planes currently planned. Kosiak’s proposal to wait and see on JSF makes sense, as long as we stick with the rest of the Pentagon’s fighter modernization program.

LOREN B. THOMPSON

Chief Operating Officer

Lexington Institute

Arlington, Virginia


As the United States enters the 21st century, its air power reigns supreme in fighter aerial combat, precision strike capability, worldwide rapid transit of forces and supplies, and protection for U.S. allied forces from air attack. To most, air power’s recent successes in Desert Storm and Operation Allied Force finally prove the value and decisiveness of our aerospace weapon systems and our balanced focus on readiness. However, the world is not static, so our plans must recognize and then leverage our strengths to ensure the best return on our limited investments. Steven M. Kosiak’s article provides a thought-provoking assessment but misses the mark on key judgments and draws imprudent, biased conclusions.

Opponents and potential enemies adjust. Today, at least six foreign aircraft threaten to surpass the performance of the 1970s-designed F-15 and F-16 fighters. These foreign aircraft are being marketed aggressively around the world to our allies and potential adversaries. Even a small number of these advanced fighters in a theater of operations would significantly threaten our existing forces and jeopardize mission success. An even greater threat is the increase in the number of advanced surface-to-air missiles. The Air Force’s F-22 and Joint Strike Fighter (JSF) programs are designed to ensure that U.S. forces will dominate the future battle space despite the introduction of these aircraft, defense systems, and other new weapon systems still in design.

Unfortunately, developing and fielding new weapon systems isn’t free. Cost-benefit analyses, tradeoffs with current readiness, projections of future sustainability, and procurement costs have been and continue to be considered in great detail and are fundamental to our program management. At the same time, we realize that without modernization investments, another price would be paid with the lives of America’s sons and daughters. Kosiak fails to include this dimension and advocates unnecessarily risking America’s military dominance and our warfighters’ lives to save less than 2 percent of the Department of Defense budget and less than 0.25 percent of the total federal budget.

All of Kosiak’s options result in fielded air forces that are less capable than under the current plan. To be balanced, would it not be fair to consider options that increase capability? America’s and the Air Force’s strength is in scientific and technical innovation, swift development, and industrial agility. The fighter forces that have proven to be one of the most flexible and effective tools in our arsenal best represent our progressive solutions. Should we not take advantage of this strength? Should we arbitrarily constrain ourselves when the revolutionary integrated designs of stealth, advanced propulsion, flight controls, structures, and avionics technologies in the F-22 and JSF affordably provide us untouchable capabilities? Should we cripple our scientific and technical communities through “Band-aid” modification programs of legacy systems and fewer stretched-out new development programs? Healthy, exercised, and challenged development teams have given us today’s revolution at an affordable price and, if fully supported, will keep our forces ready and able to engage when called upon and win.

COLONEL ROBERT M. NEWTON

Air Superiority Division Chief

U.S. Air Force

Arlington, Virginia


Plutonium politics

Luther J. Carter and Thomas H. Pigford’s “Confronting the Paradox in Plutonium Policies” (Issues, Winter 1999) does a great service in two respects. 1) It summarizes most issues concerning the disposition and management of weapons-usable materials of all three types: excess plutonium from nuclear weapons, reprocessed plutonium from civilian nuclear power, and highly enriched uranium. 2) The authors propose that the widely spread storage sites of these materials be consolidated into a concentrated network under international unified management.

I support the latter proposal as an intermediate measure prior to geological disposal of unreprocessed spent fuel from nuclear reactors, including those burning mixed oxide fuel, or of vitrified logs containing plutonium. In addition, blending of highly enriched uranium to low enriched uranium for use as commercial fuel should be pursued. Progress along each of these lines is inexcusably slow and deserves much higher priority and leadership on the part of the United States.

The article notes the adoption by the U.S. Department of Energy (DOE) of the Spent Fuel Standard, originally proposed by a National Academy of Sciences (NAS) study. It should be recognized that the standard in itself does not guarantee adequate protection of weapons-usable materials; it must be complemented by safeguards or other institutional barriers to meet an adequate overall standard of proliferation resistance.

A National Academy of Sciences committee set up to refine the concept of the Spent Fuel Standard and to express judgment on whether DOE’s present plans for disposition of excess weapons-usable material withdrawn from nuclear weapons meet the Spent Fuel Standard has issued an interim report. Carter and Pigford quote the doubts expressed in that report about whether the current can-in-canister approach adopted by DOE for immobilizing these materials meets the Spent Fuel Standard. Moreover, there are problems regarding the total inventory and the availability of sufficient quantities of highly radioactive fission products required for incorporation into the canisters.

The comments on Carter and Pigford’s article published previously in Issues have been largely supportive of the authors’ approach. There is now general consensus that the once-through light water reactor fuel cycle is not only the most proliferation-resistant approach to civilian nuclear power but is also the most economical. Michael McMurphy of COGEMA-USA points out that the once-through fuel cycle uses only a small fraction of the energy inherent in uranium, and he therefore advocates recycling. Although recycling indeed can recover more of the energy content of uranium, this is irrelevant as long as it remains uneconomical. How long this situation will persist is difficult to predict in view of uncertainties about the future demand for nuclear power and the possibility of extending the supply of inexpensive uranium, such as through recovery from seawater.

Because recycling demonstrably increases the proliferation risk, the U.S. policy against recycling and the discouragement of that approach internationally deserve support. In fact, recycling operations abroad are falling out of favor, and attention is rightly being focused on the management and disposition of the recycled plutonium stocks. Although I support the U.S. approach, I emphasize that there is no such thing as a fully proliferation-resistant approach to civilian nuclear energy. Proliferation resistance is a relative matter, and all nuclear fuel cycles have to be complemented by appropriate institutional safeguards. The approaches proposed by Carter and Pigford are instrumental in reducing the need for such safeguards.

WOLFGANG K. H. PANOFSKY

Director Emeritus

Stanford Linear Accelerator Center

Stanford University

Stanford, California


Luther J. Carter and Thomas H. Pigford promote three goals, each of which currently has a different degree of acceptance in the nuclear community: (1) removal of all excess nuclear weapons materials, which substantially reduces proliferation risks and is universally supported; (2) ending reprocessing, which is supported by organizations and individuals convinced that reprocessing increases proliferation risks and opposed by nations and individuals reluctant to discard a valuable energy source; and (3) implementing a limited international network of storage and disposal facilities for plutonium wastes and spent fuel, which is generally recognized as a desirable option, although various groups question its near-term feasibility.

Among the examples the authors suggest for the creation of a global network of storage and disposal centers is the Pangea concept for deep geologic repositories. As people directly involved in Pangea, we would like to add some comments intended to update the information in the article and to foster wider discussion of the complex topic of international or regional storage and disposal. Pangea Resources International is now headquartered in Switzerland, with its first regional office in Australia. Although the initial full feasibility studies began, and continue, in Australia, other regions of the world are also being considered. We are confident that more than one international repository will be considered.

Safeguards today function well where they are properly implemented, but we, like the authors, foresee a future of continually increasing amounts of excess nuclear materials from the dismantling of nuclear weapons and from civil nuclear industry activities. A global system of a few large facilities in isolated areas for the storage and disposal of fissile materials under multinational scrutiny would be preferable to many small facilities, often located in less-than-ideal areas. The selection of host countries, sites, and operating procedures for such facilities can be optimized for safeguards. The host country must present solid nonproliferation credentials; it must also be willing to accept stringent continuing oversight by the international community to enhance confidence. The site can be chosen at a remote, easily monitored location that would facilitate detection of diversion attempts. And the design, construction, and operation of the facilities can be carried out with advice from safeguards experts to optimize the nonproliferation and security aspects.

What could enhance the near-term feasibility of developing international repositories in a suitable host country, especially in light of the significant challenges of public and political acceptance? As Carter and Pigford point out, there certainly will be strong economic incentives, but these alone are not sufficient. A host country may be attracted by the possibility of providing an invaluable service to the world by reducing the danger of proliferation of nuclear materials. Although there are basic environmental and economic reasons for supporting international storage and disposal of fissile materials, the most valuable aspect of this service may be its vital contribution to safeguards and nonproliferation. Pangea is committed to transforming this possibility into reality and urges that serious consideration be given to all international storage and disposition proposals that promote this goal.

CHARLES MCCOMBIE

Chief Executive Officer

RALPH STOLL

Vice President

Pangea Resources International

Baden, Switzerland


The evolving university

As a biologist, I tend to think in broad evolutionary terms. Mutations give rise to new possibilities, some of which thrive and persist; most prove to be maladaptive and quickly disappear. Sudden cataclysmic events can lead to the elimination of even the best, and in the end the fittest survive.

Looked at through this lens, the developments in university structure described by the president of the University of Phoenix, Jorge Klor de Alva, appear to raise as many questions as they provide solutions (“Remaking the Academy in the Age of Information,” Issues, Winter 1999). The modern perspectives are all there: technology, accountability, dramatic changes in demand, productivity, time efficiency, customer service, bottom-line accountability, and, most important, profit. Do more with less, and provide the consumer with the specific skills demanded by employers. Reduce or eliminate the costs of faculty, who after all are interested only in their own needs, and convince the public that real libraries can effectively be replaced by online collections of documents.

An important part of selling this new model of “higher education” is convincing the public that faculty in the rest of higher education are indifferent and the curricula they design are disorganized and illogical. So much for Harvard and the rest of traditional higher education.

At Phoenix, unlike some more modern incarnations that have no faculty at all, most of the faculty (called “faculty facilitators”) are adjuncts. The argument is made that this cadre of part-timers who receive no benefits needs no time to prepare for class because they “teach at night what they do during the day.” The fact is that all good teaching requires adequate preparation time. And the argument about teaching in the area in which one works falls apart when it is philosophy that is being taught at night.

Twenty-five percent of “instruction time” in classes at Phoenix occurs in student groups that meet without an instructor. The new world of higher education suggests that a faculty member should be the “guide on the side” rather than the “sage on the stage,” helping the knowledge students bring with them to class burble up from the depths. Such a situation may prevail in business, but I have found that students bring a remarkable lack of specific information gained in their world of experience to my class in biochemistry. Phoenix was called to task and recently reached an out-of-court settlement with the providers of federal student aid because in-class time with instructors was inadequate to meet federal requirements for the assistance given to students.

President de Alva would point with pride to the consistency of courses offered by Phoenix at sites all over the United States and in Canada. Courses are defined by committees of faculty adjuncts, with the time to be spent on each topic specified in minutes. But it is in striving for the best rather than settling for a lower common mean that excellence emerges. It is in an environment where student and teacher use the known to examine the unknown that real higher education happens. And it is not in the measurable mastery of content that real creativity is fostered. That happens when students come to realize that they can not only master the known but imagine the new. The uniqueness of higher education is that it has involved a dimension of engaging the unknown (research), a dimension as important at a community college, where practical education is more central to the mission, as it is at the most elite Ivy League or Big 10 institution.

The most distinctive characteristics of higher education that have served us so well as a society since 1776 are the two underlying features of academic freedom/tenure and collegial governance. Both academic freedom (the right to examine the popular and the unaccepted in the classroom and the laboratory) and collegial governance (the notion that academic institutions are governed by all component parts and that faculty develop and teach the curriculum) are threatened by the new “visions” of what higher education should be. It is hard to imagine academic freedom existing at institutions where the content of teaching is prescribed, or collegial governance in places where no real faculty exist.

Perhaps the University of Phoenix will be seen in time to have been a pioneer in higher education. But the biologist in me can’t help but feel considerable skepticism about whether it is really fit and will survive. It certainly is not a substitute for traditions that have served us so well over time.

JIM PERLEY

Chair, Committee on Accreditation

American Association of University Professors

Department of Biology

College of Wooster

Wooster, Ohio


Vol. XVI, No. 4, Summer 2000