Forum – Fall 1998
International science cooperation
In “Toward a Global Science” (Issues, Summer 1998), Bruce M. Alberts highlights four principles that guide the international activities of the U.S. National Academy of Sciences. They relate to the role of science in strengthening democracy, facing the challenge of population expansion, spreading the benefits of electronic communication, and assisting national policymaking. Among these, I would like to dwell briefly on his statement that “new scientific and technological advances are essential to accommodate the world’s rapidly expanding population.”
Alberts has warned that a potential disaster is looming in Africa. Although this is correct, my country, India, is likely to face even more serious problems. Our population is still growing at about 2 percent per year and may reach 1.2 billion by the year 2020. However, there are states within India such as Kerala, Tamil Nadu, Goa, and Andhra Pradesh that have shown that through attention to the education and economic empowerment of women and better health care services, including attention to reproductive health and delivery of socially acceptable contraceptive services, the desired demographic transition to low birth and death rates can be achieved. In addition, a committee of the government of India that I chaired, set up to draft a national population policy statement for adoption by the Indian Parliament, recommended that population issues be dealt with in the context of social development. We suggested the preparation of sociodemographic charters by village-level democratic institutions as a tool for sensitizing the local communities to the population-supporting capacity of their ecosystems.
An urgent task facing scientists is the standardization of technologies that can help increase crop and farm animal productivity under conditions of shrinking per-capita arable land and irrigation water resources and expanding biotic and abiotic stresses. Population pressure is also causing increasing damage to the ecological foundations that are essential for sustainable advances in biological productivity. These challenges can be met only by mobilizing frontier science and technology, particularly in the areas of biotechnology, information, space, and renewable energy technologies and blending them with traditional technologies and ecological prudence. Such hybrid technologies can be referred to as ecotechnology.
Recent advances in genomics and molecular breeding have opened up great opportunities for producing novel genetic combinations conferring a wide range of useful traits, including resistance and tolerance to pests and diseases. However, there are also well-grounded apprehensions in the public mind about the safety as well as the nutritional and ethical aspects of genetic engineering. This is where Alberts’ suggestion that scientists make use of the possibilities of electronic communication assumes importance: it is obvious that there is a need for greater efforts to promote wider public understanding of the implications of genetic engineering. The positive impact of educational efforts is clear from the results of a referendum held in Switzerland on June 7, 1998, on the question of whether the production and distribution of transgenic animals and field trials with genetically modified organisms (GMOs) of any sort should be banned. More than 66 percent of the people voted against outlawing genetic alteration of animals and the release of GMOs into the environment. The referendum’s proposals for a ban on GMOs were rejected in all 26 cantons of the country. The results were the opposite of what was widely expected. Heidi Diggelmann, president of the Swiss National Science Foundation, attributed this unexpected turn in the public perception of genetic engineering to widespread efforts by researchers to talk to the people in the streets.
The global science that Alberts talks about should emphasize that we should neither worship nor decry a technology because it is either old or new. What is important is to promote the development and dissemination of technologies based on sound principles of ecology, economics, gender and social equity, and ethics.
I read Bruce M. Alberts’ thought-provoking article with great interest. From personal contact with him, I am aware of his sincere concern for the promotion of science internationally. In general, I am in agreement with most of what he propounds. My association with the Population Summit, a meeting of the world academies of science held at New Delhi in 1993, and the subsequent establishment of the InterAcademy Panel (IAP) has convinced me that even very complex issues of global concern can be dealt with in the true spirit of scientific debate, and that a consensus can be reached among scientists from diverse socioeconomic, cultural, geographical, and political backgrounds, which can then be pursued with national governments and international organizations as the collective voice of the scientific community. I therefore fully share Alberts’ dream for the IAP to become recognized as a major provider of international advice for developing nations. I presume that in his eagerness to help the less privileged he has omitted to mention the developed nations, who also can benefit from politically neutral, purely scientific collective wisdom. To give an example he has himself referred to, the Population Summit, which enunciated the urgent need to control the ticking demographic bomb in the developing countries, equally forcefully warned against the wasteful production and consumption practices in the developed world. This in turn has caused the U.S. National Academy of Sciences, the Royal Society of the United Kingdom, and more than 50 other national academies to bring out a joint statement, “Towards Sustainable Consumption.” The opinion of an isolated group of scientists from any country of the world could not carry such conviction as the joint statement of all these 55 academies.
No one would contest Alberts’ statement that “As scientists, I would hope that we could lead the world toward more rational approaches to improving international development.” When one recognizes that science and technology (S&T) provide the most crucial means for development in today’s world, it follows that a vital national S&T enterprise is essential. Yet evidence shows that the gap between the S&T knowledge and competence bases of the developed and developing countries has continued to widen. According to the Science Citation Index 1994, 80 percent of the world (the Third World countries) contributed only 2 percent of the scientific literature published in indexed journals. To mitigate the many current maladies threatening the planet that are unmindful of national boundaries, international collaborative effort is necessary; but for it to succeed, state-of-the-art scientific capabilities and infrastructure in all participating countries are essential. Sharing information with the help of modern information technology as proposed by Alberts is most welcome as long as it is recognized that in order to use information, there first has to be an indigenous science base of high quality. Empowering the scientists in the developing countries is, in my opinion, the first step in promoting the genuine partnership envisaged by Alberts. The rest will no doubt follow.
I cannot resist quoting a little-known but very poignant and pertinent statement by Mahatma Gandhi, published in 1936: “When Americans come and ask me what service they can render, I tell them, if you will dangle your millions before us, you will make beggars of us, and demoralise us. But in one thing I do not mind being a beggar. I will beg of your scientific help.” This applies to all developing countries today and should be the first step in our efforts to globalize science.
In a world full of conflicting cultural values and competing needs, scientists share a powerful common culture that respects honesty, generosity, and ideas independently of their source, while rewarding merit. By working together internationally, scientists can better use their knowledge to benefit humanity. This theme is very well put in Bruce M. Alberts’ article.
The advance of science and technology promotes the progress of civilization, democracy, and the improvement of legal systems. An improved legal system, in turn, guarantees the development of science and technology. One of the most important requirements for the construction of knowledge-based economies is respect for human initiative, equality, and cooperation.
The solution to the problems of growing world population, dwindling resources, and the worsening environment lies in raising human awareness and in the advancement of science and technology. That will lead to the formation of a diversified development pattern throughout the world.
As Alberts points out, the spread of scientific and technological information throughout the world, involving a generous sharing of knowledge resources by our nation’s scientists and engineers, can improve the lives of those who are most in need around the globe. The global economic system, the eco-environment system, and the science and knowledge system are integrated. Efforts should be stepped up in South-North and South-South cooperation, especially cooperation among scientists to eliminate poverty, disease, terrorism, violence, drug abuse, injustice, and damage to the environment. A new world order should be set up to bring into being justice, peace, equality, and development patterns that respect different national cultures for a common bright future of humankind.
Bruce M. Alberts is an outstanding academician and has always been a man of vision. It is reassuring to find that he continues to offer attractive and thought-provoking proposals for an increased role of global science in international affairs that are in keeping with the precise historical moment we are witnessing.
Alberts advocates efforts by scientists and academies throughout the world to create and consolidate a scientific network that can “[become] a central element in the interaction between nations, increasing the level of rationality in international discourse, while enhancing the influence of scientists everywhere in the decisionmaking processes of their own governments.” This proposal, in a world plagued by regional and global conflicts, is not only pertinent but a matter of survival.
I also agree with Alberts’ view that there are several reasons why the U.S. State Department and similar departments in every country should incorporate more science and technology issues into their foreign policies. As he points out, 1) science is certainly a powerful force for promoting democracy, 2) new scientific and technological advances are essential for meeting the needs of the world’s rapidly expanding population, 3) electronic communications networks enable a new kind of world science, and 4) scientific academies can be a powerful force in sensible policymaking.
I strongly believe that the world’s scientific academies should play a more strategic role in helping governments and global society to make sensible decisions concerning our regional and global problems. As members of these academies, we in the scientific community should stress the importance in our respective countries of consolidating the InterAcademy Panel. This panel was the result of a 1994 effort by 60 academies to form an international consortium of academies. We should be able, as part of this consortium, to enhance bilateral and multilateral agreements between members and governments or international organizations. As Alberts notes, this kind of organization will stand a better chance of offering international advice to all of the world’s societies.
Bruce Alberts’ essay is a most appropriate statement that reflects how science and technology are becoming increasingly intertwined with major global issues. In my view, however, he could go much farther to recognize how new technologies may alter the progress of science and to identify the problems of relating scientific and technological factors to the conduct of foreign policy.
Regarding the former issue, it seems that we are on the verge of a new era in which the ability of scientists to engage in cooperation across borders may be fundamentally different than in the past. The often-overhyped information revolution will, in this case, make it easier, less costly, and more effective for scientists to work together on a real-time basis without the need for extensive travel or for raising additional resources. Even large experimental equipment will be able to be shared from the comfort and convenience of a scientist’s own laboratory. These changes are already in train, but we are only at the beginning of what will be possible. The U.S. National Academy of Sciences will and should have a major role in smoothing the way to this new level of cooperation.
The problems of the latter issue are not so easily resolved. There have been many attempts in the past to improve the U.S. State Department’s ability to include scientific and technological factors effectively in the policy process. Most have been only marginally successful. The leadership of the department today recognizes the weakness of the past and the importance of making a new effort. The increasing relevance of technical factors in central policy issues makes that mandatory.
There is no magic bullet to meet the need. This is especially so when the public is apparently less interested in or concerned about foreign affairs, and the State Department is increasingly beleaguered by draconian budget and personnel cuts at the very time when U.S. global responsibilities are growing. Assembling advisers nationally and internationally as Alberts suggests is only part of the needed response. More to the point is the creation of a structure that is able to interact with the department to provide advice that recognizes the intricacies of the policy choices the department and the nation face. Equally important is the development of greater sensitivity to the scientific dimensions of policy issues on the part of the Foreign Service.
Meeting these challenges is a task that only the scientific and technological communities and the State Department can work out together, and it requires the direct interest of the department’s leadership. For that reason, it is particularly encouraging that in this latest attempt to cope with the issue, the Secretary of State has turned to the Academy to ask for help.
The productivity paradox
I have only admiration for the two fine articles on computers and productivity growth in your Summer 1998 issue (“Computers Can Accelerate Productivity Growth” by Robert H. McGuckin and Kevin J. Stiroh and “No Productivity Boom for Workers” by Stephen S. Roach). However, there are two matters that I believe require qualification. The first of these articles makes an appropriate distinction between average labor productivity (ALP) and total factor productivity (TFP), implying that the latter is a superior measure and that the main reason for use of the former is its easier computation and the readier availability of the requisite data. The authors then go on to discuss the difficulty of accounting for changes in product quality in productivity growth calculations, implying that any index of productivity that does not adjust for quality changes is per se inferior.
This is misleading. Both ALP and TFP provide valuable information, but information that is useful for different purposes. Moreover, even an index with absolutely no adjustment for quality provides very useful data if employed for appropriate purposes.
True, by definition, only TFP tells us directly about the growth in the productive capacity of the full set of productive resources the economy possesses. But it is ALP that comes closer to the issue of the economy’s ability to increase living standards. The explanation is simple. Economic living standards are measured by output per capita, that is, total output divided by total population. If the percentage of the population that is employed remains relatively constant, then it follows that output per capita must grow at the same percentage rate as output per worker; that is, as fast as ALP. It does not matter for this purpose whether that growth stems from more plant and equipment, better technology, or something else. It is ALP, not TFP, that tells us how living standards are doing.
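To make that reasoning explicit, here is a minimal worked identity in my own notation (Y for total output, L for employment, N for population), not the letter writer's:

% Decomposition of living standards; notation is illustrative, not the letter's.
\[
  \underbrace{\frac{Y}{N}}_{\text{output per capita}}
  \;=\;
  \underbrace{\frac{Y}{L}}_{\text{ALP (output per worker)}}
  \times
  \underbrace{\frac{L}{N}}_{\text{employment share}}
\]
% In growth rates the identity becomes additive; if the employment share L/N
% is roughly constant, its growth rate is near zero.
\[
  g_{Y/N} \;=\; g_{Y/L} + g_{L/N} \;\approx\; g_{Y/L}
  \quad\text{when } L/N \text{ is approximately constant,}
\]

so living standards grow at essentially the ALP rate, whatever mix of capital deepening and technical change lies behind that growth.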
Quality-unadjusted productivity also is a useful measure because it is an important indicator of cost trends and budgeting needs. For example, the education budget of a city will depend on trends in the number of teachers per student, which is, of course, a measure of (teacher) ALP, one that is totally unadjusted for teaching quality. Obviously, the quality does affect the value of the outcome crucially, and in the longer run it will probably affect costs. But in budgeting for the next three years, it is quality-unadjusted productivity growth that is the far more relevant measure, and one that is critical for many economic activities.
On the surface it looks as if the optimistic article by Robert H. McGuckin and Kevin J. Stiroh contradicts the pessimism of Stephen S. Roach, and debunks the “computer paradox.” But that is not really so at all. In fact, McGuckin and Stiroh confirm the paradox; but they add some new and interesting detail.
I don’t think anyone ever doubted that the spreading use of computers and robots in manufacturing would boost labor productivity directly. That is shown very nicely in Chart 1. A new labor-saving capital good saves labor in those industries where it is applied. It is also interesting that McGuckin and Stiroh find no correlation across industries between increased computer use and the rate of TFP growth. The first of these findings is a good but standard piece of economic analysis. The second brings us back to the computer paradox, and in fact strengthens it.
The insight that computers function just like other capital goods in manufacturing suggests some further research. There are well developed ways of analyzing the interplay of labor and capital (and intermediate inputs) in production. They involve isolating and estimating incremental productivities, degrees of substitutability and complementarity between inputs, and other characteristics of technology and cost. It would be very useful to go further and look more closely at the ways computers resemble other kinds of capital goods, and the ways they differ.
The story is quite different when it comes to the service-producing industries, where most of the computers actually are. Chart 2 tells the story fairly emphatically. There is no convincing way that Chart 2 can be explained away by mismeasurement of service-sector output. In the first place, as Roach shows, mismeasurement can cut both ways. Even on the quality-of-output side, one hears plenty of complaints about the deterioration of service. There is no reason to suspect an asymmetry in statistical measurability.
For mismeasurement of output to come anywhere near justifying the presumption that “true” service-sector productivity has behaved like manufacturing, one would have to believe that the underestimation of productivity growth in the service sector has widened after 1973 by as much as 4-5 percent a year over and above the degree of underestimation already present before 1973. That does not seem plausible. And then there is Roach’s point about the underestimation of working hours.
I think we just have to keep at trying to measure and understand the course of productivity growth. Maybe one lesson of the computer paradox is that drama and productivity are not the same thing. Indoor plumbing changed our lives too, but its effect on productivity was probably limited.
Safer guns
Efforts to develop more technological safety options, including personalizing guns, are to be encouraged for the small percentage of handgunners who have interest in those features, as suggested by surveys and buying habits. But in “Making Guns Safer” (Issues, Summer 1998), Stephen Teret et al. use rhetoric more than science to suggest that it would be fair for the government to ban future sales of handguns lacking a currently nonexistent personalization technology that could be dangerous if it did exist (especially if it were not on all guns), in the dim hope of reducing a small portion of gun misuse.
That portion is made to seem larger by the use of tracing data to assert that most crime guns are relatively new, so prospective personalization will be effective quickly. But those data are based on traces, which are disproportionately attempted on newer guns where the success rate of trace attempts may approach 50 percent; it’s traced guns that are new, not crime guns. Even were the latter the case, criminals could adjust to changes in technology. Why should personalization prevent criminals from stealing and misusing guns? Motor vehicles are personalized and are about three times as likely as guns to be stolen and misused; criminals defy personalized residences almost five times as often as they steal and use guns. Hacking into computers, the only other commonly “personalized” household product, has become an adolescent hobby.
Besides failing to curb criminal misuse of guns, mandatory personalization is dangerous for a number of reasons. The idea was originally proposed for police firearms, since, unlike the guns of ordinary citizens, police guns are relatively often taken and misused against the officer. However, police express concerns that would make personalization unacceptable. The personalization would have to be much broader than one-person/one-gun: The same device would have to allow the use of all guns an individual might need and by all persons by whom a particular gun might properly be used. And the multiple personalization could not acceptably slow use at all, not by so much as one-hundredth of a second, not by the time needed to read a fingerprint or detect a signal, and certainly not by having to place a finger, ring, etc. at a precise point.
Furthermore, police, like citizens who have guns for protection, insist that the fail-safe position would have to be that the gun will fire, not that it won’t. A dead battery must not prevent firing. To ensure reliable use, gun owners would simply defeat the personalization feature by using heat, cold, or gunsmithing, or by storing activating devices near personalized guns, in the same way that Teret et al. note that a minority of gun owners use “unsafe storage” now in order to have guns readily available for protection.
Personalization could endanger lives by reversing the traditional firearms safety training that all guns be treated as loaded and potentially dangerous. It would encourage carelessness about storing and playing with loaded guns by creating the false assumption that personalization would be both universal and effective.
Recent events in places as diverse as a school in Jonesboro, Arkansas, and our nation’s Capitol building have forced us to confront again the alarming level of gun violence that has woven its way into the fabric of U.S. society. We live, in both cities and suburbs, with an unacceptable level of violent crime. No other weapon is used in the commission of those crimes with anything near the deadly frequency of a gun. The statistics are mind-boggling. In 1994, handguns were used to murder 13,593 Americans. In 1993, people armed with handguns committed more than 1.1 million violent crimes, although from 1987 through 1990, victims used firearms to protect themselves in fewer than 1 percent of all violent encounters. Perhaps more startling, from 1981 to 1990, 85 percent of the police officers who were killed with handguns did not discharge their service weapon. Guns are just too easily accessible to those who misuse them.
“Making Guns Safer” tells the tale. In addition to out-of-control violent crimes involving handguns, this country lost 36,000 citizens to gunshot wounds in 1995, including 5,000 who were 19 or younger. In that same year, over 1,400 children used a gun to commit suicide. We know that in homes where a gun is present, in addition to accidental shootings, the risk of suicide increases fivefold and the risk of homicide triples. The annual cost of firearm injuries in pain, suffering, and lost quality of life is estimated to be over $75 billion, and the human toll is incalculable.
It’s time to take a long hard look at advances in technology, such as personalized handguns, that can realistically reduce handgun violence of all kinds. The concept is simple: The gun operates only for the rightful owner. It makes stolen guns useless to criminals. It makes spur-of-the-moment suicides and accidental shootings far less likely, and it tramples on the rights of no man or woman who wishes to legally own and operate a gun.
As a state legislator, on April 17 of last year I introduced a bill in the New Jersey legislature that would permit the sale of only personalized handguns after three years. The bill was assigned to the Law and Public Safety Committee and went nowhere. Your readers should realize that, if they believe personalized handgun technology can save lives, they must contact their state and federal representatives and express their support for legislation requiring its use. They can be sure that without strong public support, in the face of constant and strenuous lobbying by the National Rifle Association against such measures, sensible laws like one requiring personalized handguns will never become the law of the land.
Stephen Teret and his associates offer a perspective on the benefits of making guns safer by personalizing them. They note that a number of technologies are now being developed that would prevent anyone from firing a gun who lacked the requisite magnetic ring or transponder or fingerprint. The authors emphasize the value of such devices in keeping a youth from unauthorized use of his or her parent’s gun, thereby reducing the chance of a serious accident, suicide, or schoolyard shooting.
In the near term, personalized guns may be too costly to achieve much market penetration. However, as Teret points out, either litigation or government regulation could hasten the process. Indeed, a regulation requiring a device of this kind may well pass the cost-benefit test. Suppose that a handgun with a personalized locking mechanism would sell for $200 extra. The implicit value of a statistical life in safety regulations is often $2 million or more. Thus, if just one life were saved for every 10,000 personalized guns sold, then mandating this technology would arguably be worthwhile.
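As a back-of-the-envelope check of that claim, using only the figures given in the paragraph above (my arithmetic, not the letter writer's calculation):

% Break-even check: $200 extra per personalized gun, value of a statistical
% life (VSL) of roughly $2 million or more, one life saved per 10,000 guns sold.
\[
  \text{added cost per statistical life saved}
  \;=\; 10{,}000 \times \$200 \;=\; \$2{,}000{,}000,
\]

which is at or below the $2 million-plus value of a statistical life commonly used in safety regulation, so under these assumptions the mandate roughly breaks even or better.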
But as valuable as it may be to save lives by blocking intrafamily transfers, in the long run the greater benefit of personalizing guns may come from its effect on the black market. Over half a million guns are stolen each year from homes and private vehicles. These guns may be kept for the thief’s personal use or transferred to someone else for money or drugs. The influx of stolen guns to the informal market enhances gun availability to those who do not wish to purchase from a licensed dealer, making it cheaper and easier for youths and criminals to go armed. Although it is impossible to say what fraction of the million-plus gun crimes committed each year involve stolen guns, there is reason to believe that it is quite large.
Personalized guns would be of no use to someone who lacked the necessary device for releasing the safety. In particular, such a gun would be without value on the black market, unless it were possible to circumvent the locking system cheaply. Thus one aspect of the design challenge for personalized guns is to make it difficult to modify the locking device without authorization. If the technology is successful, years from now a burglar who discovers a newish handgun in a dresser drawer would leave it there, rather than (as now) viewing it as prize loot, the near-equivalent to cash on the black market.
Teret et al. cite statistics on unintentional shooting deaths, but they do not mention that those numbers have been falling even as the number of privately owned guns has risen. For example, there were 181 fatal gun accidents involving children under 15 in 1995, down from 530 in 1970. This change is largely attributable to the popularity of handguns, which have tended to replace deadlier rifles and shotguns as home defense weapons.
Gun personalizing technology might be attractive to a few consumers, particularly for use with concealed-carry pistols, but doubts about long-term reliability are a negative consideration. I would not wish to discover at the least opportune moment that corrosion or an old battery in the mechanism has rendered the gun inoperable.
Would a special ring on my gun hand identify me as possessing a concealed weapon? Suppose I have to switch the gun to my other hand? Do I get an assortment of ring sizes for my wife, my adult son, and a friend who asks to try out my 9-mm pistol at the range? Is it at all plausible that criminals would be unable to disassemble stolen guns and circumvent their personalizing mechanisms? Wouldn’t the extra cost of personalization be rather daunting to less affluent gun purchasers, who are only able to afford an inexpensive weapon as it is? Altogether, it seems unlikely that this technology would be a very popular choice.
But of course, Teret et al. are not talking about choice; they are talking about compulsion. From their “children are killing children” opening line to their concluding advocacy of spurious lawsuits, their article is not really about safety or technology. It is about stigmatizing gun owners, harassing manufacturers, and erecting barriers to the possession of effective weapons for self-defense. Yes indeed, the Consumer Product Safety Commission lacks jurisdiction over firearms, precisely to block its use as a vehicle for such mischief.
Farming and the environment
David Ervin is to be congratulated on clearly, directly, and accurately addressing the problem of agricultural water pollution (“Shaping a Smarter Environmental Policy for Farming,” Issues, Summer 1998). The political and economic power of the farm community does not entitle it to contaminate the nation’s lakes and streams.
The remedies that Ervin suggests will be hard to implement. As he recognizes, the market mechanisms used in other contexts, such as trading pollution rights, may be difficult to apply to agriculture. The location of agricultural pollution sources is likely to matter, which in turn will sharply limit the number of potential participants in any kind of market. However, as Ervin also notes, there are a number of “win-win” technologies, such as no-till, that can benefit both the environment and the farmer. One hopes that the research he advocates will develop more such solutions.
David Ervin presents a case for “smart regulation,” which would set measurable agricultural pollution goals and firm deadlines for a variety of voluntary incentives. Failure to meet deadlines would bring about penalties, and excessive damages would bring about civil fines. Anyone falling below minimal good-neighbor performance would not receive any payments. Green payments would reward those who go beyond minimum performance.
Innovative approaches are certainly needed, as regulation by way of direct controls has not worked well. There are too many ways in which the regulated community can wear down the regulator, rendering it more captive than enforcer. Enforcement itself is so unpopular that the enforcer becomes a very reluctant civil servant.
In my view, prescriptive approaches to pollution regulation are inadequate to the enormous tasks facing society today. The problems are fundamental, and they require fundamental responses if society is to gain a more effective handle on burgeoning problems.
Understandably, regulation has proceeded backward. The Clean Water Act and pesticide laws were passed to clean up what was polluted and to try to prevent further pollution. Despite progress, it’s been a catchup game as new problems arise and solutions confound regulators. Penalties tend to be inadequate to address the magnitude of the pollution or are not even imposed. Too often the public picks up the tab.
We need to get ahead of problems to the fullest extent possible. Baseline data are needed to ascertain progress or deterioration. In addition, identification and monitoring of hot spots are needed to evaluate progress and make midcourse corrections. We need to understand what is happening and why. Above all, agriculture needs a concerted and sustained pursuit of fundamental understanding, so that the inherent strengths of the managed ecosystem can be used with more modest and intelligent inputs than in the past. Ecologically based pest management would improve ecosystem health not by treating symptoms but by integrating many components that maximize use of natural processes with minimum development of resistance.
Smart regulation would follow if the fundamental R&D and pollution prevention R&D that Ervin discusses come about. Farmers would be able to replace polluting technologies with less invasive methods. Accountability, as Ervin points out, is key. Also, as he urges, farmers who deliver environmental benefits beyond their community should definitely be recognized and rewarded. The president should establish an awards ceremony at the White House to recognize the 10 cleanest farms in the United States.
I hope the points Ervin makes become a central part of the debate about how to reduce agricultural pollution.
Reinventing environmental regulation
In “Resolving the Paradox of Environmental Protection” (Issues, Summer 1998), Jonathan Howes, DeWitt John, and Richard A. Minard, Jr., say that “EPA’s central challenge is to learn to maintain and improve a regulatory program that is both nationally consistent and individually responsive to the needs of each state, community and company.” The authors’ numerous recommendations for resolving that paradox largely involve the U.S. Congress and the U.S. Environmental Protection Agency (EPA). Although I agree with the authors’ call for greater sensitivity to state and local needs at the federal level, I believe that the paradox will ultimately be resolved only by looking beyond Washington, D.C., for solutions.
When much of today’s environmental regulatory program was put into place nearly three decades ago, there were good reasons to centralize that program in Washington. Environmental damages were visible and the major polluters were identifiable and easy to regulate. Without question, progress has been made on regulating the conditions that system was designed to address, such as improving the air quality in Los Angeles and the water quality of the Great Lakes. In some instances, regulation at the federal level remains the best approach; for example, the control of motor vehicle emissions requires common standards throughout the nation. In many other instances, however, past polluters that are now thoroughly regulated have become less significant, whereas individually small emissions from many dispersed sources are the problem that must be addressed.
Some assert that, to address the changing nature of environmental problems, the current regulatory system must be made even larger and more bureaucratic. I believe that a more effective approach would be to redirect these issues to whoever is in the best position to find innovative solutions, including responsible states, companies, and communities.
There are reasons why such a new approach would be effective today. States that 30 years ago had limited expertise in environmental protection now have viable and competent state environmental agencies. In the past three decades, many environmentally conscientious companies have incorporated environmental stewardship into their business practices and adopted environmental management systems such as ISO 14000. Finally, after 30 years of environmental education, citizens have become more knowledgeable about the environment and are actively concerned about their communities. The time has come to assess how much of the responsibility historically wielded by EPA can now be more appropriately assumed by responsible states, progressive companies, local communities, and involved citizenry.
Jonathan Howes, DeWitt John, and Richard A. Minard, Jr., should be congratulated on succinctly articulating the major tenets of environmental reinvention and summarizing some important innovations undertaken by EPA and the states. These ideas reflect not only National Academy of Public Administration (NAPA) studies but ideas emanating from many involved in the reinvention policy arena over the past five years, such as the Aspen Institute; the Yale Next Generation Project; the Center for Strategic and International Studies Enterprise for the Environment; the President’s Council for Sustainable Development initiative; numerous individuals and activities at EPA; and, last but not least, creative state governors, environmental commissioners, and their staffs. The recommendations in the article parallel many of those coming from a series of publications issued over the past four years by the National Environmental Policy Institute’s (NEPI’s) Reinventing EPA and Environmental Policy project. This is not unexpected, as many of the same individuals and institutions are represented in these efforts, resulting in significant cross-fertilization of ideas.
In particular, the authors should be commended for outlining specific elements of an integrating statute. Two years ago, NEPI published a report with many similarities to these recommendations, entitled Integrating Environmental Policy: A Blueprint for 21st Century Environmentalism. It outlined an extensive and far-reaching approach, drawing on the advice of many of the leading public and private officials in environmental policy over the past 25 years.
One aspect of the article that deserves greater attention is the lack of progress made to date, due in large part to fierce partisanship on Capitol Hill. This is not unexpected, as it reflects the top-down, command-and-control manner in which we have developed our environmental laws, regulations, and policy. Whoever controls the established order in Washington controls the issue. Unless we change that overly political and ideological approach, any significant long-term success through policy initiatives, pilot projects, or even legislation will be very difficult to achieve.
One cannot expect to de-bureaucratize, streamline, or reform the way in which we approach environmental management by using the same top-down approach with which the system was developed and in which EPA has the last word in all decisions relating to changing the system. What is needed is a more shared approach to setting policies, which for lack of a better phrase we will call “democratizing environmental policy.” We believe that America, to attain higher levels of environmental benefit, must begin a process of more fully engaging its elected representatives at all levels of government, its communities, its citizens, and its private sector institutions to help set the national agenda. Expand the number and quality of those setting the agenda, and it will have a better chance of being achieved.
Why? Because the remaining environmental challenges are generally localized and are thus more amenable to local solutions designed for specific sites and situations. Those closest to problems are usually able to better assess the most important issues, balance competing environmental interests, and determine solutions to pursue opportunities that are most meaningful to them.
In practical terms, this means policy imperatives based on the following actions. 1) Engage the main parties in a nationwide debate over agenda setting. How do local and regional goals enter into the big picture? How are resources allocated? 2) Allow the citizens of states and localities to prioritize their problems and identify opportunities, applying resources in the most efficient ways, using flexible, results-oriented approaches. This process should contribute to and in large measure set the national agenda. 3) The electorate should hold state and local leaders responsible for achieving environmental results that are agreed on up front. 4) Allow those closest to the problems to identify opportunities for environmental improvement beyond reducing key pollutants. Such opportunities include the development of new green infrastructure that promotes air and water quality, flora, fauna, and recreation, while allowing for creative and appropriate economic development. 5) Engage society in redirecting and perhaps expanding public and private resources from a variety of sources, existing environmental and other, to address and fund these agenda items. 6) Congress should initiate its own organizational and statutory modifications, maybe even the integrating statute described by NAPA and possibly taking the NEPI approach.
This vision is well suited to the entrepreneurial spirit and ingenuity of the American people, which are crucial to further environmental progress. Most important, it promises constructive improvement over the current system and a way to achieve needed legal and regulatory changes.
The success of environmental regulatory programs initiated over the past three decades is undeniable. Command and control regulation ended the uncontrolled pollution of air, land, and water that was common industrial practice before the 1970s. However, as “Resolving the Paradox of Environmental Protection” illustrates, in the 1990s we have reached the point of diminishing returns from the traditional pollutant-by-pollutant regulatory focus. Broad partnerships that draw on the environmental ethic of citizens and the expertise of the private and nonprofit sectors are the key to achieving a sustainable society.
New Jersey is deeply committed to the National Environmental Performance Partnership System (NEPPS) as the cornerstone of effective environmental partnerships. The NEPPS agreement between the New Jersey Department of Environmental Protection (DEP) and the U.S. EPA is the framework we have needed to work successfully with environmental stakeholders. In New Jersey, the nation’s most densely populated state and among the most intensively industrialized, we face a host of complicated and interrelated environmental challenges. The creativity and commitment of partners are essential to meeting them.
For example, New Jersey is tackling nonpoint-source water pollution through holistic watershed management in partnership with community groups, property owners, businesses, and local governments. We have achieved nearly all that we can through site-specific regulation of point sources. Only by working with the individuals and institutions residing within a watershed will we succeed in uncovering the origins of nonpoint-source pollution and fashioning solutions.
When the New Jersey DEP was created in 1970, the disposal of industrial wastes was utterly unregulated. It made sense then for us to measure our progress by counting the number of permits we wrote, inspections we conducted, and fines we levied and collected. Today those activities remain important, but as measures of environmental health they have little relevance. NEPPS expands our attention from work routines to our fundamental goal of a sustainable society.
The golden rule of NEPPS is that you can only manage what you can measure. In New Jersey we are carefully measuring the current state of the environment, explaining that in terms the public understands, setting improvement goals that are ambitious but achievable, and establishing milestones to measure our progress. Science has long recognized the links between air pollution, land use, water quality, and ecosystem health. NEPPS prompts us to recognize those links when shaping our management strategies. Although the process is still new, already it is obvious that meeting our goals will push us to work across programmatic boundaries within the department while fostering partnerships outside the department. The measure of success for our policies is the quality of our environment.
We increasingly recognize environmental problems not as separate challenges but as parts of a whole. It makes sense that we have begun to see solutions the same way. The New Jersey DEP is part of the solution and EPA is another part, but without partnerships that foster cooperation with a broad spectrum of stakeholders we will have only partial solutions.
Act now to slow climate change
In “Implementing the Kyoto Protocol” (Issues, Spring 1998), Rob Coppock accepts the possibility of a forthcoming significant change in the global climate caused by human emissions of greenhouse gases into the atmosphere. This is consistent with the general consensus of the scientific community that is reflected in the findings of the Intergovernmental Panel on Climate Change (IPCC), which I chaired from 1988 to 1997. This agreement is most welcome.
The key issues are: How soon will this threat be transformed into actual changes in different parts of the world, how serious will the damages be, and what should our response be? Coppock views these issues from an almost exclusively U.S. perspective. The views and attitudes of other countries, both developed and developing, must, however, be carefully considered. We can hardly expect that the global climate change issue can be solved without truly global cooperation. We have to act together, and we have to act now.
We do not know how soon a marked change in climate will occur and how serious the effects will be. Coppock’s view is that the expected damage caused by a doubling of carbon dioxide (CO2) concentrations by the latter part of the next century is not an economic problem, at least for developed countries. His conclusion is not supported by available scientific analyses. Although the changes may be modest, it is equally likely that they will be quite serious. They may occur in many parts of the world, and we do not know which countries will be hit most seriously. We do know, however, that developing countries are more vulnerable than developed countries. Further, because of the inertia of the climate system, the effects of past human activities are not yet fully reflected in the climate. And because of the inertia of the socioeconomic system, protective measures will become effective only slowly.
Coppock admits that “the story is different for developing countries” but minimizes the significance of this by stating that “they already face such daunting problems that the additional challenges imposed by global warming present only a marginal increase.” Of course, war, oppression, and poverty are more serious, but significant climate change will be a considerable impediment to sustainable development in developing countries. Coppock’s perspective also undermines any attempt to address the climate change issue in the cooperative spirit that will be essential to deal with issues such as equity among developed and developing countries. This is obviously of direct interest not only to developing countries but to all countries when addressing climate change.
Developed countries and countries in economic transition such as those in Eastern Europe and the former Soviet Union contribute almost 65 percent of the global emissions of CO2 from burning fossil fuels. They are responsible for more than 80 percent of total emissions since the Industrial Revolution. Average per capita emissions are 6 times as high in developed countries and 10 times as high in the United States as they are in developing countries.
The availability of cheap energy has been of fundamental importance for the expansion of industrial society during the past 150 years. It is not surprising that many developing countries wish to follow a similar development path. Some are now rapidly expanding the use of fossil fuels in their attempts to follow the technological course that developed countries took during the 20th century. Still, most of them lag far behind. These simple facts form the basis for the developing-country position, also supported by the Climate Convention (ratified by the United States in 1992), that developed countries must take the lead by reducing their emissions and must assist developing countries technologically and financially to change their course in due time. In order to develop a long-term strategy, it is obviously important to assess what emissions of CO2 are permissible if we wish to stabilize the concentration of CO2 in the atmosphere at some level and to find some acceptable principle for burden sharing among the countries of the world.
Coppock’s starting point for his analysis is that we might adopt a policy aimed at stabilizing the CO2 concentrations in the atmosphere at a level twice that found in preindustrial times. This is not acceptable. We must face the possibility that more stringent measures may be required. It should also be recalled that other greenhouse gases must be factored into our plans. As the IPCC notes, “The challenge is not to find a policy today for the next 100 years but to select a prudent strategy and to adjust it over time in the light of new information.”
Coppock also argues that attempting to fulfill the obligations prescribed in the Kyoto Protocol would be counterproductive because retrofitting existing machinery and buildings would expend resources that would be better spent on developing a long-term strategy to combat climate change. But when taking into account the projected growth in the world population in the 21st century, it becomes clear that it will be necessary to begin limiting emissions in the next few decades if we want to prevent the concentration of CO2 in the atmosphere from more than doubling toward the end of the next century. Developing countries will in any case not be able to use fossil fuels in the way that developed countries did during the past half century. In order for them to be able to even modestly increase fossil fuel use, developed countries will have to reduce their CO2 emissions.
I propose a two-track process: First, developed countries should take the lead during the next decade to reduce their emissions largely along the lines agreed to in Kyoto. This should not be difficult, because it requires them to hold emissions in 2010 to the level they had already jointly achieved in 1996. Countries that anticipate problems in meeting their quotas have the option of acquiring emissions quotas from other developed countries that expect to be well under their quotas.
In addition, the IPCC has shown that energy efficiency gains of 20 percent or more can be achieved at modest cost and sometimes no cost in many sectors of society. Some such measures can be taken rather quickly, and countries should encourage industry to do so. The European Union and Japan emit about half as much CO2 per capita as does the United States, even though their industrial structures are similar. This indicates that the potential for improvement is large.
Second, as Coppock rightly points out, we need long-term strategies for the period beyond 2010. Such work must not be postponed, and developing countries should participate. As Coppock notes, major improvements in efficiency can be achieved in the paper and pulp industry, the metal casting industry, the building sector, and electric utilities. Very efficient automobiles, which are already under development by industry, will also be part of the answer, but more work on future transport systems will be needed.
In the long term, fossil fuels cannot be the prime source of energy. The major corporations in the energy field have already taken some steps toward developing alternatives, and governments will have to play a major role by supporting R&D and stimulating innovation in other ways.
The effort to limit climate change will have to continue for decades. The Kyoto Protocol took the first steps in the right direction, but it does not do enough to address long-term actions. Strategies will have to be adjusted over time as we learn more. The next comprehensive assessment of the issue prepared by the IPCC will become available in 2001.
I read Byron Swift’s “The Least-Cost Way to Control Climate Change” (Issues, Spring 1998) with great admiration. It is as thorough a summation of the implications of emissions trading for climate change policy as any I have seen.
Swift’s article recognizes both the benefits and the practicalities of trading. Significantly, it acknowledges the dual contribution trading makes, not only in enabling industries to meet their environmental targets at least cost but also in making it possible to afford more emission reductions. This duality of the benefits seems obvious enough when articulated, but surprisingly it has traditionally been overlooked by the advocates of trading, who have focused solely on the benefits to industry. For years, environmental groups, taking the advocates of trading at their word, opposed trading because they believed it held risks and offered no benefits to the environment. Experience with the use of trading systems to reduce the levels of lead and sulfur dioxide (SO2) in the environment should correct that perception. Swift capitalizes on that experience in his analysis.
In contrast, one conceptual issue that has not sorted itself out yet is the notion underlying Swift’s statement that a trading market will set the price of credits or allowances at the marginal cost of control. In this regard, a recent analysis of the SO2 allowance market by Anne Smith (Public Utilities Fortnightly, May 15, 1998) suggests that the price of allowances is currently discounted below their true marginal cost because of regulatory factors. Although people often speak of harnessing the market, which implies that the “invisible hand” takes over from regulatory decisionmakers, the price signals observed in emissions trading markets reflect design features of the program. These are not free markets in the traditional sense but creatures of regulation. That is, they are not market-based programs, as so often described, but market-driven regulations. Emissions trading programs are fundamentally regulatory in nature, because without an enforceable mandate there is nothing to trade. As a result, abstract notions of marginal cost do not easily apply to emissions trading. Incentives in such a market will reflect its design elements, including the stringency of the standard, the allowance distribution, the emissions monitoring protocols, the enforcement mechanism and penalty structure, and the administrative style of the regulatory authorities. All these design factors, rather than an abstract notion of the invisible hand, determine such outcomes as market price, liquidity, contestability, and the extent of violation.
One idea that Swift returns to repeatedly is the notion that trading creates a dynamic approach to emission control. Taken to their full potential, such incentives for innovation could produce an energy revolution comparable to the one that occurred in the late 19th century with the advent of fossil-fuel-fired electricity generation. When climate change regulation is paired with trading, we could actually find ourselves with an improved quality of life as innovations introduce new sustainable energy sources at lower cost than current conventional fossil fuel sources. In contrast, were we to attempt to achieve climate change goals by the command-and-control style of regulation, we would find ourselves boxed in by narrow choices and stifled in any attempt to introduce innovative energy sources.
In climate change negotiations, as was the case with lead and SO2, the diverse interests of the affected parties have created an impasse in negotiating remedies. The lead and SO2 trading programs demonstrated that trading makes possible agreements that would otherwise be unattainable. Indeed, considering the world of differences among the climate change parties, I would say that trading is essential.