Forum – Summer 2002

Electric system security

“Bolstering the Security of the Electric Power System” by Alexander E. Farrell, Lester B. Lave, and Granger Morgan (Issues, Spring 2002) addresses a pressing concern. Electric power plants are among the many potential targets of a terrorist attack. Here in Congress we are passing legislation that increases safety precautions and protection against possible terrorism. Plants that generate the nation’s power supply are an important consideration. The Patriot Act and the Enhanced Border Security Act, as well as increased appropriations for Homeland Security, are significant steps we have taken to help ensure the well-being of the American people and their property against terrorist attack.

Although it is impossible to anticipate every potential danger to our society, we can greatly reduce the chances of future attacks by remaining vigilant in our war on terror. We must not become complacent because of our current success in preventing some attacks, but must press on with our mission to eradicate terrorist cells in this country and elsewhere.

We are also making strides to protect technology sectors from potential terror attacks. I am a cosponsor of the Cyber Security Research and Development Act that went through my Science Subcommittee, which aims to establish new programs and provide more funding for computer and network security R&D. It also provides research fellowships through the National Science Foundation, the National Institute of Standards and Technology, and the Commerce Department.

The effect on the American consumer of guarding against terrorism remains to be seen, but I expect it to encompass all facets of commercial society. The cost of greater security for electric generation will eventually increase not just the cost of power for home consumers but the cost of everything produced.

REP. NICK SMITH

Republican of Michigan


“Bolstering the Security of the Electric Power System” thoughtfully examines the challenges in improving the ability of the nation’s electric power system to withstand terrorist attacks. The authors make a persuasive case for emphasizing survivability over invulnerability as a strategy to protect the electric power system as a whole.

The challenge of ensuring adequate security should be viewed in a broader context than the authors have acknowledged, however. We have limited funds as a society to spend on security, and these assets should be allocated rationally to defend infrastructure of all kinds. The aim should be to develop an integrated and balanced national strategy to protect all sectors of our society, not just the electrical system. That strategy should define an appropriate level of protection and establish the boundaries of the responsibilities of the private sector and local, state, and federal governments. The article, with its focus on the electrical system, does not recognize some of the broader and more fundamental questions that must be addressed.

The authors seem to suggest that a straightforward solution to security concerns is to eliminate “high-hazard facilities,” apparently including dams and nuclear plants. The evaluation of such an approach should appropriately include consideration of the economic, environmental, and other costs. Those placing a higher value on the reduction of greenhouse gases, for example, might not see the authors’ solution as quite so straightforward. And, when viewed in the broader societal context, the authors’ suggestion would presumably imply the elimination of other high-hazard facilities posing similar risks, such as many chemical facilities, petroleum refineries, and the like. This approach has widespread implications for our economy and lifestyle that the authors do not examine.

The authors also assert that adequate institutions for the protection of nuclear power plants have yet to be developed. I believe that the Nuclear Regulatory Commission’s (NRC’s) regulatory oversight processes are adequate to ensure that nuclear plant licensees establish and maintain acceptable levels of site security. Moreover, the authors somewhat mischaracterize the state of nuclear plant security. The Operational Safeguards Response Evaluations conducted by the NRC are force-on-force exercises that are explicitly designed to probe for weaknesses in plant security when the attacking force has complete knowledge of the site defensive strategy. When weaknesses are exposed, NRC licensees are required to take appropriate steps to correct them. The authors’ claim of poor performance thus reflects a common misunderstanding of the purpose and difficulty of these exercises.

RICHARD A. MESERVE

Chairman

U.S. Nuclear Regulatory Commission

Washington, D.C.


Pentagon management

I agree with many of Ivan Eland’s suggestions about running Pentagon acquisition like a business (“Can the Pentagon Be Run Like a Business?,” Issues, Spring 2002). So have numerous Pentagon leaders. For at least the past 30 years, every single secretary of Defense has sought to reform defense acquisition. Why, then, does the Pentagon still not run like a business?

Having observed and participated in defense business for the past 30 years, I offer a one-word answer: mission. The Department of Defense’s (DOD’s) mission is to deter wars and if necessary fight and win them. Efficiency does not appear in this mission statement. As a result, commanders and managers focus on buying the best equipment and making it run well within whatever budget they can justify. That does not mean that these commanders and managers do not care about efficiency. Most are capable, public-spirited individuals who want to give the taxpayers as much defense as possible for their dollars. However, because DOD’s mission does not require efficiency, it inevitably becomes a secondary priority in the crowded schedule of defense leaders. We are currently witnessing this phenomenon in action as senior managers focus intently on the war on terrorism.

Mission is the most important but not the only roadblock to running the Pentagon like a business. There are few incentives to save, because reducing costs often leads to smaller future budgets. Pressure to spend all of one’s budget, in order to establish a base for larger future budgets, also hinders efficiency. Finally, Congress and politics slow or halt efficiencies, though I believe Eland attributes too much of the problem to this factor.

If we wanted a more businesslike defense establishment, we would have to make efficiency part of its mission and measure commanders and managers on their success in business as well as in war. But I would not recommend this approach. DOD’s single-minded focus on winning wars has served our country well many times during its history, including now.

Even if we accept a system that leaves some roadblocks in place, we should not stop trying to create a more businesslike Defense Department. In an organization the size of DOD, even small efficiencies can yield large dividends for the taxpayer. Eland’s various suggestions deserve careful consideration (or, more often, reconsideration). For example, we should follow Eland’s suggestions and strive for more use of commercial specifications and continue to work to engender competition. In a few cases, however, Eland does not acknowledge the disadvantages of his proposals. It is not clear, for example, that accepting monopolies by ending teaming relationships will hold down costs over the long run.

Whatever we do, we should remember that the Pentagon’s mission does not include efficiency. Our motto should be: Keep trying, but be realistic.

ROBERT HALE

Logistics Management Institute

Washington, D.C.

Hale has served as the Air Force comptroller and also directed the National Security Division of the Congressional Budget Office.


Ivan Eland has reported one of my recommendations for improving incentives for defense contractors. Although I do indeed recommend that defense R&D contracts be allowed to be more profitable, the emphasis should not be “upon completion” as Eland has written. Instead, the emphasis should be on allowing profits for doing defense R&D, period, whether upon completion or not.

As Eland notes, too often defense contractors take short cuts in R&D to get into production, because production is profitable and R&D is not. The result is that problems that could have been solved earlier in development can plague the system–and U.S. troops overseas–for decades. This adds to life-cycle costs for logistics, maintenance, repair, and workarounds. It also can leave our troops in the field with poorer capabilities than they and American taxpayers had been led to expect. In recent years, 80 percent of new U.S. Army systems did not achieve 50 percent of their required reliability in realistic operational testing. The Army has been working to correct this situation, but it is a direct result of the reversed incentives in defense contracting.

The Navy and the Air Force have had their own difficulties. In recent years, 66 percent of Air Force systems had to halt operational testing because they were not ready. In 1992, only 58 percent of Navy systems undergoing operational testing were successful.

Defense industry responds to incentives, and if the incentives reward the development of good reliable equipment for our military, industry will respond. If, on the other hand, the incentives are to get into production as soon as possible, U.S. troops can end up with unreliable or even dangerous equipment. The V-22 Osprey with its record of poor reliability and fatal crashes is a case in point.

When commercial industry produces something that doesn’t work, or has to be recalled, the consumer simply stops buying it. The company can fail and go out of business.

But in defense contracting, the government is the customer and often isn’t willing to stop buying a poor product or let the company fail. Too many jobs and other constituent interests are at stake, and the “can-do” attitude that we admire in our military usually takes over to work around the difficulties. The cynical expression “close enough for government work” has its origins in this situation.

So unless the politics of defense contracting can be changed to match those of commercial contracting, and I doubt they can be, the Pentagon simply cannot be run like a commercial business. What we can do is work to improve the efficiency of Pentagon business processes and to reduce costs, and the military services try to do this every day.

In the long run, it will be more effective to change the incentives for the defense industry so that its well-being is tied more to the quality of its defense products than to the quantities produced.

PHILIP E. COYLE

Los Angeles, California

Coyle was assistant secretary of Defense and director of Operational Test and Evaluation from 1994 to 2001.


What broadband needs

As Adam D. Thierer points out (“Solving the Broadband Paradox,” Issues, Spring 2002), “the public has not yet caught broadband fever.” This should not be surprising. The rates of adoption of dial-up Internet access, as well as the utilization patterns of data networks, proved many years ago that the “insatiable demand for bandwidth” was a myth. New products and services take time to diffuse widely. Today, when offered the choice, most people vote with their pocketbooks for extremely narrowband wireless phones over comparably priced DSL or cable modem links.

Although mobility currently trumps broadband in the market, that may not persist forever. Adoption rates for broadband, although disappointing by the expectations of Internet time, are high, higher than those of cell phones at a comparable stage in the development of the wireless industry. The question is whether we should strive to increase these rates, and if so, how to do it.

Thierer dismisses “spending initiatives or subsidization efforts” as “unlikely to stimulate much broadband deployment.” That is surely incorrect. As the example of South Korea (with over 50 percent broadband penetration) shows, lower prices can do wonders for demand, and some “spending initiatives or subsidization efforts,” if well targeted, might lower prices in the United States. However, Thierer is probably right that it would be unwise to make giant investments of public money in this area, where technology and markets are changing very rapidly.

Thierer’s main prescription is to deregulate the Baby Bells. In the interests of brevity, I will not discuss the reasons why I feel this would have perverse effects. Instead, let me suggest three other methods for stimulating broadband: one intriguing but totally impractical, one very practical but incremental, and one speculative.

The impractical method for stimulating broadband adoption is to make music free on the Internet. As Thierer notes, Napster and its cognates have been among the main reasons people buy broadband connectivity. Instead of using the law to choke file swapping, perhaps we should encourage the telecom industry to buy off the music studios. Recorded music sales in the United States total about $15 billion per year, whereas telecom spending is over 20 times higher. Thus, in the abstract, it might be a wise investment for the phone companies to buy out the studios. This is of course wildly impractical for business and legal reasons, but it would quickly stimulate demand for broadband. (It would also demonstrate that the content tail should not be wagging the telecom dog, as it too often does in political, legal, and business discussions.)

A more practical method for stimulating broadband is to encourage migration of voice calls to cell phones (which currently carry well under 20 percent of total voice traffic). This would force the Baby Bells to utilize the competitive advantage of wired links by pushing broadband connectivity. This migration could be speeded up by forcing the Baby Bells to spin off their wireless subsidiaries, and by making more spectrum available for cell phones.

The third technique for stimulating broadband is to encourage innovative new wireless technologies, such as those using the unlicensed bands (as in 802.11b, aka Wi-Fi) and Ultra Wide Band. The technical and economic feasibility of these technologies for providing connectivity on a large scale is unproven as yet. However, if they do work, they might offer a new mode of operation, with most of the infrastructure owned and operated by users.

ANDREW ODLYZKO

Digital Technology Center

University of Minnesota

Minneapolis, Minnesota


I commend Adam D. Thierer for his illuminating article. The debate about how the United States carries out its most ambitious national infrastructure build-out since the interstate highway system is deeply complex. Fortunately, Thierer makes a clear, compelling case for just how much is on the line for the nation.

I couldn’t agree more with his central thesis that “FCC regulations are stuck in a regulatory time warp that lags behind current market realities by several decades . . . and betray the cardinal tenet of U.S. jurisprudence that everyone deserves equal treatment under the law.”

His explanation of the “radically different regulatory paradigms” that govern competing broadband platforms correctly notes that cable and wireless high-speed offerings are virtually regulation-free, while the comparable service of phone companies, DSL, is mired in heavy-handed rules written for voice services. Because DSL is singled out for an avalanche of regulations (including requirements that we share our infrastructure with competitors at below-cost prices), phone companies are deterred from investing aggressively in a truly national 21st-century Internet.

We have the opportunity today to end this separate and unequal treatment if Washington chooses wisely between two broadband proposals currently under consideration in the U.S. Senate. The first is a throwback to the past. With the nation in the grip of recession, Sen. Ernest Hollings (D-S.C.) suggests a multibillion-dollar big-government program. Adamantly opposed to equal regulatory treatment for phone companies, his proposal protects the current regulatory disparity, opting instead to subsidize state-sponsored telecom networks, something Thierer rightly warns is unlikely to keep the United States on technology’s leading edge.

Fortunately for taxpayers, the forward-thinking alternative, offered by Sens. John Breaux (D-La.) and Don Nickles (R-Okla.), would not cost the U.S. government a dime. The Broadband Regulatory Parity Act simply guarantees DSL the same minimal regulatory treatment as cable and wireless high-speed offerings. By ensuring equitable treatment of all broadband investments, this bill would encourage businesses, rather than taxpayers, to aggressively finance the fulfillment of America’s “need for speed.”

It’s a straightforward solution. And, as Thierer points out, its opponents are largely companies that “prefer not to compete.” Although I understand these companies’ desire to maintain their unfair advantage, certainly the nation has a strong interest in seeing the maximum number of companies and platforms vying for customers and investing rapidly in our broadband future.

Given the urgent need for Washington to acknowledge the importance of basic regulatory fairness, I truly appreciate Thierer’s cogent explanation of how a technology-neutral broadband policy will benefit not merely local phone companies but consumers and the U.S. economy. I hope that, by raising awareness, this article will intensify the pressure to deliver what all companies in competitive markets deserve: equal treatment from their government.

WALTER B. McCORMICK, JR.

President and CEO

U.S. Telecom Association

Washington, D.C.


Thirty years ago, the Federal Communications Commission (FCC) decided to require telephone companies to make their networks available to computer and data services on a nondiscriminatory basis. Fifteen years later, regulated open access interacted with the open architecture of the Internet to create the most dynamically innovative and consumer-friendly environment for information production in the nation’s history.

Over the course of about a decade, a string of innovations–the Web, Web browsers, search engines, e-mail, chat, instant messaging, file sharing, and streaming audio–fueled consumer demand for dial-up Internet connections. Half of all households now have the Internet at home. Unfortunately, the broadband Internet has not provided this same open environment. Cable companies have been allowed to bring their closed proprietary model from the video market into the advanced telecommunications market. In response, telephone companies have resisted the obligation to keep the high-speed part of their networks open. Both cable and telephone companies have a strong interest in slowing the flow of innovation, because they have market power over core products to protect. Both price the service far above costs, which starves new services of resources.

Cable companies, which have a 75 percent market share in the advanced service market for residential customers, do not want any form of serious competition for their video monopoly. They lock out streaming video and refuse to allow unaffiliated Internet service providers to exploit the advanced telecommunications capabilities of the network for new services. Telephone companies, which have a 90 percent market share in the business market, do not want competitors stealing their high-volume customers, so they make it hard for competitors to interconnect with their networks.

Closed networks undermine incentives and drive away innovators and entrepreneurs. In 1996, there were 15 million Internet subscribers in this country and over 2000 ISPs, or about 15 narrowband ISPs for every 100,000 subscribers. Today, with about 10 million broadband subscribers, there are fewer than 200 ISPs serving this market, or about 2 per 100,000 subscribers. In the half decade since high-speed Internet became available to the public, there has not been one major application developed that exploits its unique functionality.

With high prices and few innovative services available, adoption lags. About 85 percent of American households could get high-speed Internet, but only 10 percent do. Since broadband came on the scene, narrowband has added about three times as many subscribers. This is the result of closed or near-closed systems where market power is used to keep prices up and control innovation.

For two centuries, this country has treated the means of communications and commerce as infrastructure, not just a market commodity. A cornerstone of our open economy and democratic society has been to require that roads, canals, railroads, the telegraph, and telephone be available on a nondiscriminatory basis, while we strive to make them accessible to all our consumers and citizens. As digital convergence increases the importance of information flow, we are making a huge public policy mistake by allowing these vital communications networks to be operated as private toll roads that go where the owners want and allow only the traffic that maximizes the gatekeeper’s profits.

MARK COOPER

Director of Research

Consumer Federation of America

Washington, D.C.


Engineering education

Wm. A. Wulf and George M. C. Fisher make a wide range of excellent points in “A Makeover for Engineering Education” (Issues, Spring 2002). There is an increasing need for engineers to be diversely educated. Curricula must be broadened to include knowledge of environmental and global issues, as well as business contexts for design. And, of course, lifelong learning is also crucial.

However, meeting the authors’ goal of increasing the number of engineering graduates–at least from highly competitive universities offering engineering programs–will be extremely difficult. Capacity at almost all of these institutions is limited to the current output, although some growth potential does exist at new engineering programs or smaller schools.

The limited number of available spaces for new engineering students could be increased if more U.S. universities offered engineering curricula. Only a small fraction of U.S. universities now have engineering degree programs. Today’s engineering programs are resource-intensive, requiring more laboratory work and more credit hours than a typical undergraduate degree. Currently, a “four-year” engineering degree takes an average of 4.7 years to complete. This intensity increases the cost of running an engineering program and makes initiating new engineering programs in our universities difficult. It also acts as a deterrent to potential students who are not willing to narrow their undergraduate experience to the degree that current engineering programs require. Yet many engineering graduates go on to careers in sales, business, or other non-engineering job categories; these graduates benefit from their engineering degree but do not need the intensity or detailed disciplinary training provided by today’s engineering curricula.

Almost all universities with engineering programs have sought or will seek accreditation by the Accreditation Board for Engineering and Technology (ABET) for at least some of their programs. I propose that we encourage universities currently without engineering programs to consider creating a “liberal arts” engineering curriculum not designed to be ABET-accredited. Such a curriculum would allow for a broader range of non-engineering topics to be studied. Being less engineering-intense, this curriculum could be structured to be completed in an actual four years. It would be aimed at students interested in such activities as technical sales or those who plan on a corporate leadership path in technical firms. An understanding of technology and engineering would be of substantial importance, but with less emphasis on the ability to design or create engineering products. Graduates from these programs who later decide they wish to pursue a more technically oriented career could go on to get a master’s degree at a fully accredited engineering college. Also, if demand for engineers (and engineering salaries) increased, there would be a pool of graduates who could become practicing engineers with a relatively brief period of additional study.

The benefits of this new class of engineering programs would likely include attracting a much wider group of students into the engineering world, such as students who would reject today’s engineering programs as being too narrow and intense. It also would provide a pool of “almost engineers” who could, within a year or so, become full-fledged engineers.

FRANK L. HUBAND

Executive Director

American Society for Engineering Education

Washington, D.C.


I concur with Wm. A. Wulf and George M. C. Fisher’s underlying premise that engineering education needs to be reformed to respond to the 21st-century workplace.

We know that businesses are demanding engineers with broader skills, including the ability to communicate effectively and to work as part of a team. Surveys of companies employing engineers reveal that although new engineering graduates are well trained in their discipline, they often are not fully prepared for the business environment. Employers tell us they would like to see greater emphasis on teamwork, project-based learning, and entrepreneurial thinking.

The schools are beginning to rethink and reorganize their curricula, though perhaps not as quickly as Wulf and Fisher would like to see. For example, chemical engineering students at Virginia Commonwealth University must have experience in industry before graduating, and Johns Hopkins University students must complete at least 18 credits in the humanities or social sciences.

I can assure your readers that the professional societies are also responding to the changing world of engineering. The National Society of Professional Engineers (NSPE) has developed several resources for students and young engineers, including discussion forums, information links, and education programs, that are available online at www.nspe.org. We have created programs through each of our five practice divisions: construction, education, industry, government, and private practice. For example, through the Professional Engineers in Construction mentoring program, young engineers are introduced to licensed construction professionals and given practical guidance on acceptable construction practices.

The role of engineers in our society and their impact on our daily lives are constantly evolving. Engineers improve our quality of life, and engineering education is the foundation of the engineering profession. By providing greater opportunity for innovation and experimentation in engineering education, we can be assured that tomorrow’s engineers will have the skills needed to meet the demands of the evolving world of engineering.

DANIEL D. CLINTON, JR.

President

National Society of Professional Engineers

Alexandria, Virginia


Environmental policy for developing countries

I was delighted to read “Environmental Policy for Developing Countries,” by Ruth Greenspan Bell and Clifford Russell (Issues, Spring 2002). I have long felt that decisionmakers for development assistance should look to market mechanisms as an essential ingredient of development strategies, certainly including the management of environment issues. But I’ve also learned to be very wary of formulas that purport to be panaceas for issues as complex as economic and social development. (I was significantly involved in earlier panacea-seeking: the basic human needs thrust of the early 1970s, appropriate technology later in that same decade, sustainable development in the 1990s, etc.) Reliance on market mechanisms for environmental protection risks a fate similar to that of those earlier fads, at great cost to the need for the ultimate panacea: a wise combination of approaches, including market mechanisms, that fit individual countries’ particular needs and political realities. Bell and Russell have it just right.

I was particularly pleased with the analytical frameworks Bell and Russell use to begin to answer their fundamental question: “What have we learned about the conditions necessary for effective market-based policies?” Not surprisingly, given their own experience, their focus is primarily on the transition countries of East and Central Europe and the former Soviet Union, but their analysis would have been even more powerful had they cited examples of “bone-deep understanding of markets,” “ensuring integrity,” or “genuine monitoring” in countries with even less developed markets, such as Ghana, Bolivia, or Nepal–let alone Honduras or Burkina Faso.

I also wish that the authors had given the distinguished Theo Panayotou from Harvard an opportunity to speak for himself. For instance, since most of his work and success have come from efforts in countries with particularly well-developed market economies, such as Thailand and Costa Rica (if I recall correctly), I strongly doubt that he would advocate singular reliance on market mechanisms for environmental management in Honduras or Burkina Faso.

But these are minor nits to pick. Bell and Russell have made an important and eminently sensible contribution to the policy debate about environmental policy and management in developing countries. I hope our policymakers will pay attention.

THOMAS H. FOX

Washington, D.C.

Fox is a former assistant administrator for policy and program coordination of the U.S. Agency for International Development.


Russell and Bell provide some valuable insights into the tiered approach to encouraging the use of market-based instruments (MBIs) in environmental management in developing countries. The development community, donor organizations, and policymakers could certainly benefit from them. The authors make the case for the need for tailored approaches to environmental management under different conditions in various countries, but they generalize many other concepts without a thorough assessment of the specific experience with MBIs in different countries. In doing so, they contradict the main point they intend to make.

The article starts out by giving the distinct impression that MBIs are equated with emissions trading. Further, it implies that donor organizations push emissions trading and fail, but the article does not mention the other efforts of the countries themselves or of donor organizations to promote other MBIs in developing countries. It is not quite clear whether the authors’ complaint is about MBIs in general or complexity in emissions trading in particular. In some places, the article almost seems to conclude that command-and-control systems are superior to MBIs, and that there is no need to try sophisticated methods such as MBIs for environmental management in developing countries, where even the regular market does not exist.

The article also gives the impression that there have been no successful applications of MBIs in the developing countries and even doubts that the experience with them in the developed world is sufficient to draw convincing conclusions. In fact, we have seen very successful localized applications of MBIs. The article agrees that there could be such cases, but unfortunately, it does not classify them as successes because of their small or localized nature. This does not tally with the statement that donor organizations do not currently recognize the variations in conditions that should be considered to promote successful MBIs. Experience shows that this variability occurs not only from country to country but even from place to place or region to region within one country. Understanding of such diversity is the reason for the localized successes in the use of MBIs.

I think that not all activities of donor organizations can be seen by an external reviewer. Much study has been devoted to reviewing past experience and current situations. For example, the Asian Development Bank (ADB) stresses capacity building in its projects and tackles institutional issues in its policy dialogue with governments. Pilot projects are promoted before wide-scale implementation in order to avoid wasting resources. ADB’s approach to environmental work in many developing member countries is a case in point. On average, from 1991 to 2001, ADB provided lending support for about $764 million worth of environmental improvement projects in its developing member countries. Environmental management strategies and policies promoted in these projects are a mix of command-and-control approaches and MBIs. ADB is currently supporting the pilot testing of air emissions trading through technical assistance grants as part of environmental improvement projects. ADB does not push emissions trading within countries other than by providing information and expertise in understanding the merits of such MBIs. It does provide technical assistance for examining the whole breadth of existing MBIs and the potentials for expansion.

Again, let me say that the main point of the article has many merits and holds valuable suggestions that can shape the future promotion of MBIs. However, it clearly underestimates the wide experience with MBIs, particularly in Asia. Much more patience is needed to digest the worldwide experience on the subject.

PIYA ABEYGUNAWARDENA

Manila, Philippines


The car of the future

“Updating Automotive Research” by Daniel Sperling is insightful and timely (Issues, Spring 2002). In connecting the government’s earlier attempts to improve the efficiency of personal vehicles through the Partnership for a New Generation of Vehicles (PNGV) to the recently announced FreedomCAR, the author raises important questions regarding the effectiveness of the policies behind both initiatives.

Although PNGV is now history, a debate continues as to what, exactly, it was and what it accomplished. Sperling’s account is faithful on both counts. PNGV was morphed into FreedomCAR in March of this year. At that point, PNGV had run for about seven and a half years of its projected 10-year lifetime. It was increasingly apparent that the technologies considered necessary to create a production prototype of an 80-mile-per-gallon sedan had been identified and developed to the point where the dominant remaining issue was their affordability. Spending taxpayer money to improve the affordability of automobile components was increasingly hard to justify.

So the “sunsetting” of PNGV was entirely logical. On the other hand, the reluctance of the Big Three auto manufacturers to commit to a car that incorporated its technologies was a disappointment to all who participated in the program, especially since some Japanese car manufacturers offered cars with such technologies.

In recent years, PNGV support for fuel cell-related technologies increased significantly. Although there is currently much publicity regarding fuel cell vehicles, it is important to note that they are at a Model T stage. As pointed out by Sperling, there are many challenging technology issues to resolve before we will see many of these vehicles on the road. Therefore, the renaming of the government’s automotive technology program and the sharpening of its focus on fuel cell research are very much in keeping with the government’s role in pursuing long-range, more fundamental research.

What is needed that the government isn’t doing under the FreedomCAR program? To Sperling’s list, which I endorse, I add an extensive field evaluation program of promising technologies. This could be done within the government’s annual vehicle procurement program. The various fuel cell vehicle types could be assigned to federal facilities, national labs, military installations, etc., to be evaluated under controlled conditions. The experience gained would, among other things, reassure the buying public that the technical risk of fuel cell vehicles had been thoroughly evaluated and minimized.

One quibble with Sperling’s paper: He states that “PNGV was managed by an elaborate [emphasis mine] federation of committees . . .” Anyone involved in PNGV would question whether it was managed from above; guided, maybe. The method of operation was to decentralize to the working level, which was a dozen or so Technical Teams maintained by USCAR, the industry coordinating organization. Management meetings were infrequent, supplemented by telephone conference calls. There were seven government agencies participating in PNGV. This allowed industry access to a wide range of technologies. FreedomCAR, in contrast, is supported by a single agency, the Department of Energy (DOE). If a single agency is to be selected, DOE is certainly the most appropriate. However, there is still substantial ongoing research within other government agencies that might advance FreedomCAR’s goals. But this would call for the Bush administration to exert leadership at its highest levels, which it is apparently loath to do.

ROBERT M. CHAPMAN

Consultant

The RAND Corporation

Arlington, Virginia


General Motors (GM) agrees with Daniel Sperling’s characterization of the Department of Energy’s FreedomCAR initiative as “a fruitful redirection of federal R&D policy and a positive, albeit first step toward the hydrogen economy.”

We’re excited about FreedomCAR because it should, over time, help harness and focus the resources of the national labs, U.S. industry, and universities to support the development and commercialization of fuel cell vehicles. Shifting to a hydrogen-based economy is a huge undertaking. Sperling correctly points out that government will continue to have an important role, as will the energy companies and automakers.

As the world’s largest automaker, GM takes its role in this endeavor very seriously. We know cars and trucks. We know how to design, develop, build, and sell them. And the automotive industry contributes significantly to the global economy.

Sperling’s article concludes with several good suggestions for hastening the day when fuel cell vehicles are regular sights on the nation’s roads and highways. Although his specific recommendations dealt with funding issues for key stakeholders in developing the technology, many of the practices the money would support and other ideas to which Sperling referred are already in place at GM.

GM has focused intently on the fundamental science of fuel cells and has invested hundreds of millions of dollars in fuel cell research, because we believe that there are certain technologies that we must own in order to control our destiny. We also are working in partnership with other automakers and have developed key alliances with innovative technology companies, including General Hydrogen, Giner Electrochemical Systems L.L.C., Quantum Technologies Worldwide, Inc., and Hydrogenics Corp. In addition, we are working with dozens of other suppliers on various fuel cell components.

GM has also engaged the energy companies in developing gasoline-reforming technology for fuel cell applications. A reformer extracts hydrogen from hydrocarbons, such as gasoline and natural gas, to feed the fuel cell stack. Our North American “Well-to-Wheels” study conducted with ExxonMobil, BP, Shell, and the Argonne National Laboratory showed that reforming clean gasoline either onboard a vehicle or at gasoline stations can result in significantly lower carbon dioxide emissions. We’re working with energy companies to develop this concept as a bridging strategy until a hydrogen refueling infrastructure can be developed. We are also pursuing stationary applications for our fuel cell technology to provide clean, reliable electricity for businesses while increasing our cycles of learning.

In July 2002, GM will open a new 80,000-square-foot process development center at our fuel cell research campus in upstate New York. The facility, which will be staffed with up to 100 employees, will allow us to determine the materials and processes necessary to mass-produce fuel cells.

Commercialization is well within sight, even though much R&D remains. GM is working hard on our own fuel cell program, and we also fully support a broad-based public policy strategy to accelerate the industry’s progress along this exciting path.

LAWRENCE D. BURNS

Vice President

General Motors Research & Development and Planning

Warren, Michigan


Daniel Sperling succinctly summarizes and puts into perspective the developments in the United States, Europe, and Japan that have led to FreedomCAR. This is indeed a positive first of many more steps that will be needed in the long march to a sustainable energy economy. One might have expected such moves from a Democratic administration, not from a White House run by a conservative Texan from a political party that in the past has been closely identified with Big Oil and other fossil energy sources, and with long-standing hostility to the intertwined issues of man-made greenhouse gases and global warming.

It has been speculated that Energy Secretary Spencer Abraham, a former senator from Michigan with close contacts to Detroit’s auto industry, was persuaded to embark on this hydrogen initiative by the automakers, who over the past decade have invested billions of dollars in hydrogen and fuel cell R&D. He also may have had advice from Robert Walker, reportedly a friend of Abraham’s, a former Republican congressman from Pennsylvania, former chairman of the U.S. House of Representatives’ Science Committee, and for years the only visibly vocal Republican hydrogen champion in Congress (Walker authored early key legislation, the 1995 Hydrogen Future Act). (To be fair, other than the late George Brown Jr. in the House and Sen. Tom Harkin in the Senate, there weren’t that many outspoken Democratic hydrogen supporters either.)

As to the Partnership for a New Generation of Vehicles’ (PNGV’s) “boomerang effect” on the foreign competition, it is not clear to me that, in the case of Daimler-Benz at least, this was in fact the motivating factor for the company to start its fuel cell program, which, as Sperling points out correctly, spawned the major efforts by GM and Toyota and eventually by most other carmakers. PNGV and the Daimler-Benz/Ballard venture got underway almost in parallel: The Daimler-Benz/Ballard pact was first reported in May 1993, and the formation of PNGV was announced four months later. Rather, it looked more like a logical relaunch of the company’s foray into hydrogen that began in the 1970s. The first hydrogen-powered Daimler-Benz internal combustion-engined minivan was shown at the 1975 Frankfurt Auto Show. These efforts reached a peak of sorts with a four-year test of 10 dual-fuel (hydrogen and gasoline) internal combustion vehicles in West Berlin that ended in 1988 after 160,000 miles.

Nor am I sure that Sperling is on the mark about automakers’ reluctance to expand industry engagement to energy companies. In its press releases, GM routinely points to joint research with energy companies, and both DaimlerChrysler and Volkswagen recently announced pacts with chemical process companies to develop clean liquid designer fuels for fuel cell vehicles. Conversely, most large oil companies have set up divisions (Shell Hydrogen is one example) to work on fuel cells and hydrogen.

But these are minor quibbles. Sperling is correct in his call for government to play an important role in commercializing fuel cells: The removal of institutional barriers, incentives such as tax breaks and eased environmental rules for zero-emission vehicles and facilities, and government purchases of hydrogen/fuel cell vehicle fleets come to mind. Also, government–national, state, local, and regional–must assist in setting up a fueling infrastructure, something that is now getting started: The Department of Energy is creating regional stakeholder groups to come up with recommendations, and California’s South Coast Air Quality Management District has drawn up plans for an initial small string of hydrogen fueling stations in the Los Angeles Basin.

Overall, Sperling is to be commended for pulling together and presenting a number of critical issues affecting this momentous shift in not only America’s but the world’s energy systems; after all, it’s not American warming, or Japanese warming, or European warming but Global warming that we’re fighting.

PETER HOFFMANN

Editor and publisher

The Hydrogen & Fuel Cell Letter

Rhinecliff, New York


Coral reef pharmacopeia

Bravo to Andrew Bruckner for providing a balanced and accurate assessment of the enormous biomedical resources that can be derived from the unique life forms found on coral reefs (“Life-Saving Products from Coral Reefs,” Issues, Spring 2002). His article calls for increased attention to the development of marine biotechnology within the United States and, rightly, comments further on the issues of management and conservation of these highly diverse, genetically unique resources. Although U.S. funding agencies have not heavily invested in marine biomedicine, arguably U.S. scientists have nonetheless consistently remained at the forefront of this science. The difficulty has been a lack of programs that link marine exploration and discovery with the significant experience and financial resources needed to develop drugs. Because of this, it can be confidently estimated that less than 5 percent of the more than 10,000 chemical compounds isolated from marine organisms have been broadly evaluated for their biomedical properties.

Why is this incredible resource not being used? For complex, valid reasons, the pharmaceutical industry has turned its attention to more secure and controllable sources of chemical diversity, such as combinatorial and targeted synthesis. Traditional studies of natural products, although recognized as a source of drugs, require extra time for collection, extraction, purification, and compound identification. The intensity of competition in the pharmaceutical industry has created the need for very streamlined discovery processes to maintain the current rate of new drug introduction. Nonetheless, marine-derived drugs are indeed entering the development process.

The role of American and international universities in drug discovery has been steadily increasing for the past two decades. More and more, the pharmaceutical industry is licensing academic discoveries. Academic scientists with an understanding of the world’s oceans, and with sufficient expertise in chemistry and pharmacology, have explored coral reefs worldwide, yielding many drug discoveries that are now in clinical and preclinical development. U.S. funding agencies have played a major role in stimulating these activities. The National Sea Grant Program (U.S. Department of Commerce) and the National Cancer Institute [National Institutes of Health (NIH)] have, for more than 25 years, played major roles in supporting marine drug discovery and development. One of the most creative and successful programs is the National Cancer Institute’s National Cooperative Natural Products Drug Discovery Groups (NCNPDDG) program. This program provides research funds for the establishment of cooperative groups consisting of marine scientists and pharmacologists, and it also includes the participation of pharmaceutical companies. By its design, it creates bridges between scientific disciplines and provides for the translation of fundamental discoveries directly to the drug development process. This program, which has both terrestrial and marine natural products components, is one of the most successful and productive efforts I have observed.

There are new initiatives on the horizon as well. In a recent development, the Ocean Sciences Division of the National Science Foundation and the National Institute of Environmental Health Sciences at NIH have formed an interagency alliance to create a multifaceted national initiative to focus on the “Oceans and Human Health.” This program seeks to create centers of expertise dedicated to understanding the complexities of linking oceans and their resources to new challenges in preserving human health.

Clearly, these activities will increase the U.S. investment in marine biomedicine and biotechnology. But is this sufficient to realize the enormous potential of the world’s oceans? Probably not, but it is a great start toward that end. The ocean and its complex life forms are our last great resource. To overlook the medical advances to be found there would be unwise.

WILLIAM FENICAL

Professor of Oceanography

Scripps Institution of Oceanography

University of California, San Diego


Vol. XVIII, No. 4, Summer 2002