Forum

In defense of defense spending

CHARLES V. PEÑA’S “A REALITY CHECK on Military Spending” (Issues, Summer 2005) falls short of confronting the broad challenge of how the United States might best engage on the issues of global security. His approach seems almost to be a casual excuse for cutting the budget: settle for playing the role of a “balancer of last resort” and let others “take greater responsibility for their own regional security.”

His defense budget strategy would not create an alternative paradigm for security that the rest of the world could live with. His abrupt reduction of U.S. forces and overseas deployments would only result in eventual challenges to U.S. security, with no institutions or capabilities in place to counter them, ensuring regional and global security chaos.

Three pieces are missing from Peña’s vision. The first involves the need to put in place a regional and global security architecture that would ensure stability, peaceful transitions, and the ability to confront danger, allowing the United States to play a more restrained role. European militaries are working toward, but still fall short of, assuming a security role that could eliminate the need for U.S. forces.

Africa lacks the institutions and capabilities to ensure regional security and will need considerable outside help. There is no Middle Eastern, Northeast Asian, or Southeast Asian security arrangement like NATO, and only a few bilateral agreements in which the United States plays a role (such as with Australia, South Korea, and Thailand). Taiwan has no other security guarantor but the United States and will not accept a regional alternative.

Second, Peña comes up short in describing what the U.S. military’s role should be in dealing with the major global challenges that the United States and others face: terror, proliferation, and instability in failed states. U.S. forces are poorly trained for these missions, yet as Peña recognizes, they are missions for which forces are needed. Here he contradicts himself—these missions are global in scope, not purely regional, but he wants U.S. forces to withdraw from a global presence.

Third, Peña leaves undiscussed how the entire tool kit of statecraft and allied relationships might be used to deal with the security dilemmas the world faces today, dilemmas that underlie terror and proliferation: the global poverty gap; the need for stable, effective, and responsive governance in vast regions of the globe; the raging conflicts of ethnicity and belief that inflame current tensions; and achieving an affordable, secure energy supply.

Peña collapses one security tool—the military—but offers no security vision that addresses these dilemmas with an integrated set of other tools: foreign assistance, diplomacy, public diplomacy, and allied cooperation. Without such an integrated strategy, eliminating the U.S. military just pours fuel on the fire.

GORDON ADAMS

Director, Security Policy Studies

Elliott School of International Affairs

The George Washington University

Washington, DC


CHARLES V. PEÑA NOTES, correctly, that “ever-increasing defense spending is being justified as necessary to fight the war on terrorism.” Then, however, instead of trying to correct that erroneous justification, he falls into the common trap of wanting to design U.S. armed forces to meet only that most imminent of threats to U.S. and allied national security, without regard to other long-term risks to that security and the military requirements for sustaining it. His article consequently contains some serious errors in strategic reasoning as well as some technically incorrect statements about military systems currently in acquisition.


To deal with the latter first:

Peña says that the F-15 Eagle is not challengeable by any potential adversary. However, the Russian Sukhoi Su-30 has similar performance, with the additional maneuverability advantage of vectored thrust, so that superiority in air combat against such aircraft will depend to a great extent on pilot proficiency, tactics, and the quality of the air-to-air weapons, in which the United States will not necessarily be superior. This was demonstrated in recent mock combat exercises, when an Indian Air Force contingent including Su-30s and other Russian and French aircraft “defeated” a force of F-15s. The Russian and French aircraft and their missiles are for sale to any willing buyer.

He says the F-22 Raptor was designed for air superiority against Soviet fighters, but that is only partly true. It is also designed to be better able than current fighters to penetrate Russian ground-based air defense systems that are also for sale on world markets.

He says, incorrectly, that helicopters can perform the same mission as the incoming V-22 Osprey. The Navy’s CH-53 Sea Stallion helicopter in its various versions, which can carry about the same payload as the V-22, has significantly less range and flies at a much slower speed than the V-22. It has less ability to penetrate to the depths that may be necessary in future regional conflicts along the Eurasian periphery, and greater vulnerability to shoulder-fired antiaircraft missiles. In addition, it is an old aircraft and a large maintenance burden to the naval forces that use it. Improving performance and reducing maintenance cost are major reasons why systems are replaced, in civilian as well as military life.

He says the Navy’s F/A-18 Super Hornet is an “unneeded” tactical fighter. It, too, however, has significantly more range and combat capability than the F-18C/D that it is replacing. This difference appears in many aspects of total system design and cost, including, for example, the need for less tanker capacity to help the aircraft penetrate to distances such as those from the Indian Ocean to Afghanistan, as was necessary during the campaign to eliminate al Qaeda’s established presence in that country. The bomber force alone couldn’t carry that whole task, because its sortie rate is much lower than that of the carrier attack force.

Finally, he says that the Virginia-class submarine is no longer needed because the Soviet submarine threat has gone away. This disregards the facts that China has a significant number of nuclear attack submarines and that quiet conventionally powered submarines are proliferating in waters that U.S. and allied shipping and naval forces will have to transit in any Eurasian regional conflict.

In the strategic area, Peña proposes that we cut our armed forces in half and drop back to a strategy of letting our allies or other regional powers handle conflicts as they arise, with our forces on call if help is needed. This seems to neglect the fact that our ground forces are already stretched thin by operations in Iraq and Afghanistan, to which we are committed for an indefinite period. Where would the additional forces come from if we were called on to help in another regional conflict? Such conflicts are possible in Korea or over Taiwan, or even with Iran or Pakistan, should internal events in those countries turn them into regional antagonists threatening U.S. allies or other countries (such as Taiwan) over which we have extended our protective umbrella.

Peña makes light of the contribution that our 31,000 troops in the Republic of Korea (ROK) make to that country’s defense. He notes that the 700,000-strong ROK army should be sufficient to defend against the million-man North Korean armed forces. This neglects the fact that the North Koreans once before demonstrated powerful fighting capability, and the possibility that scarce resources in that economically deprived country may be deflected from supporting the civilian population to keeping their armed forces in top condition to support the bellicose North Korean foreign policy statements. It also neglects the fact that the U.S. Army units in Korea are deployed to defend Seoul, which is only an hour or so’s march by armored forces from the North Korean border. Withdrawal of our forces would thin the ROK defenses against a formidable potential foe and unacceptably expose the most critical point in an ally’s defenses and survival to imminent capture.

Peña quotes Harold Brown’s Council on Foreign Relations task force as saying that China is 20 years behind us militarily, and he seems to rely on the subsequent statement that we can retain that lead. But he appears to ignore the very important conditional clause in the same statement: “…if the United States continues to dedicate significant resources to improving its military forces,” and counsels stopping that continued improvement. He thus advocates giving China the needed breathing space to pull even with us, in the face of its threats to regional stability over Taiwan and its expressed desire to extend its maritime control 2,500 kilometers into the waters adjacent to its coasts—areas where we have many strategic interests, including those in Japan, Korea, South and Southeast Asia, and Australia/New Zealand.

Finally, Peña seems to neglect the fact that U.S. national security depends on our globally oriented economic well-being and strategic leadership among many allies and affiliated nations. Our national strategy is based on the premise that as the world becomes more democratic, the occurrence of destructive wars will decrease. Whether one favors the relatively passive approach of the Clinton strategy, which was based on the assumption that democracy would naturally spread in a world of free and open trade, or the more assertive attempts to spread democracy adopted by the Bush administration, the fact that we have the world’s dominant economy and military forces requires that, in our own interest, we must be a leader of what used to be called the “Free World.” Adopting the “balancer-of-last-resort” strategy that Peña recommends would be taken by the outside world as a signal that the United States is retreating into the isolationism of an earlier day. One cannot be a leader by saying “you go take care of it, and if you get into trouble and Congress approves I’ll come and help you, with forces I may or may not have.” We saw what happened when we tried to adopt such a strategy vis-à-vis the Balkans, and it wasn’t pretty.

As to the affordability of the armed forces under such a leadership strategy, there is little value in comparing our military expenditures with those of any other combination of nations. We can always find some such combination to add up to what we spend, and the more nations we include in the comparison the more profligate we can be made to appear. However, if in protecting our own security and that of our allies we need armed forces that have what the U.S. Air Force has called “global reach, global power,” then we shall have to pay what it costs, in terms of our own cost structures and force needs. Although, as Peña points out, the current defense budget, including military operations in Iraq and Afghanistan, absorbs 3.7% of gross domestic product (GDP), we were able to sustain defense budgets of about 5% of GDP for some years running during the so-called Reagan Buildup.

The current world strategic situation may not look as critical at first glance as the situation when the Soviet threat was foremost in our consciousness. However, a careful look will show that the current situation is more dangerous than it was then. There are more kinds of threats, stretching into the indefinite future, ranging from the Islamist extremists’ jihad against us and our allies to the possibility of major regional wars threatening our world interests. Are we to argue that we cannot afford now what we did at that earlier, simpler time?

SEYMOUR DEITCHMAN

Bethesda, Maryland


IT WOULD HAVE BEEN NICE if an article that uses the words “reality check” in its title were in fact based on reality. Unhappily, Charles V. Peña’s attack on virtually every aspect of U.S. defense policy is not only unreal, it borders on the surreal.

Peña starts by tying his critique of the size of the U.S. military and the resulting defense budget to the war on terror. This is only one of the military’s missions, even at present. As he well knows, the military must prepare for and be capable of prosecuting other conflicts, while also providing support to homeland security, humanitarian relief, and other missions. But even in the war on terrorism, Peña conveniently ignores the contribution that aircraft carriers, strategic bombers, tactical fighters, and armored forces are making to this struggle. Moreover, although conventional forces have demonstrated utility in the war on terror, the light counterterrorism capabilities Peña advocates would be relatively useless if a serious conflict were to occur.

Peña then resorts to the hoary device of comparing U.S. defense spending to that of other countries without providing any context for comparison. The United States spends more than other countries for defense. It also spends more, per capita, on automobiles, health care research, and lawn furniture than any other country. Last year, Americans spent as much on their pets as the entire economies of North Korea, Kenya, or Paraguay. If we are not going to tie our level of spending in other areas to that of foreign countries, why should national security be any different?

Dollar expenditure comparisons are meaningless, particularly when many of the countries he mentions have conscript armies; little in the way of health care requirements, pension plans, or environmental laws; and are willing to risk their soldiers, sailors, and airmen in inferior tanks, ships, and fighter planes. There are differences in cost structures that result in higher U.S. defense budgets. For example, the military has a brigade’s worth of lawyers (5,000 people) in uniform, costing around $600 million annually. China’s defense budget, if normalized for U.S. practices and prices, would be at least four times the figure Peña cites. He also fails to account for the fact that the U.S. defense budget includes funds for power projection capabilities that many other countries do not need, because they are where the action is likely to take place while we are not. The realities of defense costs are much more complex than Peña acknowledges.

From here Peña’s argument becomes surreal. He proposes a 50 percent reduction in the size of the U.S. military, based largely on the arguments that major conventional wars will be rare in the future and that other countries should be required to defend themselves. Although both points are reasonable, they are also inadequate. America’s conventional wars have always been rare. But when they occur, we always strive to win them decisively. Peña’s proposal would drop the United States from the world’s second-largest and unquestionably best military to approximately eighth on the list. But of course that number is misleading because we could never focus all those forces against a single adversary. Thus, the effective size of the U.S. military would be somewhere between 15th and 20th in the list of military powers, behind such world giants as North Korea, Pakistan, Turkey, Iran, Egypt, Algeria, Vietnam, and Syria.

Peña proposes not only cutting the size of the U.S. military in half but simultaneously undermining the technological superiority that has enabled U.S. forces to win their wars for more than a century. Peña argues for eliminating the F-22, V-22, and Virginia-class attack submarine because these programs were started during the Cold War. He wrongly claims that the F-15 has no prospective adversary and that the V-22 is an unproven platform. Without the Virginia-class submarines, the United States will soon have not merely an aging undersea force but none at all. Peña wants the new, smaller U.S. military to fight its future wars with technology that today is 20 years old in the case of attack submarines, 30 years old for fighters, and 40 years old for helicopters.

Peña’s argument amounts to nothing more than slightly warmed-over isolationism. We are to be the “balancer of last resort.” Such a strategy made sense, perhaps, before the advent of a globalized economy, instantaneous communications, high-speed transportation, energy dependence, and the proliferation of weapons of mass destruction. But it is wrongheaded and even dangerous in the 21st century. A four-division Army and a one-division Marine Corps would not make us the balancer but the patsy. Remember the fate of the British Expeditionary Force in France in 1940? This was a force about the same size as that Peña proposes. Remember Dunkirk?

Peña does leave himself an escape clause, the old “intervene only when truly vital interests were at stake” line. However, Peña clearly would define vital interests so narrowly that there would never be a reason to fight. As the United States discovered in two world wars, the Cold War, Desert Storm, and the global war on terror, although the location of U.S. vital interests may change, they are always out there and they always need defending. Peña’s last-resort strategy is more of a forlorn hope.

Ultimately, Peña’s proposals would have the United States defend its territory, friends, and interests with a military smaller than most potential adversaries, one without any technological advantages to make up for the dearth of soldiers and lacking the overseas bases from which to operate. Perhaps we could rely on allies to help us out. Oops, I forgot. Peña’s proposals would leave us without allies too. It seems to me the one who needs to undergo a reality check here is Peña.

DANIEL GOURE

Vice President

The Lexington Institute

Arlington, Virginia


CHARLES V. PEÑA MAKES SOME excellent observations but is blind to the strategic interests of the United States and how best to secure them. He is correct that the United States should use military force only to defend vital national interests and should cut unnecessary defense programs. His argument that the U.S. military requires further transformation is also right. However, his reasoning for these policies and how to achieve them ignores the strategic realities of the 21st century and the precedents that have largely defined them.

He suggests that U.S. overseas bases are vestiges of Cold War-era containment strategy and that the United States can slash spending by significantly reducing its presence around the world. If it adopted a so-called “balancer-of-last-resort” strategy, the United States would no longer attempt to use a global presence to shape events but would reserve the capability to intervene in conflicts that affect its vital interests as they emerge. Peña also points to the disparity between U.S. defense expenditures and those of our allies, and again proposes removing U.S. troops and leaving allies to their own local security to save defense dollars.

If the interests of host nations defined U.S. presence, then this assertion would be correct, but they do not. Instead, the United States maintains a robust overseas presence because it advances U.S. interests. The Cold War may be over, but it has not been replaced by a more stable world or a world in which U.S. strategic interests have disappeared. Instead, America’s strategic interests remain intact, but the means to secure them have rapidly evolved.

Instead of retreating, the United States must reorganize its overseas bases to be consistent with the modern world, technologically and strategically. Not doing so harms its vested interests by underestimating the stabilizing role of U.S. forces and the negative impact, economically and politically, of removing them. Besides, leaving Europe, the Pacific, or the Middle East would create a vacuum that could be filled by a hostile power.

Also, although the United States does spend more on defense than its allies, it has the most to lose from major changes in the global status quo. Inadequate spending by allies should not be answered by the United States spending less, but by those allies spending more.

One of Peña’s strongest arguments, the need for military transformation, is one of the reasons for recent increases in defense spending. Outfitting a military force for the future is not cheap. Additionally, his assertion that Iraq and the war on terror should not be the gauge for future force requirements is correct. However, his balancer-of-last-resort strategy would cut U.S. active forces from about 1.4 million to around 700,000. As noted above, the United States must maintain a global presence, and therefore such reductions would be unwise.

The military posture of the United States must be defined by strategic interests and not by funding levels. Although it may seem that it is time to bring the troops home, policymakers must consider that perhaps the lack of a major threat to U.S. security is a testament to the criticality of its ongoing overseas mission.

JACK SPENCER

Heritage Foundation

Washington, DC


Digital education

HENRY KELLY’S “GAMES, COOKIES, and the Future of Education” (Issues, Summer 2005) provides an excellent synthesis of challenges and opportunities posed by technology-based advances in personalized entertainment and services. An aspect of this situation deserves further discussion: Children who use new media extensively are coming to school with different and sophisticated learning strengths and styles.

Rapid advances in information technology have reshaped the learning styles of many students. For example, the Web, by its nature, rewards comparing multiple sources of information that are individually incomplete and collectively inconsistent. This induces learning based on seeking, sieving, and synthesizing, rather than on assimilating a single “validated” source of knowledge as from books, television, or a professor lecturing.

Also, digital media and interfaces encourage multitasking. Many teenagers now do their homework by simultaneously skimming the textbook, listening to an MP3 music player, receiving and sending email, using a Web browser, and conversing with classmates via instant messaging. Whether multitasking results in a superficial, easily distracted style of gaining information or a sophisticated form of synthesizing new insights depends on the ways in which it is used.

Another illustration is “Napsterism”: the recombining of others’ designs into individual, personally tailored configurations. Increasingly, students want educational products and services tailored to their individual needs rather than one-size-fits-all courses of fixed length, content, and pedagogy. Whether this individualization of educational products is effective or ineffective depends both on the insight with which learners assess their needs and desires and on the degree to which institutions provide quality customized services, rather than Frankenstein-like mixtures of learning modules.

During the next decade, three complementary interfaces to information technology will shape how people learn.

  • The familiar “world-to-the-desktop” interface, providing access to distant experts and archives, enabling collaborations, mentoring relationships, and virtual communities of practice. This interface is evolving through initiatives such as Internet2.
  • “Alice-in-Wonderland” multiuser virtual environment (MUVE) interfaces, in which participants’ avatars interact with computer-based agents and digital artifacts in virtual contexts. The initial stages of studies on shared virtual environments are characterized by advances in Internet games and work in virtual reality.
  • Interfaces for “ubiquitous computing,” in which mobile wireless devices infuse virtual resources as we move through the real world. The early stages of “augmented reality” interfaces are characterized by research on the role of “smart objects” and “intelligent contexts” in learning and doing.

The growing prevalence of interfaces with virtual environments and ubiquitous computing is beginning to foster neomillennial learning styles. These include (1) fluency in multiple media, valuing each for the types of communication, activities, experiences, and expressions it empowers; (2) learning based on collectively seeking, sieving, and synthesizing experiences; (3) active learning based on experience (real and simulated) that includes frequent opportunities for reflection by communities of practice; and (4) expression through nonlinear associational webs of representations rather than linear “stories” (such as authoring a simulation and a Web page to express understanding, rather than a paper).

All these shifts in learning styles have a variety of implications for instructional design, using media that engage students’ interests and build on strengths from their leisure activities outside of classrooms.

CHRIS DEDE

Wirth Professor of Learning Technologies

Harvard University Graduate School of Education

Cambridge, MA 02138


HENRY KELLY’S ARTICLE PROVIDES readers with a timely and comprehensive look at what is needed to address glaring shortfalls in the U.S. education system. The article underscores the lack of investment in R&D on new educational techniques that would use the up-to-date technology currently available. By conveying how increased investment in educational R&D can improve teaching and learning, Kelly is making an excellent case for the adoption of the Digital Opportunity Investment Trust (DO IT) legislation.


Although the article notes the low rankings of U.S. students as compared to international students in recent studies, not enough emphasis is placed on the fact that our students are performing alarmingly poorly in the fields of math and science. A study conducted in 2004 found that U.S. students ranked 24th in math literacy and 26th in problem-solving among 41 participating nations and concluded that U.S. students “did not measure up to the international average in mathematics literacy and problem-solving skills” (Program for International Student Assessment at www.pisa.oecd.org). Additionally, U.S. students are becoming less interested in math and science. There has been a steady decrease in bachelor’s degrees earned in mathematics and engineering in U.S. universities during the past decade.

While our students are not meeting global standards in mathematics and science and are losing interest in these subjects altogether, the United States has become increasingly reliant on foreign talent in these fields. In 2000, 38% of all U.S. science and engineering occupations at the doctoral level were filled by foreign-born scientists (up from 24% in 1990). Filling these critical occupations with foreign talent has become a more complex issue with the war on terror and as global competition for the best and the brightest in science and engineering increases dramatically. During the 1990s, the Organization for Economic Cooperation and Development saw a 23% increase in researchers, whereas the United States saw only an 11% increase.

There is a critical need to change these trends in math and science. We need to build up domestic talent and interest in these crucial areas and provide necessary incentives to attract foreign talent. Increased investment in R&D and educational technology, as outlined in Kelly’s article, can begin to address this need.

Kelly’s article highlights efforts by the National Science Foundation, Department of Education, Department of Defense, and Department of Homeland Security to improve training and educational technologies, but does not stress enough that DO IT is a comprehensive effort that will research and improve teaching and learning techniques that can permeate all U.S. educational institutions. It is important to stress that DO IT legislation would help to fill the current market failure that Kelly mentions (“Conventional markets have failed to stimulate the research and testing needed to exploit the opportunities in education”). DO IT will foster collaboration among educators, cognitive scientists, and computer scientists to research and develop the most effective methods of teaching and learning, using today’s technologies. DO IT will help to ensure that the U.S. education system does not continue to fall behind all other sectors and nations that have embraced the potential of technology.

We are facing a crisis in public education and math and science education; Kelly presents an excellent case for the need to increase educational R&D and succeeds in demonstrating how currently underutilized technologies can improve the learning process.

EAMON KELLY

Distinguished Professor in International Development

President Emeritus

Tulane University

New Orleans, Louisiana


Cyberinfrastructure for research

IN “FACING THE GLOBAL Competitiveness Challenge” (Issues, Summer 2005), Kent H. Hughes persuasively argues that future productivity and economic growth depend on a society’s success at innovation and outlines a series of proposed policy steps that could foster a higher-performance U.S. innovation engine.

One such action that could provide a turbocharged boost to U.S. innovation would be to aggressively implement the February 2003 recommendations of the National Science Foundation Advisory Panel on Cyberinfrastructure, commonly called the Atkins Report (referring to the panel’s chair, Daniel E. Atkins of the University of Michigan).

The study notes that a third way of doing science has emerged: computational science, complementing theoretical and experimental techniques. The report also finds that new information technologies make forms of collaboration in science and engineering possible today that were not possible even a decade ago. “Cyberinfrastructure” refers to a broad web of supercomputers, vast data servers, sensors and sensor nets, and simulation and visualization tools, all connected by high-speed networks to create computational science tools far more powerful than anything we have known in the past. Cyberinfrastructure also refers to the software that links these distributed resources, the security needed to protect them, and the people and institutions needed to maintain and exploit them.

The Atkins Report promises a “revolution in science and engineering” if we invest a billion new dollars a year in these tools and organize ourselves intelligently to use them.

A July 8, 2005, memo from John Marburger and Josh Bolten to federal R&D agencies outlining fiscal year 2007 budget priorities calls for an emphasis on “investments in high-end computing and cyber infrastructure R&D.”

In his recent book, The Past and Future of America’s Economy, Robert D. Atkinson supports investing $1 billion per year in an Advanced Cyberinfrastructure Program, which he asserts “would also lay the groundwork for the next-generation Internet to dramatically expand its possibilities.”

The second step for policymakers, to ensure that investment in scientific infrastructure will pay off, is to lay out a bold national vision for providing broadband for all Americans. Congress and the Federal Communications Commission are revisiting the rules that govern telecommunications and the Internet. They should revise those rules with an eye toward a far horizon, not just rearrange the deck chairs among existing competing providers.

A coalition of 10 higher education organizations has called for adopting “as a national goal a broadband Internet that is secure, affordable, and available to all, supporting two-way, gigabit-per-second speeds and beyond.” If we were to achieve national broadband connections to every U.S. home and business, supporting synchronous gigabit communications, the technologies developed in a scientific cyberinfrastructure program could propagate to enhanced innovation in e-business, distance education, telemedicine, telecommuting, and expanded forms of leisure and entertainment. The innovation sparked by the first wave of Internet technologies would be dwarfed by the second wave.

Hughes’ vision of a high-performance innovation economy can be spurred in part by a bold national program in advanced cyberinfrastructure for science and engineering, coupled with telecommunications policies designed to bring that power to every U.S. home and business.

GARY R. BACHULA

Vice President for External Relations

Internet2

Bachula formerly served as Deputy Under Secretary of Commerce for Technology.


Supercomputing revival

“BOLSTERING U.S. SUPERCOMPUTING,” by Susan L. Graham, Marc Snir, and Cynthia A. Patterson (Issues, Summer 2005), correctly notes that “Restrictions on supercomputer imports have not benefited the United States nor are they likely to do so.” In fact, import restrictions have impaired U.S. science and probably U.S. industry. For example, it was my privilege to head the Scientific Computing Division (SCD) of the National Center for Atmospheric Research (NCAR) from 1987 to 1998. Climate modeling is a major area of research in the atmospheric sciences and, as noted in the article, it is computationally intensive. Consequently, in the early 1990s the NCAR SCD routinely offered parallel processing on parallel vector processors (PVPs), such as the Cray Y-MP/8 and C90/16. Climate models, in particular, made efficient use of this capability.

In 1995, NCAR climate modelers were completing a new climate model that required computational capability an order of magnitude greater than that of its predecessors. The model could efficiently use at least 32 vector processors in parallel. This requirement was a major consideration in a supercomputer procurement that we conducted in 1995–1996. The NEC SX-4, with 32 processors, demonstrated decisively superior performance relative to other offers, so we selected it. A few months later, congressional efforts to overturn the selection, coupled with an antidumping investigation, resulted in severe import restrictions that prevented acquisition of the SX-4. Had we been successful in obtaining the SX-4, an enormous amount of science that was instead deferred or accomplished overseas would have been done in the United States in the late 1990s. Further, had the import restrictions not been imposed, almost certainly some U.S. organization(s), private or public, would have configured an ensemble of SXs in the United States that would have been competitive with the Japanese Earth Simulator. Imagine the possibilities lost!

U.S. industry was probably impaired by import restrictions as well. For example, from the mid-1990s, European and Japanese automobile manufacturers have had the benefit of using Japanese PVPs. Today, many of their products are competing handsomely in the international marketplace. During the same time frame, Airbus has substantially advanced its competitive position in the world market. It would be fascinating to know what role the use of Japanese PVPs may have played in both industries.

Graham et al. also correctly note that “Supercomputing is critical to advancing science.” The supercomputer is a tool of science much like telescopes, microscopes, and particle accelerators, all of which enable us to see and understand things that are otherwise not knowable. Clearly, any government that denies its scientists and engineers access to the best tools available places the nation’s future at risk, militarily and economically. Simply put: Those with the best tools win.

BILL BUZBEE

Fellow, Arctic Region Supercomputing Center

Westminster, Colorado


I COMMEND SUSAN L. GRAHAM, Marc Snir, and Cynthia A. Patterson for clearly laying out the current supercomputing situation in the United States. Their call for increased federal investment, as well as coordination, in this area is much needed. Advances in science and engineering require access to cutting-edge computing capabilities. These capabilities are needed by researchers analyzing increasingly voluminous data sets, as well as those involved in modeling and simulation, and the future promises many new applications that will require supercomputers. National supercomputing resources are currently provided by the National Science Foundation (NSF) and Department of Energy Office of Science (DOE-SC) supercomputing centers, although flat or declining budgets have limited their ability to satisfy the growing needs of the research community.

The National Center for Supercomputing Applications has been successfully providing “killer micros” for scientific computing for nearly a decade. However, a number of important scientific and engineering applications are not well suited to this architecture. Further, the architecture of killer micros is changing as chip designers face the problems associated with high chip frequencies (the traditional means of increasing computer power). Now is the time to reconsider the design of supercomputers for scientific and engineering applications, realizing that matching application to architecture will maximize scientific productivity and minimize cost. The high-energy physics community is already exploring custom supercomputer architectures for their applications.

In designing a new generation of supercomputers, we must not be misled by the “Top 500” list. U.S. computer vendors (IBM and SGI) hold the top three spots on the most recent list, but this ranking is not a reliable indicator of performance on many applications critical to advancing scientific discovery and the state of the art in engineering. Further, to achieve the stated performance levels for the top two entries (both IBM BlueGene/L computers), applications must run efficiently on 40,000 to 65,000 processors; today, few applications scale to more than a few thousand processors. The Japanese Earth Simulator, which follows the supercomputer design philosophy articulated by the late Seymour Cray, is still considered by many to be the world’s preeminent supercomputer for real-world applications. It is ranked as number four on the Top 500 list and achieves that performance level with just 5,120 processors.

Supercomputing is more than hardware. Realizing the benefits of supercomputers requires new computing systems software to enable thousands of processors to effectively work together and scientific applications that achieve high performance levels and scale to 10,000 or more processors. In 2000, the DOE-SC created the Scientific Discovery through Advanced Computing (SciDAC) Program, which was targeted at just this problem. Although funding for the SciDAC Program was relatively modest, its recent 5-year program review illustrates the remarkable advances that can be made by teaming disciplinary computational scientists, computer scientists, and applied mathematicians. NSF’s Information Technology Research program supported similar activities, although it is now being ramped down. Research, development, and deployment of supercomputing software must be supported if we are to realize the full potential of supercomputing.

THOM H. DUNNING JR.

National Center for Supercomputing Applications

University of Illinois at Urbana-Champaign


Nanotechnology for development

I ENJOYED READING the thoughtful piece on “Harnessing Nanotechnology to Improve Global Equity,” by Peter A. Singer, Fabio Salamanca-Buentello, and Abdallah S. Daar (Issues, Summer 2005). It is a larger commentary on how knowledge today is both a global currency and a global social responsibility. This is particularly the case with so-called transformative technologies, such as biotechnology, information and communications technology, and nanotechnology, where issues of who has access, who benefits, and who controls the knowledge are critical if nations are to be able to manage their own development and destiny. As Singer et al. argue, “The inequity between the industrialized and developing worlds is arguably the greatest ethical challenge facing us today.”

It should not be surprising to Issues readers to see the emerging nanotechnology capacity in innovative developing countries such as China, India, Korea, South Africa, Brazil, and Mexico. These nations (and others) have spent time and resources in developing strategic national approaches to their science and technology capacity for development, in most cases with the concomitant required political leadership.

What is perhaps more interesting in this paper is the linkage between investments in nanotechnology and the United Nations Millennium Development Goals. The authors are correct to point out that the rationale for supporting what to some is high-end research needs to be anchored firmly to the social and economic needs of a nation; more so in the case of countries that have no margin for error with such investments.

The notion of global governance for nanotechnology raised by the authors is a critical one. This will be the issue of the next few decades as the research and technology outpace social capacity to absorb the impacts. We have seen elements of this argument before in the biotechnology debates, but the growing nanotechnology dialogue provides an opportunity for the developing world to have its say and stake its claims to this emerging knowledge arena.

But more than this will be required. The South must strengthen its capacity in education, research, governance, and training. It must be able to outline its needs clearly, with knowledge and innovation being a strong component of national planning strategies and with legislatures and finance and treasury ministries acknowledging the vital importance of science and technology for development. And it must engage its society in a meaningful debate about options. Only in this way will developing countries be able to effectively respond to the introduction and diffusion of new technologies such as nanotechnology. Singer et al. offer some useful insight on this score that can serve to prime the global debate that is now emerging.

PAUL DUFOUR

Senior Adviser for International Affairs

Office of the National Science Advisor

Toronto, Canada


PETER A. SINGER and his colleagues have made a number of important contributions to our understanding of how truly transformational advances in nanotechnology are likely to be, not only for science but for society, and not only for the developed world but for our global community.

The authors’ analysis and recommendations are to be commended. There is, however, one policy issue to which the authors give too little attention: South-South cooperation.

The rising scientific and technological competence of some developing countries, combined with the potential of nanotechnology to address critical social and economic needs (and thus to attract the interest of governments), offer an opportunity for developing countries to cooperate with one another on an unprecedented scale.

Such cooperation would likely have four interrelated benefits: It would help scientifically lagging countries in the developing world build their scientific capacities; it would narrow the scientific gap between the North and South and within the South; it would advance the frontiers of science in nanotechnology on a global scale; and, perhaps most important, it would increase the prospects that applications of nanotechnology will address concerns of critical interest to the developing world, enabling the technology to serve as a valuable tool for addressing issues of extreme poverty and environmental degradation. All of this would make nanotechnology a truly transformational technology, exerting impacts that would extend far beyond the scientific arena itself.

MOHAMED H. A. HASSAN

Executive Director

The Academy of Sciences for the Developing World

Trieste, Italy

President

African Academy of Science

Nairobi, Kenya


Nanotechnology politics

SENATOR GEORGE ALLEN is a significant supporter of the National Nanotechnology Initiative (NNI), sometimes referred to as the National Nanotechnology Program. He is an important asset to the NNI and the physical science and engineering establishment, as well as to agencies such as the National Science Foundation. However, his recent article on “The Economic Promise of Nanotechnology” (Issues, Summer 2005) is guilty of some exaggeration and hyperbole on at least three levels.

First, his rhetoric apes that of most proponents of the NNI. It includes remarks like “nanotechnology will transform almost every aspect of our lives,” etc. However, nanoscience has made inroads into a host of luxury products (such as cosmetics, clothing, paint, and sports equipment) but still has not produced a “home-run” application (in areas such as new diagnostics, drug delivery and treatment, electronics, etc.). It has improved a broad range of products, especially sensors, but it remains to be seen whether nanoscience, when applied, becomes a general-purpose technology, as the steam engine and the electrical grid were. Nanotechnology is evolutionary, not revolutionary, and hyperbole mostly produces false expectations. When budgets tighten and no revolutionary products have been created, support for the NNI may evaporate. In addition, there is some conflation of the nanoscience-based version of nanotechnology and the science-fictional variant, which is equally problematic because the fictional one is revolutionary while the scientific one is evolutionary. The preponderance of luxury products has also undercut the rhetorical tactic of tagging world social problems as the potential beneficiaries of nanoresearch. It is time to put the hurdy-gurdy away and use more appropriate rhetoric. Engendering support for nanoscience simply does not require exaggeration, and if it does, maybe we need to reexamine the initiative.

Second, Allen’s call for national competitiveness may be misguided. If nanotechnology is revolutionary, it will foment complete replacement of a product or process. There is some evidence to suggest that domestic research will locate some of the high-paying jobs in the United States, but that does not guarantee that the production jobs will be similarly situated. Indeed, experience with the micromechanical device industry has made this abundantly clear. Although we need to support education and basic research at home, as well as promoting intersections between academe and industry, it might behoove us to understand that the world of applied nanoscience may be the quintessential illustration of the flattened-world thesis described in Thomas Friedman’s recent book The World Is Flat. Allen is correct in his assessment of graduation rates of engineers here and abroad, but no one to date has developed a strategy to significantly increase the number of homegrown engineers. Visas are more difficult for foreign students to obtain, and growing opportunities in their home countries make it less likely that they will want to stay here after graduation. A better solution might be to stick with globalization and internationalize efforts to develop a “nanotechnology economy,” and back away from what is beginning to read too much like nationalistic rhetoric.

Third, the jury is still out on the Congressional Nanotechnology Caucus and its 30 members, only 7 of whom are on its Web site, one of whom is no longer in the Senate. It is unclear what “huge successes” it has achieved to date. Maybe we will just have to watch and wait.

DAVID M. BERUBE

Professor of Communication Studies

University of South Carolina

Columbia, South Carolina


SENATOR GEORGE ALLEN IS, as usual, right on the mark about the United States’ need for more home-grown engineers and scientists. I admire Allen for his willingness to take a leadership stance on this most important technology and to initiate serious funding support for our universities. Now we need bright students, bright professors, and, especially, bright technologists to transfer nanotechnology out of the lab and into the market. The country that does the best job of commercializing nanotechnology will reap the benefits of our investment, and it is not a foregone conclusion that we will be that country. We also need more entrepreneurs, need to take more calculated risks, and need to celebrate our successful businesses instead of punishing them with more poorly thought-out regulations like Sarbanes-Oxley.

JAMES R. VON EHR II

Chief Executive Officer

Zyvex Corporation

Richardson, Texas


Is very small beautiful?

IN “GETTING NANOTECHNOLOGY RIGHT the First Time” (Issues, Summer 2005), John Balbus, Richard Denison, Karen Florini, and Scott Walsh correctly point to the weaknesses in both methodology and regulation surrounding manufactured nanoparticles. These weaknesses are evident in Europe as well as the United States. Although a number of studies have recognized the problem, there is no strategy for action and regulation. The onus is put onto business to behave responsibly. It may well be that few or some, rather than all, nanoparticles prove to display unacceptable toxicity, but in the meantime, where should the benefit of the doubt lie in their release and management while their characterization remains unclear? Because nanoparticles are already finding their way into cosmetics and medical dressings, this is not an idle question.

Further, the development of nanotechnology exists in a broader context than that of consumer risk. Where will the emphasis of nanotechnology exploitation be, and what direction should the public sector be providing?

Will nanotechnology be used for the development of renewable energy or to cheapen the extraction and use of fossil fuels? Will it be used for producing stain-resistant trousers or providing drinking water for the poor? The easy answer would be both. But where will the emphasis lie? And how is the prioritization made? The market will not necessarily prioritize socially or environmentally beneficial R&D.

These are not just theoretical questions. It can be argued that the technologies we are researching now will have a more significant impact on future society than the term of a president or prime minister. And although the decisions of the latter are subject to immense critical debate and scrutiny, the outcomes from the former occur by happenstance and accident. There is no bigger statement of society’s aspirations than the purposes behind where it puts its R&D money, and these aspirations, if not the specific funding decisions, should be open to debate.

Of course, these questions are no different for nanotechnology than for many other technological developments, but if the nanotechnology stuff isn’t just hype, then they are more important here than elsewhere.

The role of government should be to provide direction for public R&D expenditure and to provide a framework for the deployment of technology that is sympathetic to socially beneficial purposes.

DOUGLAS PARR

Chief Scientific Adviser

Greenpeace UK

London, England

http://www.greenpeace.org.uk


LIKE JOHN BALBUS, Richard Denison, Karen Florini, and Scott Walsh, we believe that the immense potential benefits of nanotechnology can only be realized if the development of the technology is balanced to avoid adverse public health, occupational, and ecological risks. Naturally occurring nanomaterials have been with us for centuries, but the revolutionary development of manufactured nanomaterials raises new questions that deserve answers.

At a National Research Council Workshop on Nanotechnology (March 24, 2005), representatives of the American Chemistry Council (ACC) and Environmental Defense (ED) made presentations with very similar points on the need for a much-expanded research effort on the environment, health, and safety (EHS) issues associated with nanotechnology. The ED and the ACC CHEMSTAR Nanotechnology Panel, in a joint statement of principles (70 FR 24574, Docket OPPT-2004-0122, June 23, 2005), stated, “A significant increase in government investment in research on the health and environmental implications of nanotechnology is essential.”

If we want to get nanotechnology right the first time, we must have a high-quality, comprehensive, and prioritized international research agenda that is adequately funded. The agenda should (1) focus on the risk assessment goal, which will require information on the continuum of exposure, dose, and response; (2) develop new interdisciplinary partnerships that bring visionary thinking to the field; (3) support better understanding of the fundamental properties of nanomaterials that are important in the exposure/dose/response paradigm; and (4) develop processes for establishing validated standard measurement protocols, so that individual or categories of materials can be studied.

Balbus et al. call for a significant increase in funding for a federal research program on the EHS implications of nanotechnology. We support that recommendation and urge its rapid implementation. Developing a truly strategic research agenda of appropriate quality and breadth requires the credibility and talent that organizations like the National Research Council can bring to bear. That research strategy needs to provide for expanded international involvement. A highly credible research strategy would provide evidence to funding organizations that the monies would be spent efficiently and effectively, and help demonstrate that EHS risk questions have answers.

It is refreshing to see that so many parties with different organizational objectives recognize common ground. Devoting energy to such coalitions to build a safe technology is so much more valuable to society than choosing sides for debates.

CAROL J. HENRY

Vice President, Science and Research

LARRY S. ANDREWS

Chair, ACC CHEMSTAR Nanotechnology Panel

American Chemistry Council

Arlington, Virginia

Cite this Article

“Forum.” Issues in Science and Technology 22, no. 1 (Fall 2005).