The Future of Meat
Just because the first in vitro hamburger cost $335,000 to produce doesn’t mean we shouldn’t start thinking about how factory-grown meat might transform our food system, the environment, and even our culture.
On August 5, 2013, the first hamburger grown from stem cells in a laboratory, and not in a cow, was served in London. This event was not merely a milestone in the development of the scientific and technological capability to produce factory-grown, or cultured, meat; it was a proof of concept for a foundational emerging technology. If this technology continues to evolve and is deployed at scale, it will have significant social, cultural, environmental, and economic implications. How ought a democratic society begin to understand and prepare for such changes?
Despite the high level of uncertainty regarding the outcomes of technology choices, economic winners and losers, ethical debates, and so forth that are associated with any radical new technological pathway, it is by no means premature to begin a systemic effort to explore possible future consequences of the development of factory meat. The aim of such work, however, should not be to try to develop accurate predictions of what will actually occur as this technology matures, which is probably impossible. Rather, it is to develop and play with scenarios that can enable more adaptive and responsible policy and institutional responses to the unpredictable and far-reaching social consequences of a transition to the production and consumption of factory-grown meat.
Indeed, with the first meat-production facility, or “carnery,” probably only a few years away, an optimistic scenario might suggest that rapid public acceptance of its products could attract investors and soon lead to expanding industrial capacity for producing factory meat. The shift of meat production from field to factory could in turn significantly reduce global climate forcing and lessen human impacts on the nitrogen, phosphorus, hydrologic, and other cycles, while reducing the land required to produce animal feed could mean more land for producing biofuels and other biological feedstocks for uses such as plastics production. All of which would, of course, be accompanied by an equally rapid realization of unintended consequences. Yet an opposite scenario is, at this point, equally tenable: that for a number of reasons, such as an inability to reduce production costs to a competitive point, opposition from threatened economic interests, or simply a society-wide rejection of food produced in such a manner for reasons of aesthetics or subjective preference, cultured meat might be rejected outright. Such a choice would also carry consequences; it might, for example, commit a world that is rapidly increasing its consumption of meat to an ever-expanding environmental footprint of food.
During the August 5 tasting event, which was widely covered by print and video media, Mark Post, the tissue engineer who created the cultured hamburger, said that it took about 3 months to grow the tissue for that particular burger, which, he is quick to point out, is faster than raising a cow. (For comparison, one life-cycle assessment estimated that calves sent directly to feedlots in the United States require about 10 months to mature.) Nonetheless, he believes we are still at the beginning of the development process and have a lot of work to do to scale up production while maintaining the quality of the tissue cultures and ensuring the sterility and safety of the final products. Some of the remaining challenges include optimizing synthetic (animal-free) nutrient growth media, designing scaffolds (structures that mimic the in vivo environment and to which muscle cells can adhere), and facilitating cell exercise in order to impart a familiar and acceptable texture, as well as identifying cost-effective and environmentally appropriate technology options for each stage of the process (environmentally appropriate options are necessary because a significant societal and economic rationale for the technology is its environmental advantages over current production methods). Dr. Post remains confident, however, that these technical issues can be resolved. Some estimates put commercial availability at 10 to 20 years from now. The Missouri firm Modern Meadow has an even shorter time horizon for a similar tissue engineering process aimed at producing leather (making cultured skin is simpler than producing meat). In a Txchnologist article reprinted in Scientific American in 2013, the company said that bioengineered leather products will be commercially available by about 2017.
From an economic perspective, cultured meat is still an experimental technology. The first in vitro burger reportedly cost about $335,000 to produce and was made possible by financial support from Google cofounder Sergey Brin. Of course, first-of-a-kind technologies are often ridiculously expensive; one 2008 European study, however, concluded that production costs are likely to eventually be competitive with those of unsubsidized chicken meat. But the technology processes are still under development, and their future makeup, and costs, cannot yet be projected with any certainty, nor can their broader environmental and social implications. Accordingly, any such prediction should be taken as no more than an educated guess. Moreover, the eventual shape of demand and supply curves, and product differentiation possibilities, are also unknown; depending on consumer response and market evolution, for example, there is no reason why very expensive “designer” or “boutique” brands might not be commercially viable even if in vitro burgers never, or only very slowly, become a mass consumption option.
Although the development path of in vitro meat techniques remains uncertain, the basic steps required for initial industrial-scale production seem clear. The first step will be the extraction of a tissue sample from a donor animal that remains otherwise unharmed. From that sample, stem cells of interest will be isolated and, with the addition of nutrients and growth factors, the culture will proliferate and increase in overall mass. The cells will then be induced to differentiate into edible skeletal muscle cells. Along the way, the cells will be exercised via mechanical, electrical, or chemical stimulation in order to achieve a familiar and palatable texture. Finally, vitamins, minerals, and flavors will be added as the tissue is ground into the final product and packaged for shipment to grocery stores and restaurants. In this form, cultured meat will not have the larger-scale structures of fat deposits, blood vessels, and connective tissues that provide familiar cuts of meat with their characteristic appearance and taste. Accordingly, farther in the future, bioprinting techniques may be used to enable the production of meat that mimics more familiar cuts such as steak, roasts, and pork chops, and further differentiation could lead to more-affordable basic cuts as well as high-end products designed to meet specific taste and nutritional profiles. Precisely controlled fat content as well as unique flavors and supplements could yield branded, designer delicacies with much greater variety than animal meats can currently provide. At the scale of the agricultural system, any reduction in total farm animals could also reduce the propensity for diseases to cross the species barrier to humans and, because less prophylactic application of antibiotics would be needed, less bacterial resistance to antibiotics may result, with consequent benefits to human health. Aseptic growth environments could meanwhile prevent food-borne illness.
Once the factory production system is in place, the product—meat—will itself become a design space, and genetic or protein manipulation, changes in production technology, and the integration of other types of nutrients and food products will continue to diversify food away from the familiar forms it has today. At some point in the farther future, cultured meat production may well be coupled with pharmaceutical technology, and the rapid growth of individual genomic mapping, to create food that is designed for particular genomes or that supports healthy personal microbiota ecologies.
Or not. Of the many factors that might influence the pathway to such a future, one is the question of whether people will, at least in the short term, continue to expect meat to look, taste, and feel like, well, meat. Food is a culturally charged domain, and the technological evolution of meat may well outpace cultural acceptance of radically new food production technology. Nonetheless, people may eventually look at a T-bone steak with the nostalgia they feel for the Apple IIe: It was an important contributor to technological evolution and economic productivity, but no one would choose it over an iPad.
Indeed, the scientific and technological challenges to creating a factory meat industry are likely to be no greater than the environmental, economic, and social ones. For example, we have run focus groups that indicate that some consumers have a negative visceral reaction to the thought of lab-grown meat. Yet others believe that such technologies herald the next generation of environmentally friendly and hunger-reducing food technologies. Will environmental groups that campaign hard against genetically modified crops decide instead to lend strong support to cultured meat? Even in the most predictable of worlds, consumer preferences can be capricious, and if cultured meat does not offer early benefits in either taste or cost, will its novelty be sufficient to stimulate the demand necessary to allow the industry to grow?
But the complexities of demand patterns are not the only economic uncertainty. The growth of a cultured meat industry could create new economic winners and losers as food production leaves the ranch in favor of the bioreactor. As the technology scales up, would ranchers and farmers fight hard to stop it? Whereas the U.S., with its enormous factory farms, accommodated genetically modified crop varieties with barely a political ripple, the threat to the meat industry, and to the mythic national symbols of the rancher and the range, might trigger strong opposition to factory meat. In contrast, perhaps the European Union, which has been so suspicious of GMOs, would welcome factory meat as a boon to landscape preservation—especially given that it was first developed in a European university, rather than by a U.S. corporation. New technologies often generate surprising political, economic, and social realignments.
Such possibilities can help inform rich scenarios for exploring the future of meat. The value of such scenarios, in turn, is to help anticipate the sorts of policy challenges that may emerge. For example, cultured meat will undoubtedly shift the vulnerabilities inherent in the food system. Water and supply chain management techniques may give carneries a significant advantage over conventional meat production processes in coping with variations in rainfall, and allow them to better attenuate subsequent price fluctuations. Such capabilities may enhance global food security. Yet perhaps factories for the mass production of meat, which could be sited in any environment, would displace feedlots and ranches that require certain environmental conditions, to the detriment of the economies of nations that now depend on the production of meat from animals. Most of these effects could be mitigated through appropriate policy tools; mitigation is easier if the effects can be anticipated and spotted early on.
The intersection of global hunger and poverty with cultured meat technologies presents a particularly complex challenge. Intuitively, it seems that factory-grown meat designed to be inexpensively produced, and perhaps used as an input to integrated algal/insect/factory-meat products, could constitute an inexpensive source of complete protein for those who are malnourished in developing nations. However, this view not only assumes affordability but also makes the familiar mistake of characterizing hunger as a problem of food scarcity. The world already produces enough food to meet the individual energy requirements of every person on Earth [2,831 calories per person per day in 2009, according to the United Nations Food and Agriculture Organization (FAO)]. Global hunger today is a consequence of many factors, including poverty, natural disasters, failed states, and war, not simply a lack of food production capacity. The development of a cultured meat industry will not address the problems of political power, infrastructure inadequacies, economic inequity, and geopolitics that underlie global hunger. Moreover, perhaps the growth of a bioengineered meat sector will undercut the economic prospects and cultural cohesion of some developing countries by allowing a new shift of economic potential from agrarian economies back to industrialized ones, thus exacerbating the hunger problem. Again, we offer the outlines of such scenarios not to predict, but to suggest the sorts of discussions and analyses that need to begin now in order to develop a suite of possible response options that can enable effective policymaking as the system unfolds in real time.
As with economics and social patterns, cultured meat can be expected to have substantial implications for environmental systems. Over the past century, the onset of industrial agriculture and the Green Revolution (more fertilizer, better pesticides, modern management techniques, better irrigation methods, and more productive cultivars) kept pace with a growing and increasingly urbanized human population of 7 billion people and made a mockery of popular environmental books such as The Population Bomb by Paul Ehrlich (1968) that were confidently predicting mass famine and death by the 1980s. Yet modern agriculture has also contributed to water scarcity, greenhouse gas emissions, increased perturbation of the nitrogen and phosphorus cycles, and other environmental problems. (For example, a 2006 report from FAO found that livestock are responsible for about 18% of annual anthropogenic greenhouse gas emissions, 8% of water withdrawals, and 30% of land use.) For some, cultured meat and associated bioengineering techniques mean that the environmental problems associated with industrial agriculture can be addressed, at least in part. One analysis performed by researchers at the universities of Oxford and Amsterdam and published in Environmental Science & Technology in 2011 concluded that, “In comparison to conventionally produced European meat, cultured meat involves approximately 7-45% lower energy use (only poultry has lower energy use), 78-96% lower GHG emissions, 99% lower land use, and 82-96% lower water use depending on the product compared.” By enabling tighter controls on emissions and the recycling of nutrients that are not directly embedded in the final product, cultured meat could be a critical mechanism for managing increasingly severe human impacts on the nitrogen and phosphorus cycles.
The long view
The potential future implications of cultured meat must also be understood in a broad historical context. As with the Neolithic Revolution 10,000 years ago, and industrial agriculture 150 years ago, bioengineering is poised to once again transform farm landscapes. The potential impacts of factory-grown meat mentioned so far merely represent some of the most obvious and easily anticipated trends. In reality, human food production is highly integrated with other environmental, economic, and social systems in a web of complex global cause-and-effect relationships that are difficult to understand and impossible to control. These complexities will be further compounded if food, pharmaceutical science and technology, and human genomic medicine become an integrated design space. For this reason, it is important to develop anticipatory practices that can be systematically applied to interconnected global systems as new technologies such as cultured meat are introduced and expanded. The hamburger served in August may become merely a footnote in the narrative of sweeping changes that biotechnology-enabled food production might bring, but it is an important reminder that better evaluation and assessment methodologies are needed, and soon. These in turn should be integrated into scenario games that enable stakeholders and policymakers to practice agile responses to the challenges and opportunities such technological evolution will no doubt spawn in abundance.
Yet it is difficult enough to consider near-term possibilities. Since humans began developing agriculture thousands of years ago in many places around the globe, food has been defined in terms of the production technology. To date, such production technologies have been determined by what nature has provided—cows for beef production, pigs for pork production, plants for corn and soy production—sometimes tweaked with genes and chemicals from other species. But food is now morphing into a design space, where factory production systems, coupled with genetic manipulation, liberate food from any need to rely on a particular species. How might the ethical dimensions of this transition evolve, as animals become decreasingly necessary as a food source? Might factory food thus facilitate the extension of full human rights to all sentient species? In contrast, a 2008 article in the Journal of Agricultural and Environmental Ethics wondered how much of a moral problem eating factory meat sourced from a human stem cell (effectively creating safe, victimless cannibalism) would be. Or again, in an age when individuals carry around their complete genetic profile in easily accessible form, it may be possible to custom-design food for particular genomes, as food design and preventative medicine merge. Factory food may also become a critical means to help humans manage the carbon, nitrogen, and phosphorus cycles of an increasingly anthropogenic planet.
Such scenarios may seem ridiculous, but equally radical, if currently unimagined, changes are likely as emerging technologies such as factory food scale up, and we should practice thinking about “radical” scenarios just as we practice thinking about more incremental ones. Powerful technologies, such as railroads, automobiles, and the Internet, change the world in profound ways that antecedent generations could not have predicted and often failed even to imagine. Automobiles were clean, resource-efficient, low-emission vehicles compared to the horses they replaced, but a billion automobiles on the road today mean that cars are now changing the evolution of our atmosphere through anthropogenic greenhouse gas emissions. But, of course, had there been no substitute for horses, the modern world could not have evolved, since (among other things) it would be impossible to grow enough food to supply, not to mention process all the waste produced by, horses and other animal forms of transportation in a world with a population and an economy such as ours. The consequences of important emerging technologies are not additive; rather, they create significant perturbations of a complex adaptive system, and the world that subsequently evolves is fundamentally different from what it was before. From the structure of our economies to the evolution of our environment to our ethical standards, a world whose protein supply is significantly provided by factory-grown meat technologies will probably be different in kind from a world without these technologies. Indeed, factory meat is perhaps best understood as a planetary engineering technology, and to pretend otherwise can become just a subtle way of avoiding ethical responsibility for the consequences of our own creations.
Broader implications
For this reason, cultured meat technology is not just of interest in itself; it is also an ideal case for exploring broader questions about how emerging technologies, with all their unpredictability, uncertainty, and potentially substantial impacts in numerous domains, can be usefully studied and understood, even at very preliminary stages of their development, to improve societal capacities to manage their development, diffusion, and consequences. All foundational emerging technologies—the printing press, the steam engine, railroads, computers, and so on—destabilize existing economic, institutional, environmental, social, and cultural assumptions and interests. Despite their potential to transform human and environmental equilibria, and despite the fact that such technology-driven transformations seem inseparable from human evolution itself, systemic evaluation of early-stage technologies with significant potential for societal transformation is not a well-developed body of knowledge and practice. Creating this area of study is a formidable intellectual challenge not just because of the complexity of the systems involved, but also because, by definition, emerging technologies seldom have well-identified characteristics and behaviors, so traditional analytical tools, such as the industrial ecology or life-cycle analysis methods used to identify and assess environmental considerations, have at best limited and speculative application. Efforts to develop methods, tools, and institutional structures for evaluating the social implications of emerging technologies are also under way, but progress is halting and investments have been at best modest. In 1990, for example, the Human Genome Project began directing some of its funding into an Ethical, Legal, and Social Implications (ELSI) program, which was intended to identify and examine social issues related to the main research activity.
The Center for Nanotechnology in Society, headquartered at our home institution, Arizona State University, and the Synthetic Biology Project at the Woodrow Wilson International Center for Scholars in Washington, DC, seek to explore the social implications and potential governance of rapidly evolving foundational technologies. Europe houses several small efforts to actually build such capabilities into government R&D enterprises, including the Danish Board of Technology and the Rathenau Institute in the Netherlands. Of course, the U.S. Congress abandoned its own fledgling effort in this regard when it eliminated the Office of Technology Assessment in 1995.
Economic tools for technological assessment tend to be the most sophisticated, because countries and firms have long had to make technology choices, and economic considerations have been the most important and immediate input to such choices. Similarly, environmental issues have been analytically bounded and assessed through predominantly scientific and quantitative methods, and environmental analytical techniques such as industrial ecology are also reasonably well developed. The assessment of social impacts of technology, especially when the technology is in its earliest stages, remains the least developed, in part because of the complexity of the systems involved and in part because social assessment is, inevitably, a normative process in which the results of the analysis often reflect values as much as quantitative observations. Of course this is also true for economic and environmental assessments, yet as our research has proceeded, we have been struck by the gap between the availability and sophistication of economic, engineering, and environmental analytical tools, and the relative paucity and inadequacy of tools to enable modeling and quantification in the social and cultural domains. And while it is true that the results of social assessments seem particularly contingent given the high levels of uncertainty and the nascent state of the technology itself, this is no less true for economic and environmental contexts. So the large gap in practice seems to have as much to do with a bias toward the quantifiable in assessment methods as with the intrinsic complexities of the domains, and increasing our capabilities in the social assessment of technologies is a clear challenge to future researchers.
Moreover, existing technology assessment tools tend to have specific disciplinary foci and a resulting set of biases: industrial ecology and its toolbox tend to emphasize environmental perspectives; life-cycle accounting methodologies and related tools focus on economic issues. All too frequently, biases in the evaluation of complex systems such as emerging technologies are introduced not intentionally, but because of limits in the tools available to analyze such systems and the lack of robust integrative analytical frameworks that can not only place quantitative results in proper perspective but also identify substantive gaps in the evaluation process. Thus, for example, reliance on an environmental tool such as life-cycle assessment will produce quantitative results that, however uncertain, can bias decisionmakers toward the prioritization of environmental values over others, simply because decisions tend to reflect available information, especially if that information is quantitative and therefore appears robust and definitive. A further research challenge is therefore to provide an integrated framework for technology assessment across disciplinary domains.
Such an integrated approach must start with good cases, and part of our purpose here is to present factory meat as an example of the type of nascent technology that can provide a rich source of scenarios for exploring future societal transformations, with an eye toward understanding not just their particular implications but the broader lessons they can help teach about adaptation to technological evolution. A world where meat comes mostly from factories instead of ranches and feedlots might be a world better able to deal with challenges of food security, the environment, and natural resources, but at this point such a future is hypothetical. We may be only one in vitro hamburger into the age of factory meat, but it is not too early to begin exploring the implications of this potentially transformational technology, both to support more agile and effective responses to unexpected emerging consequences of a potentially radical shift in our food production system and to provide a model case study for how to approach, understand, and manage other emerging technologies, now and in the future.