The Limits of Knowledge: Personal and Public

Human beings and governments typically make irrational decisions. Taking this into account in personal planning and in policymaking can yield better results.

One of the most basic assumptions underlying much of Western thinking is that individuals are rational beings, able to form judgments based on empirical information and logical deliberations in their quest for a course of action most suited to advancing their goals. This is assumed to be true for personal choices and for societal ones—that is, for public policies. A common narrative is that people used to be swayed by myths, folktales, and rituals (with religion sometimes added in), but the Enlightenment ushered in the Age of Reason, in which we are increasingly freed from traditional beliefs and instead rely on the findings of science. Progress is hence in the cards, driven by evidence. This assumption was first applied to nature, as we learned to crack its codes and employ its resources. For the past 200 years or so, it has also been applied to society. We no longer take society for granted as something to which we have to adapt, but we seek to remake it in line with our designs. For many people, this means such things as improving relations among the races, reducing income inequalities, and redefining marriage, among other actions.

Economics, by far the most influential social science, has strongly supported the assumption of rationality. It sees individuals as people who have preferences and seek to choose among alternative purchases, careers, investments, and other options in ways that best “maximize” whatever they desire. This assumption has also come to be shared by major segments of other social sciences, including not just significant parts of political science (for instance, in the view that voters make rational choices) and sociology (people date to improve their status), but even law (laws are viewed as restructuring incentives) and history (changes in the organization of institutions can be explained in terms of the rational interests of individuals seeking to structure the world so as to maximize net benefits).

But this message is being upended by insights from the relatively new field of behavioral economics, which has demonstrated beyond reasonable doubt that people are unable to act rationally and are hardwired to make erroneous judgments that even specialized training cannot correct. Being created by people, governments have similar traits that spell trouble for rational policymaking and the progress that is supposed to follow. Still, a closer examination suggests that the findings of behavioral economics are not so much a reason for despair as an indication of the need for a rather different strategy. Once we fully accept our intellectual limitations, we can improve our personal decisionmaking as well as our public policies.

Scientific sea change

Some segments of social science never really bought into the progress and rationality assumption. Oswald Spengler, a German philosopher and mathematician best known for his book The Decline of the West, published in two volumes in 1918 and 1922 (and in English translation in 1926 and 1928), held that history is basically running in circles, repeating itself rather than marching forward. Social psychologists showed that people can be made to see things differently, even such “obvious” things as the length of lines, if other people around them take different positions. Psychologists demonstrated that we are driven by motives that lurk in our subconscious, which we neither understand nor control. Sociologists found that billions of people in many parts of the world continue to be swayed by old beliefs. However, the voices of these social scientists were long muted, especially in the public realm.

Different reasons may explain why those who might be called the “rationalist” social scientists drowned out the “nonrationalist” ones. These reasons include the can-do attitude generated by major breakthroughs in the natural sciences, the vanquishing of major diseases, and strong economic growth. Progress—driven by reason, rational decisionmaking, and above all, science—seemed self-evident. The fact that the rationalist social sciences used mathematical models and had the appearance of physics, while the nonrationalist ones drew more on narratives and qualitative data, also benefited rationalist social scientists.

Behavioral economics began to come into its own as doubts increased about society’s ability to vanquish the “remaining” diseases (witness the long war on cancer) and ensure economic progress, and as we became more aware of the challenges that science and technology pose. Above all, behavioral economics assembled a very robust body of data, much of it based on experiments. Recently, behavioral economics caught the attention of policymakers and the media, especially after its widely recognized leading scholar, Daniel Kahneman, was awarded the 2002 Nobel Prize in economics, the queen of rationalistic sciences, despite the fact that his training and research were in psychology.

Because the main findings of behavioral economics have become rather familiar, it is necessary to review them only briefly. The essential finding is that human beings are not able to make rational decisions. They misread information and draw inappropriate or logically unwarranted conclusions from it. Their failings come in two different forms. One takes place when we think fast. In his book Thinking, Fast and Slow, Kahneman called this System 1 thinking—thinking based on intuition. For instance, when we ask what two plus two equals, no processing of information and no deliberations are involved. The answer jumps out at us. However, when we engage in slow, or System 2, thinking, which we are reluctant to do because it is demanding, laborious, and costly, we often fail. In short, we are not rational thinkers.

In seeking to explain individuals’ real-life choices, in contrast to the optimal decisionmaking that they often fail to perform, Kahneman and Amos Tversky, a frequent collaborator, developed “prospect theory,” which has three major bundles of findings.

First, individuals’ evaluations are made with respect to a reference point, which Kahneman defines as an “earlier state relative to which gains and losses are evaluated.” When it comes to housing transactions, for example, many people use the purchase price of their house as the reference point, and they are less likely to sell a house that has lost value than one that has appreciated in value, disregarding changes in the conditions of the market.

The second major element is that evaluations of changes are subject to the principle of diminishing sensitivity. For example, the difference between $900 and $1,000 feels subjectively smaller than that between $100 and $200, even though both differences are the same $100. This principle helps to explain why most individuals would prefer to take a 50% chance of losing $1,000 rather than accept a sure $500 loss: the pain of losing $500 is more than 50% of the pain of losing $1,000.

The third element is that individuals tend to exhibit strong loss aversion, with losses looming larger in their calculations than gains. For example, most people would not gamble on a coin toss in which they would lose $100 on tails and win $125 on heads. It is estimated that the “loss-aversion ratio” for most people is roughly between 1.5 and 2.5, so they would need to be offered about $200 on heads to take the bet.
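These regularities can be summarized compactly. The sketch below is my own rough illustration, not anything presented in this form in the prospect theory literature cited here; it assumes the value function and the 1992 parameter estimates of Tversky and Kahneman (an exponent of 0.88 and a loss-aversion ratio of 2.25), which do not appear in this essay.

```python
def v(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex for
    losses, and steeper for losses than for gains (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Diminishing sensitivity: the same $100 gap matters less higher up.
print(v(1000) - v(900) < v(200) - v(100))   # True

# Hence risk seeking in losses: a 50% chance of losing $1,000 is
# subjectively less painful than a sure loss of $500.
print(0.5 * v(-1000) > v(-500))             # True (less negative)

# Loss aversion: the $125-heads, $100-tails coin toss has negative
# subjective value and is declined...
print(0.5 * v(125) + 0.5 * v(-100))         # about -30

# ...whereas with a loss-aversion ratio of 2 and roughly linear value
# at small stakes, a $200 win on heads just breaks even.
print(0.5 * v(200, alpha=1.0, lam=2.0)
      + 0.5 * v(-100, alpha=1.0, lam=2.0))  # 0.0
```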

Proof repeated

Replication is considered an essential requirement of robust science. However, in social science research this criterion is not often met. Hence, it is a notable achievement of behavioral economics that its key findings have often been replicated. For instance, Kahneman and Tversky found that responses to an obscure question (for example, what percentage of African nations are members of the United Nations) were systematically influenced by something that one would not expect people to be affected by if they were thinking rationally, namely a random number that had been generated in front of them. When a big number was generated, the subjects’ responses were larger on average than when a small number was generated. This finding indicates that the perception of an initial value, even one unrelated to the matter at hand, affects the final judgments of the participants, an irrational connection.

The effect demonstrated by this experiment has been replicated with a variety of stimuli and subjects. For instance, Karen Jacowitz and Kahneman found that subjects’ estimates of a city’s population could be systematically influenced by an “anchoring” question: Estimates were higher when subjects were asked to consider whether the city in question had at least 5 million people and were lower when subjects were instead asked whether the city had at least 200,000 people. J. Edward Russo and Paul Schoemaker further demonstrated this effect, finding that when asked to estimate the date that Attila the Hun was defeated in Europe, subjects’ answers were influenced by an initial anchor constructed from their phone numbers. Also, Drazen Prelec, Dan Ariely, and George Loewenstein found that when subjects wrote down the last two digits of their Social Security numbers next to a list of items up for auction, those with the highest numbers were willing to bid three times as much on average as those with the lowest.
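Jacowitz and Kahneman also proposed a simple way to quantify such effects: an anchoring index, the shift in estimates expressed as a fraction of the shift in anchors. The figures in the sketch below are hypothetical, chosen only to show the computation; they are not results from their study.

```python
def anchoring_index(high_estimate, low_estimate, high_anchor, low_anchor):
    """Anchoring index: 0 means the anchors had no effect; 1 means
    estimates moved one-for-one with the anchors."""
    return (high_estimate - low_estimate) / (high_anchor - low_anchor)

# Hypothetical median population guesses of 2,500,000 under the
# 5,000,000 anchor and 700,000 under the 200,000 anchor:
print(anchoring_index(2_500_000, 700_000, 5_000_000, 200_000))  # 0.375
```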

Other studies have repeatedly replicated another phenomenon observed by behavioral economists, known as the “endowment effect”: people place a higher value on goods they own than on identical ones they do not. For example, Kahneman, Jack Knetsch, and Richard Thaler found that when half the students in a room were given mugs, and those with mugs were then invited to sell them and those without were invited to buy them, those with mugs demanded roughly twice as much to part with them as others were willing to pay for them. Similarly, Robert Franciosi and colleagues found that when subjects could trade mugs for cash and vice versa, those endowed with mugs were less willing to trade than standard economic theory would predict.

True in real life

Many behavioral economics studies are conducted as experiments under laboratory conditions. This method is preferred by scientists because it allows extraneous variables to be controlled. However, extensive reliance on lab studies has led some critics to suggest that behavioral economics’ key findings may apply only, or at least much more strongly, under the artificial conditions of the lab and not in the field (that is, in real life).

Recent work in behavioral economics, however, has shown that its findings do hold outside of the lab. For instance, a study by Brigitte Madrian and Dennis Shea illustrates how the “status quo bias,” a bias frequently documented by behavioral economists, shapes employee decisions on whether to participate in 401(k) retirement savings programs. Because of this bias, many millions of individuals do not contribute to these savings programs, even though the contributions are clearly in their self-interest. In another experiment conducted in the field, Uri Gneezy and Aldo Rustichini found that neoclassical expectations regarding incentives and punishments did not predict the behavior of parents at daycare centers in Israel. When Israeli daycare centers struggling with the problem of parents arriving after closing time to pick up their children introduced a fine of 10 shekels to discourage lateness, the number of parents arriving late actually increased—an example of nonrational economic behavior in action.

Shlomo Benartzi, Alessandro Previtero, and Richard Thaler studied what economists call the “annuity puzzle,” or the tendency of people to forgo annuitizing their wealth when they retire, even though doing so would assure them of more annual income for the rest of their lives and reduce their risk of outliving their retirement savings. In a survey of 450 401(k) retirement plans, only 6% of participants chose an annuity when it was available.

Resistant mistakes

Behavioral economics provides little solace for those who believe in progress. Data show that education and training do not help people overcome their cognitive limitations. For example, 85% of doctoral students in the decision science program at the Stanford Graduate School of Business, who had extensive training in statistics, still made basic mistakes in combining two probabilities. Studies also have shown that even people specifically alerted to their cognitive blinders are still affected by them in their deliberations.
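The mistake in question is the conjunction fallacy that Tversky and Kahneman documented with their famous “Linda” problem: judging the joint occurrence of two events to be more probable than one of the events alone, even though for any two events

    Pr(A and B) = Pr(A) × Pr(B | A) ≤ min{Pr(A), Pr(B)}.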

My own work shows that decisionmaking is often nonrational not only because of people’s cognitive limitations, but also because their choices are affected by their values and emotions. Thus, whereas from an economic viewpoint a poor devout Muslim or Jew should purchase pork if it costs much less than other sources of protein, this is not an option these decisionmakers consider. The option is blocked out for them a priori by their beliefs. As I see it, this is neither slow nor fast thinking, but not thinking. The same holds for numerous other decisions, such as whether to sell oneself for sex, spy for a foreign power, or move to a distant place. True, if the price differential is very high, some people will not heed their beliefs. However, some will honor them at any price, up to giving up their lives. What is particularly relevant for decisionmaking theory is that most individuals in this group will not even consider the option, and those who do violate their beliefs will feel guilty, which will often lead them to act irrationally in one way or another.

Emotions rather than reasoning also significantly affect individuals’ political beliefs and behavior. For example, when people in the United States were asked in a Washington Post–ABC News poll whether President Barack Obama can do anything to lower gas prices, roughly two-thirds of Republicans said he can, whereas two-thirds of Democrats said that he cannot. When George W. Bush was in the White House, and the same question was asked, these numbers were reversed. Citizens thus tend to privilege their political loyalties over the facts, even flip-flopping their views when loyalty demands it.

Policymakers, who make decisions based not merely on their individual intellectual capacities and beliefs but who also benefit from the work of their staffs, nevertheless often devise or follow policies that disregard major facts. For example, policymakers have supported austerity programs to reduce deficits when economies are slowing down, instead of adding stimulus and committing to reduce deficits later, as most economic studies would suggest. They have repeatedly engaged in attempts to build democratic governments by running elections in places, such as Afghanistan, where the other elements essential for building such governments are missing. And they have assumed that self-regulation will work even when those who need to be restrained have strong motives to act against the public interest and their own long-term interest. It may seem a vast overstatement, until one looks around, to say that most public policies fall far short of the goals they set out for themselves, cost much more than expected, and have undesirable and unexpected side effects. We seem to have as much difficulty making rational public policies as we do making personal ones.

Adapting to limits and failings

The findings of behavioral economics have led to some adaptations in the rationalist models. For instance, economics no longer assumes that information is instantly absorbed without any costs (an adaptation that arguably preceded behavioral economics and was not necessarily driven by it). Thus, it is now considered rational if someone in the market for a specific car stops comparative shopping after visiting, say, three places, because spending more time looking around is held to “cost” more than the additional benefit from finding a somewhat lower price. Aside from such modifications in the rationalist models, behavioral economics has had some effects on ways in which public policies are formed.
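A minimal sketch of such a stopping rule, with illustrative numbers for a car purchase (none of them from this essay): keep soliciting quotes only while the expected saving from one more quote exceeds the cost of another visit.

```python
import random

def expected_savings(best, lo, hi):
    """Expected saving from one more price quote drawn uniformly from
    [lo, hi], given the best price seen so far."""
    if best <= lo:
        return 0.0
    return (best - lo) ** 2 / (2 * (hi - lo))

def shop(lo=18_000, hi=22_000, visit_cost=150, seed=1):
    random.seed(seed)
    best = random.uniform(lo, hi)  # quote from the first dealer
    visits = 1
    # "Rational" stopping: search again only while the expected saving
    # from one more quote exceeds the cost of the extra visit.
    while expected_savings(best, lo, hi) > visit_cost:
        best = min(best, random.uniform(lo, hi))
        visits += 1
    return visits, round(best)

print(shop())  # typically stops after a handful of visits
```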

Richard Thaler, a professor at the University of Chicago, is a highly regarded behavioral economist. In his influential book Nudge: Improving Decisions about Health, Wealth, and Happiness (coauthored with Cass Sunstein), he argued that people do not make decisions in a vacuum, based on their own analysis of the information and in line with their preferences. They inevitably act within an environment that affects their processing of information and their decisionmaking. For instance, if an employer offers its workers health insurance and a choice between two programs, they are not going to analyze or seek out many others. They are a bit more likely to do so if the employer will reimburse them in part for the costs of choosing a program other than the ones offered by their workplace.

Thaler hence suggests restructuring “external” factors so as to ease and improve the decisionmaking processes of people, whether they are consumers, workers, patients, or voters. His most often–cited example is signing people up for a 401(k) retirement program but allowing them to opt out rather than asking them if they want to opt in. This policy is directly based on the behavioral economics finding that people do not act in their best interest, which would be to sign up for a pension program as soon as possible. Due largely to Thaler’s influence, Great Britain will be implementing legislation in late 2012 that will change the default option for corporate pension funds, with employees being automatically enrolled unless they elect to opt out.

Thaler called this restructuring “nudging,” because this approach, unlike traditional regulations, does not force anybody to toe a line, but merely encourages people to do what is considered rational, without their having to perform the analysis themselves. Thaler noted up front, and critics have stressed, that this approach will work well only as long as those who nudge have the interests of those who are being nudged at heart.

The other author of Nudge, Cass Sunstein, has been called “the nudgemeister.” President Obama appointed him to head the Office of Information and Regulatory Affairs in the White House. Sunstein has been working to remove regulations that are unnecessary, obsolete, or unduly burdensome and to foster new ones. One of his main achievements based on behavioral economics has been to simplify the information released to the public, to take into account individuals’ limited capacity to digest data. This was achieved most visibly in the redesigned dietary recommendations and fuel-efficiency stickers for cars.

Stumbling forward

As Kahneman, who, among other posts, is a Senior Scholar at the Woodrow Wilson School of Public and International Affairs at Princeton University, explained in a personal correspondence, the reason why behavioral economics has not taken over is that “at this point there is no behavioral macroeconomics, no forecasting models based on behavioral foundations, etc. It is probably too early to conclude that these achievements are impossible. In any event, I think it is fair to [say] that behavioral approaches have prevailed wherever they have competed—but they have not competed in many central domains of economics, and the standard model remains dominant by default. It turns out to be extremely difficult to do good economics with more complex assumptions, although steady progress is being made.”

As I see it, behavioral economics suggests that we need a radical change in our approach to personal and collective decisionmaking, an intellectual shift of a Copernican magnitude. I can here merely illustrate the contours of a much less demanding form of decisionmaking. The basic approach turns the rationalistic assumptions on their head; it takes as a starting point that people are unable to gain and process all the relevant information, and they are unable to draw logical conclusions from the data they do command; in other words, that the default is nonrational decisionmaking. It assumes that given the complexity of the social world, we must move forward not like people equipped with powerful headlights on a night brightly lit by a full moon, but like people who stumble forward in a dark cave, with a two-volt flashlight: very carefully and always ready to change course.

If they follow my line of thinking, nonrationalists will assume that they are likely to make the wrong choice, and hence they will seek to provide for as many opportunities as possible to adapt course as a project unfolds and more information becomes available, and to make as few irrevocable commitments as possible at the starting point. A simple example: If you are building a house, do not sign off on the architect’s plans as final, but insist that you be allowed to make changes, because as the project proceeds you find out what digging the foundations reveals, the cost of some materials rises unexpectedly while that of others falls, new ideas occur to you, and so on. In other words, we are better adapted to our limitations if we can fracture our decisions and stretch them out over time rather than front-load them (which is one major reason for the common failure of long-term, especially central, planning).
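A stylized illustration of why staging helps, with made-up numbers of my own: a project that pays 300 with probability one-half and costs 200 in total has negative expected value if the full cost must be committed up front, but positive expected value if a 50-unit first stage reveals whether the project is viable before the remaining 150 is committed.

```python
def upfront(cost, payoff, p):
    """Commit the full cost at the start; no chance to adapt."""
    return p * payoff - cost

def staged(stage1, stage2, payoff, p):
    """Spend stage1, learn whether the project is viable, and commit
    stage2 only in the good case."""
    return p * (payoff - stage2) - stage1

print(upfront(200, 300, 0.5))     # -50.0: reject if forced to front-load
print(staged(50, 150, 300, 0.5))  #  25.0: flexibility flips the sign
```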

The less we know, I suggest, the larger the reserves we need. (It should be noted that our inability to know and to process what we know is smaller in some areas than in others, such as in dealing with infectious diseases versus mental illness.) We should expect unexpected difficulties to arise and retain uncommitted resources to deal with these difficulties. This holds for militaries as well as for people who start a new business and for almost everyone else.

The less we know, the more we should hedge. Studies of investment have long shown that people achieve better results if, rather than trying to determine which investment instrument will do best and investing in that one, they divide their investments among various instruments. As the U.S. Securities and Exchange Commission has noted in a “beginners’ guide” to investing: “Historically, the returns of the three major asset categories [stocks, bonds, and cash] [show that by] investing in more than one asset category, you’ll reduce the risk that you’ll lose money and your portfolio’s overall investment returns will have a smoother ride.” The less we know, the more we should not merely hedge, but hedge more widely; for instance, by not merely distributing our investments among different asset categories (with some financial advisers recommending investing in real estate as well as stocks and bonds) but also within each category (investing in at least a dozen stocks rather than just four or five). The same concept applies to decisions by military planners (who should not rely, for example, on one new type of fighter airplane) and to decisions by economic planners (who should not rely on choosing winners and losers, a process otherwise known as “industrial policy”).
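The arithmetic behind this advice comes from standard portfolio theory rather than from behavioral economics itself, but it shows why a dozen holdings beat four or five. The volatility and correlation figures below are illustrative assumptions, not data from any study cited here.

```python
import math

def portfolio_std(sigma, n, rho):
    """Volatility of an equal-weight portfolio of n assets, each with
    volatility sigma and pairwise correlation rho."""
    variance = sigma ** 2 / n + (1 - 1 / n) * rho * sigma ** 2
    return math.sqrt(variance)

# Assumed 30% volatility per stock and 0.2 pairwise correlation:
for n in (1, 4, 12, 50):
    print(n, round(portfolio_std(0.30, n, 0.2), 3))
# 1 0.3, 4 0.19, 12 0.155, 50 0.139 -- most of the gain arrives by a
# dozen or so holdings; only market-wide risk remains after that.
```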

An important element of the nonrational approach to policymaking is to acknowledge our limitations and not to overpromise results. In order to gain capital from venture funds, credit from banks, or appropriations from legislatures, those who seek support often portray their expected gains in the most optimistic way possible. If the preceding analysis is correct, such overselling leads to disappointments and resentments when the new program to improve, say, the reading or science scores of the nation’s high-school students does not pan out and we must redesign the program and seek more resources. All of the parties involved would be better off if such new programs were framed from the start as experiments. Their need to be adapted should be understood not as a sign of failure, but as the only way to make progress, such as it is, in the social world: slowly, one step back for every one or two steps forward, at greater costs and sacrifices than expected, with fewer results.

All this looks grim only if one ignores the lessons of behavioral economics. If one truly takes them in, rather than assuming that we are born with wings ready to fly, we shall learn to do as well as human beings can in a world in which we must learn to crawl before we can run.

Recommended Reading

  • Dan Ariely, Predictably Irrational: The Hidden Forces that Shape Our Decisions (New York, NY: Harper Collins, 2008).
  • Shlomo Benartzi, Alessandro Previtero, and Richard Thaler, “Annuitization Puzzles,” Journal of Economic Perspectives 25, no. 4 (2011): 143–164.
  • Amitai Etzioni, The Moral Dimension: Toward a New Economics (New York, NY: The Free Press, 1988).
  • Daniel Kahneman, Thinking, Fast and Slow (New York, NY: Farrar, Straus and Giroux, 2011).
  • Amos Tversky and Daniel Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science 185, no. 4157 (1974): 1124–1131.
  • Steven Levitt and John List, “Homo Economicus Evolves,” Science 319, no. 5865 (2008): 909–910.
  • George Loewenstein, “Out of Control: Visceral Influences on Behavior,” in Advances in Behavioral Economics, eds. Colin F. Camerer, George Loewenstein, and Matthew Rabin (Princeton, NJ: Princeton University Press, 2004), 689–723.
  • Wolfgang Pesendorfer, “Behavioral Economics Comes of Age: A Review Essay on Advances in Behavioral Economics,” Journal of Economic Literature 44, no. 3 (2006): 712–721.
  • Scott Plous, “Thinking the Unthinkable: The Effect of Anchoring on Likelihood Estimates of Nuclear War,” Journal of Applied Social Psychology 19 (1989): 67–91.
  • J. Edward Russo and Paul Schoemaker, Decision Traps (New York, NY: Simon and Schuster, 1989).
  • Richard Thaler and Cass Sunstein, Nudge: Improving Decisions About Health, Wealth, and Happiness (New York, NY: Penguin, 2009).

Cite this Article

Etzioni, Amitai. “The Limits of Knowledge: Personal and Public.” Issues in Science and Technology 29, no. 1 (Fall 2012).
