Kahneman Says “Intuitive Thinking” Is “the Origin of Most of What We Do Right–Which Is Most of What We Do”

(p. 415) The investment of attention improves performance in numerous activities–think of the risks of driving through a narrow space while your mind is wandering–and is essential to some tasks, including comparison, choice, and ordered reasoning. However, System 2 is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access. We do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. Often we make mistakes because we (our System 2) do not know any better.
I have spent more time describing System 1, and have devoted many (p. 416) pages to errors of intuitive judgment and choice that I attribute to it. However, the relative number of pages is a poor indicator of the balance between the marvels and the flaws of intuitive thinking. System 1 is indeed the origin of much that we do wrong, but it is also the origin of most of what we do right–which is most of what we do. Our thoughts and actions are routinely guided by System 1 and generally are on the mark. One of the marvels is the rich and detailed model of our world that is maintained in associative memory: it distinguishes surprising from normal events in a fraction of a second, immediately generates an idea of what was expected instead of a surprise, and automatically searches for some causal interpretation of surprises and of events as they take place.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

The Precautionary Principle Would Have Blocked Many Great Innovations

(p. 351) The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk. This trend is especially strong in Europe where the precautionary principle, which prohibits any action that might cause harm, is a widely accepted doctrine. In the regulatory context, the precautionary principle imposes the entire burden of proving safety on anyone who undertakes actions that might harm people or the environment. Multiple international bodies have specified that the absence of scientific evidence of potential damage is not sufficient justification for taking risks. As the jurist Cass Sunstein points out, the precautionary principle is costly, and when interpreted strictly it can be paralyzing. He mentions an impressive list of innovations that would not have passed the test, including “airplanes, air conditioning, antibiotics, automobiles, chlorine, the measles vaccine, open-heart surgery, radio, refrigeration, smallpox vaccine, and X-rays.” The strong version of the precautionary principle is obviously untenable. But enhanced loss aversion is embedded in a strong and widely shared moral intuition; it originates in System 1. The dilemma between intensely loss-averse moral attitudes and efficient risk management does not have a simple and compelling solution.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: italics in original.)

Sunk-Cost Fallacy “Can Be Overcome”

(p. 346) The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one. Fortunately, research suggests that at least in some contexts the fallacy can be overcome. The sunk-cost fallacy is identified and taught as a mistake in both economics and business courses, apparently to good effect: there is evidence that graduate students in these fields are more willing than others to walk away from a failing project.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Kahneman Preaches that People Can and Should Act More Rationally

(p. 338) . . . I have a sermon ready for Sam if he rejects the offer of a single highly favorable gamble played once, and for you if you share his unreasonable aversion to losses:

I sympathize with your aversion to losing any gamble, but it is costing you a lot of money. Please consider this question: Are you on your deathbed? Is this the last offer of a small favorable gamble that you will ever consider? Of course, you are unlikely to be offered exactly this gamble again, but you will have many opportunities to consider attractive gambles with stakes that are very small relative to your wealth. You will do yourself a large financial favor if you are able to see each of these gambles as part of a bundle of small gambles and rehearse the mantra that will get you significantly closer to economic rationality: you win a few, you lose a few. The main purpose of the mantra is to control your emotional response when you do lose. If you can trust it to be effective, you should remind yourself of it when deciding whether or not to accept a small risk with positive expected value. Remember these qualifications when using the mantra:

  • It works when the gambles are genuinely independent of each other; it does not apply to multiple investments in the same industry, which would all go bad together.

(p. 339)

  • It works only when the possible loss does not cause you to worry about your total wealth. If you would take the loss as significant bad news about your economic future, watch it!
  • It should not be applied to long shots, where the probability of winning is very small for each bet.

If you have the emotional discipline that this rule requires, you will never consider a small gamble in isolation or be loss averse for a small gamble until you are actually on your deathbed and not even then.

This advice is not impossible to follow. Experienced traders in financial markets live by it every day, shielding themselves from the pain of losses by broad framing. As was mentioned earlier, we now know that experimental subjects could be almost cured of their loss aversion (in a particular context) by inducing them to “think like a trader,” just as experienced baseball card traders are not as susceptible to the endowment effect as novices are. Students made risky decisions (to accept or reject gambles in which they could lose) under different instructions. In the narrow-framing condition, they were told to “make each decision as if it were the only one” and to accept their emotions. The instructions for broad framing of a decision included the phrases “imagine yourself as a trader,” “you do this all the time,” and “treat it as one of many monetary decisions, which will sum together to produce a ‘portfolio’.” The experimenters assessed the subjects’ emotional response to gains and losses by physiological measures, including changes in the electrical conductance of the skin that are used in lie detection. As expected, broad framing blunted the emotional reaction to losses and increased the willingness to take risks.
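The arithmetic behind the "bundle of small gambles" mantra is easy to check by simulation. The sketch below uses illustrative stakes that are not specified in the excerpt (a 50/50 gamble to win $200 or lose $100): taken once, such a gamble loses half the time, but bundled into a portfolio of 100 independent plays it almost never produces a net loss.

```python
import random

def prob_net_loss(n_gambles, trials=100_000, seed=0):
    """Estimate the probability of a net loss when n_gambles independent
    50/50 gambles (win $200, lose $100) are treated as one portfolio."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        total = sum(200 if rng.random() < 0.5 else -100
                    for _ in range(n_gambles))
        if total < 0:
            losses += 1
    return losses / trials

# Narrow framing evaluates each gamble alone; broad framing looks at the sum.
print(prob_net_loss(1))    # roughly 0.5: a single gamble loses half the time
print(prob_net_loss(100))  # well under 1%: the portfolio rarely ends in a loss
```

Note the first qualification in the sermon: the calculation holds only because the gambles are independent, so wins and losses tend to offset one another across the bundle.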

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipsis added; italics in original.)

Reference Point Ignored Due to “Theory-Induced Blindness”

(p. 290) The omission of the reference point from the indifference map is a surprising case of theory-induced blindness, because we so often encounter cases in which the reference point obviously matters. In labor negotiations, it is well understood by both sides that the reference point is the existing contract and that the negotiations will focus on mutual demands for concessions relative to that reference point. The role of loss aversion in bargaining is also well understood: making concessions hurts. You have much (p. 291) personal experience of the role of the reference point. If you changed jobs or locations, or even considered such a change, you surely remember that the features of the new place were coded as pluses or minuses relative to where you were. You may also have noticed that disadvantages loomed larger than advantages in this evaluation–loss aversion was at work. It is difficult to accept changes for the worse. For example, the minimal wage that unemployed workers would accept for new employment averages 90% of their previous wage, and it drops by less than 10% over a period of one year.
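The asymmetry described here–disadvantages looming larger than advantages–can be made concrete with the prospect-theory value function. The parameters below (curvature 0.88, loss-aversion coefficient 2.25) come from Tversky and Kahneman's 1992 estimates, not from this excerpt, so treat the exact numbers as an illustration:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of a gain or loss x, measured relative to the
    reference point (parameters from Tversky & Kahneman, 1992)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Losses loom larger than gains: a $100 loss hurts more than
# a $100 gain pleases, by roughly a factor of two.
print(value(100))   # about  57.5
print(value(-100))  # about -129.5
```

With these parameters a worker who frames a new wage offer relative to the old contract experiences a 10% pay cut as far more painful than a 10% raise is pleasant, which is consistent with the sticky reservation wages reported in the passage.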

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Kahneman Grants that “the Basic Concepts of Economics Are Essential Intellectual Tools”

(p. 286) Most graduate students in economics have heard about prospect theory and loss aversion, but you are unlikely to find these terms in the index of an introductory text in economics. I am sometimes pained by this omission, but in fact it is quite reasonable, because of the central role of rationality in basic economic theory. The standard concepts and results that undergraduates are taught are most easily explained by assuming that Econs do not make foolish mistakes. This assumption is truly necessary, and it would be undermined by introducing the Humans of prospect theory, whose evaluations of outcomes are unreasonably short-sighted.
There are good reasons for keeping prospect theory out of introductory texts. The basic concepts of economics are essential intellectual tools, which are not easy to grasp even with simplified and unrealistic assumptions about the nature of the economic agents who interact in markets. Raising questions about these assumptions even as they are introduced would be confusing, and perhaps demoralizing. It is reasonable to put priority on helping students acquire the basic tools of the discipline. Furthermore, the failure of rationality that is built into prospect theory is often irrelevant to the predictions of economic theory, which work out with great precision in some situations and provide good approximations in many others. In some contexts, however, the difference becomes significant: the Humans described by prospect theory are (p. 287) guided by the immediate emotional impact of gains and losses, not by long-term prospects of wealth and global utility.
I emphasized theory-induced blindness in my discussion of flaws in Bernoulli’s model that remained unquestioned for more than two centuries. But of course theory-induced blindness is not restricted to expected utility theory. Prospect theory has flaws of its own, and theory-induced blindness to these flaws has contributed to its acceptance as the main alternative to utility theory.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Sticking with Expected Utility Theory as an Example of “Theory-Induced Blindness”

(p. 286) Perhaps carried away by their enthusiasm, [Rabin and Thaler] . . . concluded their article by recalling the famous Monty Python sketch in which a frustrated customer attempts to return a dead parrot to a pet store. The customer uses a long series of phrases to describe the state of the bird, culminating in “this is an ex-parrot.” Rabin and Thaler went on to say that “it is time for economists to recognize that expected utility is an ex-hypothesis.” Many economists saw this flippant statement as little short of blasphemy. However, the theory-induced blindness of accepting the utility of wealth as an explanation of attitudes to small losses is a legitimate target for humorous comment.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: bracketed names and ellipsis added.)

A Marshmallow Now or an Elegant French Pastry Four Years Later

Source of book image: http://images.amazon.com/images/G/01/richmedia/images/cover.gif

(p. 19) Growing up in the erratic care of a feckless single mother, “Kewauna seemed able to ignore the day-to-day indignities of life in poverty on the South Side and instead stay focused on her vision of a more successful future.” Kewauna tells Tough, “I always wanted to be one of those business ladies walking downtown with my briefcase, everybody saying, ‘Hi, Miss Lerma!’”

Here, as throughout the book, Tough nimbly combines his own reporting with the findings of scientists. He describes, for example, the famous “marshmallow experiment” of the psychologist Walter Mischel, whose studies, starting in the late 1960s, found that children who mustered the self-control to resist eating a marshmallow right away in return for two marshmallows later on did better in school and were more successful as adults.
“What was most remarkable to me about Kewauna was that she was able to marshal her prodigious noncognitive capacity — call it grit, conscientiousness, resilience or the ability to delay gratification — all for a distant prize that was, for her, almost entirely theoretical,” Tough observes of his young subject, who gets into college and works hard once she’s there. “She didn’t actually know any business ladies with briefcases downtown; she didn’t even know any college graduates except her teachers. It was as if Kewauna were taking part in an extended, high-stakes version of Walter Mischel’s marshmallow experiment, except in this case, the choice on offer was that she could have one marshmallow now or she could work really hard for four years, constantly scrimping and saving, staying up all night, struggling, sacrificing — and then get, not two marshmallows, but some kind of elegant French pastry she’d only vaguely heard of, like a napoleon. And Kewauna, miraculously, opted for the napoleon, even though she’d never tasted one before and didn’t know anyone who had. She just had faith that it was going to be delicious.”

For the full review, see:
ANNIE MURPHY PAUL. “School of Hard Knocks.” The New York Times Book Review (Sun., August 26, 2012): 19.
(Note: the online version of the article is dated August 23, 2012.)

The full reference for the book under review is:
Tough, Paul. How Children Succeed: Grit, Curiosity, and the Hidden Power of Character. Boston, MA: Houghton Mifflin Harcourt, 2012.

“Theory-Induced Blindness”

(p. 276) The mystery is how a conception of the utility of outcomes that is vulnerable to . . . obvious counterexamples survived for so long. I can explain (p. 277) it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it. . . . As the psychologist Daniel Gilbert observed, disbelieving is hard work, and System 2 is easily tired.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipses added.)

Premortem Reduces Bias from Uncritical Optimism

(p. 265) As a team converges on a decision–and especially when the leader tips her hand–public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty to the team and its leaders. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier. The premortem is not a panacea and does not provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of WYSIATI and uncritical optimism.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.