Simple Algorithms Predict Better than Trained Experts

(p. 222) I never met Meehl, but he was one of my heroes from the time I read his Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence.
In the slim volume that he later called “my disturbing little book,” Meehl reviewed the results of 20 studies that had analyzed whether clinical predictions based on the subjective impressions of trained professionals were more accurate than statistical predictions made by combining a few scores or ratings according to a rule. In a typical study, trained counselors predicted the grades of freshmen at the end of the school year. The counselors interviewed each student for forty-five minutes. They also had access to high school grades, several aptitude tests, and a four-page personal statement. The statistical algorithm used only a fraction of this information: high school grades and one aptitude test. Nevertheless, the formula was more accurate than 11 of the 14 counselors. Meehl reported generally sim-(p. 223)ilar results across a variety of other forecast outcomes, including violations of parole, success in pilot training, and criminal recidivism.
Not surprisingly, Meehl’s book provoked shock and disbelief among clinical psychologists, and the controversy it started has engendered a stream of research that is still flowing today, more than fifty years after its publication. The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: italics in original.)
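
The flavor of the statistical rule Meehl described is easy to convey in code. Below is a minimal sketch in Python; the weights and the rescaling are illustrative placeholders, not numbers from any study Meehl reviewed:

    def predict_freshman_gpa(hs_gpa, aptitude_score):
        """A Meehl-style statistical rule: a fixed, mechanical combination
        of just two predictors. In practice the weights would be fit by
        regression on past student records; these are made up."""
        return 0.6 * hs_gpa + 0.4 * (aptitude_score / 25.0)

    # The "clinical" alternative weighs interviews, essays, and test scores
    # impressionistically. Meehl's finding: simple rules like this one
    # usually match or beat that judgment.
    print(predict_freshman_gpa(hs_gpa=3.5, aptitude_score=80))

The point is not the particular weights but the mechanical consistency: the rule applies the same weights to every student, which is precisely what human judges fail to do.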

The Illusion that Investment Advisers Have Skill

(p. 215) Some years ago I had an unusual opportunity to examine the illusion of financial skill up close. I had been invited to speak to a group of investment advisers in a firm that provided financial advice and other services to very wealthy clients. I asked for some data to prepare my presentation and was granted a small treasure: a spreadsheet summarizing the investment outcomes of some twenty-five anonymous wealth advisers, for each of eight consecutive years. Each adviser’s score for each year was his (most of them were men) main determinant of his year-end bonus. It was a simple matter to rank the advisers by their performance in each year and to determine whether there were persistent differences in skill among them and whether the same advisers consistently achieved better returns for their clients year after year.
To answer the question, I computed correlation coefficients between the rankings in each pair of years: year 1 with year 2, year 1 with year 3, and so on up through year 7 with year 8. That yielded 28 correlation coefficients, one for each pair of years. I knew the theory and was prepared to find weak evidence of persistence of skill. Still, I was surprised to find that the average of the 28 correlations was .01. In other words, zero. The consistent correlations that would indicate differences in skill were not to be found. The results resembled what you would expect from a dice-rolling contest, not a game of skill.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
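
For concreteness, here is a minimal sketch of that computation in Python, assuming the spreadsheet is a 25-adviser-by-8-year table of returns. The data below are simulated with no skill differences at all (the real returns were confidential), which shows what zero persistence looks like:

    import itertools
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    returns = rng.normal(size=(25, 8))  # 25 advisers x 8 years, pure luck

    # Rank correlation between every pair of years: C(8, 2) = 28 pairs.
    corrs = [spearmanr(returns[:, i], returns[:, j])[0]
             for i, j in itertools.combinations(range(8), 2)]
    print(len(corrs), round(float(np.mean(corrs)), 3))  # 28 pairs, mean near 0

Because the simulated returns contain no skill, the 28 rank correlations average roughly zero, the same signature Kahneman found in the firm's real numbers.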

Neglecting Valid Stereotypes Has Costs

(p. 169) The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Some Irrationality Occurs Because Not Much Is at Stake, and Rationality Takes Time and Effort

(p. 164) The laziness of System 2 is part of the story. If their next vacation had depended on it, and if they had been given indefinite time and told to follow logic and not to answer until they were sure of their answer, I believe that most of our subjects would have avoided the conjunction fallacy. However, their vacation did not depend on a correct answer; they spent very little time on it, and were content to answer as if they had only been “asked for their opinion.” The laziness of System 2 is an important fact of life, and the observation that representativeness can block the application of an obvious logical rule is also of some interest.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
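
The "obvious logical rule" that representativeness blocks is the conjunction rule of probability: for any two events A and B, P(A and B) ≤ P(A), since the conjunction can occur only in cases where A occurs. Subjects commit the conjunction fallacy when, as in Kahneman and Tversky's well-known Linda problem, they judge "bank teller who is active in the feminist movement" to be more probable than "bank teller."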

Love Canal as a “Pseudo-Event” Caused by an “Availability Cascade”

(p. 142) An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by “availability entrepreneurs,” individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a “heinous cover-up.” The issue becomes politically important because it is on everyone’s mind, and the response of the political system is guided by the intensity of public sentiment. The availability cascade has now reset priorities. Other risks, and other ways that resources could be applied for the public good, all have faded into the background.
Kuran and Sunstein focused on two examples that are still controversial: the Love Canal affair and the so-called Alar scare. In Love Canal, buried toxic waste was exposed during a rainy season in 1979, causing contamination of the water well beyond standard limits, as well as a foul smell. The residents of the community were angry and frightened, and one of them, (p. 143) Lois Gibbs, was particularly active in an attempt to sustain interest in the problem. The availability cascade unfolded according to the standard script. At its peak there were daily stories about Love Canal, scientists attempting to claim that the dangers were overstated were ignored or shouted down, ABC News aired a program titled The Killing Ground, and empty baby-size coffins were paraded in front of the legislature. A large number of residents were relocated at government expense, and the control of toxic waste became the major environmental issue of the 1980s. The legislation that mandated the cleanup of toxic sites, called CERCLA, established a Superfund and is considered a significant achievement of environmental legislation. It was also expensive, and some have claimed that the same amount of money could have saved many more lives if it had been directed to other priorities. Opinions about what actually happened at Love Canal are still sharply divided, and claims of actual damage to health appear not to have been substantiated. Kuran and Sunstein wrote up the Love Canal story almost as a pseudo-event, while on the other side of the debate, environmentalists still speak of the “Love Canal disaster.”

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: italics in original.)

Dyslexics Better at Processing Some Visual Data

(p. 5) Gadi Geiger and Jerome Lettvin, cognitive scientists at the Massachusetts Institute of Technology, used a mechanical shutter, called a tachistoscope, to briefly flash a row of letters extending from the center of a subject’s field of vision out to its perimeter. Typical readers identified the letters in the middle of the row with greater accuracy. Those with dyslexia triumphed, however, when asked to identify letters located in the row’s outer reaches.
. . .
Dr. Catya von Károlyi, an associate professor of psychology at the University of Wisconsin, Eau Claire, found that people with dyslexia identified simplified Escher-like pictures as impossible or possible in an average of 2.26 seconds; typical viewers tend to take a third longer. “The compelling implication of this finding,” wrote Dr. Von Károlyi and her co-authors in the journal Brain and Language, “is that dyslexia should not be characterized only by deficit, but also by talent.”
. . .
Five years ago, the Yale Center for Dyslexia and Creativity was founded to investigate and illuminate the strengths of those with dyslexia, while the seven-year-old Laboratory for Visual Learning, located within the Harvard-Smithsonian Center for Astrophysics, is exploring the advantages conferred by dyslexia in visually intensive branches of science. The director of the laboratory, the astrophysicist Matthew Schneps, notes that scientists in his line of work must make sense of enormous quantities of visual data and accurately detect patterns that signal the presence of entities like black holes.
A pair of experiments conducted by Mr. Schneps and his colleagues, published in the Bulletin of the American Astronomical Society in 2011, suggests that dyslexia may enhance the ability to carry out such tasks. In the first study, Mr. Schneps reported that when shown radio signatures — graphs of radio-wave emissions from outer space — astrophysicists with dyslexia at times outperformed their nondyslexic colleagues in identifying the distinctive characteristics of black holes.
In the second study, Mr. Schneps deliberately blurred a set of photographs, reducing high-frequency detail in a manner that made them resemble astronomical images. He then presented these pictures to groups of dyslexic and nondyslexic undergraduates. The students with dyslexia were able to learn and make use of the information in the images, while the typical readers failed to catch on.
. . .
Mr. Schneps’s study is not the only one of its kind. In 2006, James Howard Jr., a professor of psychology at the Catholic University of America, described in the journal Neuropsychologia an experiment in which participants were asked to pick out the letter T from a sea of L’s floating on a computer screen. Those with dyslexia learned to identify the letter more quickly.
Whatever special abilities dyslexia may bestow, difficulty with reading still imposes a handicap.

For the full commentary, see:
ANNIE MURPHY PAUL. “The Upside of Dyslexia.” The New York Times, SundayReview Section (Sun., February 5, 2012): 5.
(Note: ellipsis added.)
(Note: online version of the commentary is dated February 4, 2012.)

Experience Can Provide Sound Intuitive Knowledge

(p. 11) . . . , the accurate intuitions of experts are better explained by the effects of prolonged practice than by heuristics. We can now draw a richer and more balanced picture, in which skill and heuristics are alternative sources of intuitive judgments and choices.
The psychologist Gary Klein tells the story of a team of firefighters that entered a house in which the kitchen was on fire. Soon after they started hosing down the kitchen, the commander heard himself shout, “Let’s get out of here!” without realizing why. The floor collapsed almost immediately after the firefighters escaped. Only after the fact did the commander realize that the fire had been unusually quiet and that his ears had been unusually hot. Together these impressions prompted what he called a “sixth sense of danger.” He had no idea what was wrong, but he knew something was wrong. It turned out that the heart of the fire had not been in the kitchen but in the basement beneath where the men had stood.
We have all heard such stories of expert intuition: the chess master who walks past a street game and announces “White mates in three” without stopping, or the physician who makes a complex diagnosis after a single glance at a patient. Expert intuition strikes us as magical, but it is not. Indeed, each of us performs feats of intuitive expertise many times each day. Most of us are pitch-perfect in detecting anger in the first word of a telephone call, recognize as we enter a room that we were the subject of the conversation, and quickly react to subtle signs that the driver of the car in the next lane is dangerous. Our everyday intuitive abilities are no less marvelous than the striking insights of an experienced firefighter or physician — only more common.
The psychology of accurate intuition involves no magic. Perhaps the best short statement of it is by the great Herbert Simon, who studied chess masters and showed that after thousands of hours of practice they come to see the pieces on the board differently from the rest of us. You can feel Simon’s impatience with the mythologizing of expert intuition when he writes: “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipsis added.)

Behavioral Economics Does Not Undermine Capitalism

Source of book image: http://www.brainpickings.org/wp-content/uploads/2011/10/thinkingfastandslow.jpg

Daniel Kahneman first gained fame in economics through research with Amos Tversky in which they showed that some of economists’ assumptions about human rationality do not always hold true.
Kahneman, whose discipline is psychology, went on to win the Nobel Prize in economics, sharing the prize with Vernon Smith. (Since the prize is not normally awarded posthumously, Tversky, who died in 1996, was not a candidate.)
I have always thought that ultimately there should be only one unified science of human behavior—not claims that are “true” in economics and other claims that are “true” in psychology. (I even thought of minoring in psychology in college, before I realized that the price of minoring included taking time-intensive lab courses where you watched rats run through mazes.)
But I don’t think the implications of current work in behavioral economics are as clear as has often been asserted.
Some important results in economics do not depend on strong claims of rationality. For instance, the most important “law” in economics is the law of demand, and that law is due to human constraints more than to human rationality. Gary Becker, early in his career, wrote an interesting paper in which he showed that the law of demand could also be derived from habitual and random behavior. (I remember George Stigler saying in conversation that he did not like this paper of Becker’s, because it did not hew closely to the rationality assumption that Stigler and Becker defended in their “De Gustibus” article.)
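Becker’s result is easy to see in a simulation. The sketch below is mine, not Becker’s: each consumer has a fixed income and spends a uniformly random share of it on the good, picking a random affordable bundle rather than an optimal one. Average quantity demanded still falls as price rises:

    import random

    def mean_random_demand(price, income=10.0, n_consumers=100_000):
        """Average quantity bought when each consumer spends a uniformly
        random share of income on the good (no rationality assumed)."""
        total = 0.0
        for _ in range(n_consumers):
            spending = random.uniform(0, income)  # random point on the budget line
            total += spending / price             # quantity purchased
        return total / n_consumers

    for p in (1.0, 2.0, 4.0):
        print(f"price {p:.1f}: average quantity {mean_random_demand(p):.2f}")

Doubling the price halves the average quantity (expected quantity is income / (2 * price)), so demand slopes downward even though no one is maximizing anything; the budget constraint, not rationality, does the work.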
The latest book by Kahneman is rich and stimulating. It mainly consists of cataloging the names of, and evidence for, a host of biases and errors that humans make in thinking. But that does not mean we cannot choose to be more rational when it matters. Kahneman believes that there is a conscious System 2 that can override the unconscious System 1. In fact, part of his motive for cataloging bias and irrationality is precisely so that we can be aware of them, and override them when it matters.
Sometimes it is claimed, as for instance in a Nova episode on PBS, that bias and irrationality were the main reasons for the financial crisis of 2008. I believe the more important causes were policy mistakes, like Clinton and Congress pressuring Fannie Mae and Freddie Mac to make home loans to those who did not have the resources to repay them; and past government bailouts encouraging finance firms to take greater risks. And the length and depth of the crisis were increased by government stimulus and bailout programs. If instead, long-term cuts had been made in taxes, entrepreneurs would have had more of the resources they need to create start-ups that would have stimulated growth and reduced unemployment.
More broadly, aspects of behavioral economics mentioned, but not emphasized, by Kahneman, can actually strengthen the underpinnings for the case in favor of entrepreneurial capitalism. Entrepreneurs may be more successful when they are allowed to make use of informal knowledge that would not be classified as “rational” in the usual sense. (I discuss this some in my forthcoming paper, “The Epistemology of Entrepreneurship.”)
Still, there are some useful and important examples and discussions in Kahneman’s book. In the next several weeks, I will be quoting some of these.

Book discussed:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

The Becker article mentioned above is:
Becker, Gary S. “Irrational Behavior and Economic Theory.” Journal of Political Economy 70, no. 1 (Feb. 1962): 1-13.

The Stigler-Becker article mentioned above is:
Stigler, George J., and Gary S. Becker. “De Gustibus Non Est Disputandum.” American Economic Review 67, no. 2 (March 1977): 76-90.

“Nothing Lasts Forever”

“How will we react when history presents us with uncertainty and risk? A sign on a Stalin bust in Prague in 1989 reads ‘Nothing Lasts Forever.’” Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. B1) The psychologist Daniel Kahneman writes that humans naturally “tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence,” something he terms the “planning fallacy.”
“In terms of its consequences for decisions, the optimistic bias may well be the most significant of the cognitive biases,” he notes. “When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy.”

For the full commentary, see:
JOHN BUSSEY. “THE BUSINESS; The Euro Crisis in Ourselves.” The Wall Street Journal (Fri., June 1, 2012): B1.

Some Tasks Are Done Better in Private Offices

Source of book image: http://timeopinions.files.wordpress.com/2012/01/quiet-final-jacket.jpg

(p. 4) When the R.C. Hedreen Company, a real estate development firm based in Seattle, commissioned a renovation of a 10,800-square-foot floor in an old downtown office building five years ago, it specified a perimeter of private offices. Collaborative spaces are provided for creative teamwork, but the traditional offices remain the executives’ home ports.

“Individually, a lot of our workday is taken up with tasks that are better served by working alone in private offices,” says David Thyer, Hedreen’s president.
Susan Cain, author of “Quiet: The Power of Introverts in a World That Can’t Stop Talking,” is skeptical of open-office environments — for introverts and extroverts alike, though she says the first group suffers much more amid noise and bustle.
Introverts are naturally more comfortable toiling alone, she says, so they will cope by negotiating time to work at home, or by isolating themselves with noise-canceling headphones — “which is kind of an insane requirement for an office environment, when you think about it,” she says.
Ms. Cain also says humans have a fundamental need to claim and personalize space. “It’s the room of one’s own,” she says. “Your photographs are on the wall. It’s the same reason we have houses. These are emotional safety zones.”

For the full story, see:
LAWRENCE W. CHEEK. “Please, Just Give Me Some Space: In New Office Designs, Room to Roam and to Think.” The New York Times, SundayBusiness Section (Sun., March 18, 2012): 1 & 4.

The book mentioned is:
Cain, Susan. Quiet: The Power of Introverts in a World That Can’t Stop Talking. New York: Crown, 2012.