Blink
Malcolm Gladwell's Blink: The Power of Thinking Without Thinking is an interesting and easy-to-read book about snap judgments. It's chock full of intriguing examples, some of which are serious scientific research that may come across as neat little anecdotes, some of which are neat little anecdotes, and some whose true nature falls in between (perhaps as "case studies"). The book is a little light on the theory of how these intuitive judgments work, when they go right and when they go wrong, and what should be done to guard against bad intuitions while taking advantage of our abilities to make good intuitive judgments. Such discussion of theory and implications is not absent, just underdeveloped. I'll try to sketch out Gladwell's main points and some support for them (in anecdotal form), with just a few comments of my own. I'll share more of my views in future posts. (Warning: this post is full of spoilers.)
An outline of the main points:
1. Thin-slices: There is enough information present in a few variables in a brief amount of time to make powerful predictions about complex processes.
2. Intuitive Thin-slicing: In some situations, people can rapidly pick up on that information well enough to have an appropriate intuitive response.
3. Unconscious Intuitive Thin-slicing: Often, people cannot explain why they have that appropriate intuitive response.
4. Explanatory Interference: Trying to explain the reasons for their reaction can lead to worse judgments.
5. Judgment Errors: Sometimes people fail to pick up on that useful information, or fail to use it to make good judgments.
6. Judgment Biases: People's intuitive responses and their deliberative judgments can have systematic and predictable biases, with unfortunate consequences.
7. Distracting Information: A common cause of judgment errors is that people are influenced by irrelevant information.
8. The Power of Situations: It is possible to influence the intuitive judgments that we make by influencing the situations that we encounter.
9. Systems that Promote Good Intuitive Judgment: Even though intuitive judgments seem beyond our control, we can try to organize the situations that we encounter in order to take advantage of our abilities to make good snap judgments and to minimize errors and biases.
1. Thin-slices: Taking a "thin slice" of a process can provide enough information to say a lot about the process, even if it is very complex.
During World War II, the Allies intercepted a lot of encrypted Morse code radio messages from the Germans. Although they hadn't broken the code, they learned to identify the individual style of each radio operator. Each individual's style was so distinctive that the interceptors could recognize which German radio operator was sending the message just from a few seconds of Morse code taps. There is some pattern that is evident in even a brief observation.
Professors also seem to have a style that is evident in even the briefest excerpt. Ratings of a professor's teaching effectiveness by students who had spent a semester in the professor's class proved to be remarkably similar to the ratings given by students who saw only a two-second video clip of the professor with the sound turned off.
Psychologist John Gottman has found that romantic relationships have a similar kind of pattern. Gottman brings recently married couples into his lab and has them talk with each other about a topic that has become a point of contention between them, while he videotapes their conversation and records some of their physiological reactions. Gottman's lab codes the data, including the positive and negative emotions shown in each partner's facial expressions each second and the couple's style of interaction. From a 15-minute conversation, Gottman's lab can predict with nearly 90% accuracy whether the couple will still be married 15 years later. From an hour-long conversation, the accuracy rises to 95%.
Thin slices, then, contain enough information not only to identify a style or to correlate with long-term observations, but also to predict important objective facts, like whether a couple will get divorced.
2. Intuitive Thin-slicing: Sometimes, people's intuitive reactions take advantage of the abundance of information that is present in thin slices.
An alleged ancient Greek statue of a youth, or "kouros", was brought to the Getty Museum. Fourteen months of scientific analysis convinced the museum that the kouros was genuine. In fact, it was a 20th-century forgery. Experts in ancient Greek art who looked at the statue were able to identify it as a fake within just a few seconds. One reported an "intuitive repulsion" to it; another said that the first word that came to mind when he saw it was "fresh."
Experts aren't the only ones who use thin-slicing well. It turns out that college students who spend 15 minutes in another student's dorm room give more accurate ratings of that student's conscientiousness, emotional stability, and openness to experience than the student's close friends do.
Why is such a thin slice of college students' lives - their dorm rooms - a better source of information about many aspects of their personality than a close, personal relationship with them? Part of the issue is that people have too much information about their friends, and it's hard to sort through it all to find what is relevant. People who only see the dorm rooms don't have so much irrelevant information, and apparently they have enough relevant information because, as Navin Johnson said, "you can tell so much about a person from the way they live."
When predicting if couples will stay together, it's not necessary to exhaustively code the data and analyze it statistically. John Gottman can personally make very good predictions (Gladwell doesn't say precisely how good) just from watching a couple for a few minutes. He has learned to intuitively pick up on the most important signs of impending marital problems (contempt for each other is at the top of the list).
3. Unconscious Intuitive Thin-slicing: People are often unaware of how they make intuitive judgments.
Imagine that you have to solve this problem: two ropes are hanging vertically from the ceiling, too far apart for you to reach both at once, and you have to find ways to tie them together. One solution, which very few people come up with on their own, is to set one rope swinging, grab the other rope, and then catch the swinging rope. In one study, people were given a subtle hint: the experimenter casually walked across the room in a way that brushed against one rope and made it swing slightly. Most of the people picked up on the hint and identified the rope-swinging solution. However, only one of these people realized that they had gotten the idea from the experimenter's brush with the rope. The rest came up with unrelated explanations of their inspiration. They had no idea of the process that led to their (successful) intuition.
Vic Braden, a highly experienced tennis coach, discovered something both remarkable and frustrating. He found that he was able to predict when a tennis player would double fault before the player's racket had even hit the ball. When he put his ability to an empirical test, he proved to be about 90% accurate. His years of tennis experience gave him the intuitive ability to see some flaw in a player's serving motion before contact with the ball. Braden's uncanny ability was somewhat disturbing to him, though, because, try as he might, he could not figure out what it was that he was picking up on.
People can do amazing things with their intuitive judgments, in highly specific areas or in everyday life, but they are often clueless about how they do it. No amount of careful introspection can unveil what is being done behind the scenes. Braden has recently hooked up with some biomechanics experts who are using advanced computer modeling to try to identify what pattern makes double faults so predictable.
4. Explanatory Interference: Thinking about how you make your intuitive judgments can prevent you from making good intuitive judgments.
Puzzles like the rope problem require some sudden insight or inspiration, and it is difficult to identify its source. In one study, people who were given a bunch of these insight problems and were told that they would have to explain how they tried to solve each problem ended up solving 30% fewer problems than people who did not have to explain their reasoning. Attempting to develop explanations of their intuitions interfered with successful intuiting and inhibited their performance.
In another study, ordinary people ranked five kinds of strawberry jam. Their rankings correlated fairly well with the ratings of expert jam-tasters (r = .55). Expert jam-tasters can explain their ratings based on very specific qualities of texture and taste that they are trained to evaluate. If ordinary people are asked to explain the reasons for their rankings, though, then the rankings that they give end up bearing almost no resemblance to the rankings of experts (r = .11). Ordinary people lack the vocabulary and the training to break the experience of tasting jam down into its component qualities, and when they try to do so they end up basing their final assessments on information that they do not know how to use. Their gut reactions are better at identifying a good jam than their reasoned analyses are.
5. Judgment Errors: Even though the information necessary for a good judgment is available in a small slice of an experience, people often fail to successfully use the right information.
Ordinary people suffered from an overload of information that was unusable to them when they tried to do a component-based ranking of jams. Gladwell experienced a similar information-overload when he watched videos of married couples and tried to predict if they would stay together based on the complex analytical model that Gottman had developed. Gladwell didn't know to focus on a few key factors, like contempt, and he lacked the experience to identify such negative emotions in real time. His predictions ended up being around chance levels.
Even experts make judgment errors because they fail to properly use the information at hand. Hospital personnel frequently have to decide whether someone who comes in with chest pain is suffering a heart attack or some more benign problem like heartburn, and they tend to take into account a variety of factors, including symptoms and risk factors like a stressful or unhealthy lifestyle. Lee Goldman developed a simple algorithm for making the same judgment, using only four yes/no factors relating to the ECG, unstable chest pain, blood pressure, and fluid in the lungs. At Cook County Hospital, this algorithm outperformed the judgments of every health care professional both at picking out the real heart attacks and at minimizing false positives. Information that is not included in the algorithm, like the patient's lifestyle, is a negligible factor that distracts medical experts more than it informs them.
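To make the shape of that kind of rule concrete, here is a minimal sketch of a decision rule that combines a handful of yes/no factors mechanically. The factor names, the way they are combined, and the triage categories are illustrative assumptions of mine, not Goldman's actual clinical criteria or the protocol used at Cook County.

```python
# Toy sketch of a "few yes/no factors, combined mechanically" decision rule.
# All factors, cutoffs, and categories here are assumptions for illustration,
# not Goldman's actual criteria.

def triage_chest_pain(ecg_suggests_heart_attack: bool,
                      unstable_chest_pain: bool,
                      low_blood_pressure: bool,
                      fluid_in_lungs: bool) -> str:
    """Map four yes/no observations onto a coarse level of concern."""
    risk_factors = sum([unstable_chest_pain, low_blood_pressure, fluid_in_lungs])
    if ecg_suggests_heart_attack and risk_factors >= 2:
        return "high concern"
    if ecg_suggests_heart_attack or risk_factors >= 1:
        return "moderate concern"
    return "low concern"

# Example: the ECG looks worrying but no other risk factors are present.
print(triage_chest_pain(True, False, False, False))  # -> "moderate concern"
```

The point of a rule like this is not any particular threshold; it is that the decision is forced to rest on a few pieces of highly relevant information while everything else, like the patient's lifestyle, is deliberately ignored.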
6. Judgment Biases: Judgment errors are often not random, but rather systematically and predictably biased.
Orchestras used to choose their new members in auditions where music experts watched applicants play. Classical orchestras consisted primarily of white men, and it was generally thought that women's smaller bodies made them incapable of playing physically demanding instruments like the trombone, and that foreigners' lack of experience with Western culture kept them from truly understanding and mastering the Western music that they played. When audition practices became more controlled, so that applicants stayed anonymous behind a screen and experts listened to their playing instead of watching them, orchestras rapidly diversified. Although experts who had watched women and foreigners play had always thought that their performance was somewhat lacking, they proved to be just as capable as white men of making good music. Even though the music experts did not set out to discriminate, their preconceived notions made their assessments sensitive to extraneous information about sex and ethnicity and inhibited their ability to identify good music.
Biases are not merely a matter of sexism and racism. Height makes a person seem more like a leader, so each inch gives a person about $789 extra in annual salary, and over 58% of the CEOs of Fortune 500 companies are more than 3 inches taller than the average man (i.e., over 6 feet tall). Gladwell argues that Warren Harding, by most accounts one of this country's worst Presidents, rose through the levels of politics without any significant accomplishments because he looked distinguished and Presidential. Decisions like electing a President, choosing a CEO, or awarding raises and promotions are not intuitive reactions like responses to music. However, the effects of intuitive prejudices are strong enough to withstand the deliberative reasoning that decision-makers go through.
Note that there is no safety in numbers or consensus when it comes to these kinds of biases. If everyone's judgments are biased in the same way, in favor of men, or white men, or tall men, or distinguished-looking men, then the crowd will err just as the individual does.
7. Distracting Information: Many judgment errors and biases result from the influence of information that is not really relevant to the decision.
As many of the above examples demonstrate, a central part of good decision-making, intuitive or not, is using the right information, not using the wrong information, and not being overwhelmed by too much information. It can help if the decision-maker does not even have access to the irrelevant information, as in the case of orchestra auditions.
8. The Power of Situations: The way that things are presented to us can change how we respond to them.
As marketers know well, people's judgments about something depend on the circumstances. It has been found that factors like brand name, coloring, and packaging alter both people's ratings of the taste of a food product in taste tests and the amount that people are willing to spend on a product when making real consumer decisions. For instance, people will pay about 5-10 cents more for ice cream in a round container, rather than a rectangular one, and they will rate the taste of Chef Boyardee products as worse if the Chef (Hector) on the package looks more like a cartoon character.
Gladwell isn't bothered by these examples of the effects of seemingly irrelevant information in the way that he's bothered by the discrimination against female trombone players. If people like realistic-looking Hector or round ice cream containers better, then why shouldn't the company cater to those preferences in addition to our preferences about the food inside the container?
Other cases where seemingly irrelevant environmental influences matter involve priming, which alters what kinds of thoughts come to mind most easily. People who think about being a professor for 5 minutes before playing Trivial Pursuit get 13% more questions right than people who think about being a soccer hooligan. Blacks who have to identify their race on a questionnaire before taking a standardized test get significantly lower scores than those who are not asked about their race. Even subtle manipulations that unconsciously activate negative associations, such as associations with soccer hooligans or with stereotypes about African Americans, can inhibit intellectual performance.
9. Systems that Promote Good Intuitive Judgment: We should organize activities so that the circumstances in which we have to make judgments are circumstances in which we intuitively judge well.
Orchestra auditions provide a good example: if judgments are biased by the appearance of the applicant, then don't even let the judges see the applicant. Another example comes from police work. Studies have shown that people are more likely to mistake some other object for a gun in their split-second judgments if that object is associated with a black person. Reducing police officers' tendency to shoot people who are actually unarmed, or to harm suspects in general, does not require eliminating all prejudice from the police force. Acting more cautiously, following protocol, and avoiding car chases, which get officers revved up past the point where they can think straight, can all reduce problems by keeping cops out of situations where they are likely to make bad snap judgments.
As a more positive example, Gladwell describes the case of Paul Van Riper, a retired US Marine Corps general who played the role of the leader of a rogue Middle Eastern nation in the Millennium Challenge, a quarter-billion-dollar war game designed to test the latest complex information technology that the US military hoped to use to organize its overwhelming might. Van Riper managed to surprise and seriously damage the far more sophisticated and advanced American side in the war game. In contrast to the centralized, deliberative method of decision-making that the US side employed, Van Riper organized his side in a way that allowed his generals to make intuitive decisions in the field, based on their expert understanding of wartime situations and in line with their side's overall strategy.
Those are nine points from Blink, with lots of examples. I'll give more of my responses and say some more about the underlying theory in future posts. If you're interested, you can check out these other resources for more on Blink:
David Brooks - his take on the book seems about right (though his take on cognitive science as a field does not)
Gladwell has a discussion with James Surowiecki - read the author himself
Richard Posner - UPDATE 1/28: I've now read Posner's review, via the link at aldaily, so here's my mini-review of his review. Posner's overly confident reinterpretations of Gladwell's examples are hit-or-miss (and sometimes hit-and-miss, as on CEO height). His criticism of Blink's lack of theoretical depth or sophistication is in the neighborhood of truth, though far harsher than what is warranted. His complaints about Gladwell's style are simply distracting, given that tastes differ and Posner's are far from representative.
UPDATE 4/6: My long-awaited second post on Blink is finally up here.
Another linky follow-up: Algorithm.