UPDATE 6/24: I cited Jonah Lehrer below. This has to be read to be believed - Lehrer might as well have titled his article about himself.
One of the best books that I read recently is Nobel Prize winner Daniel Kahneman's "Thinking, Fast and Slow". You'll find a lot of reviews of the book online (e.g., Jim Holt at the NYT), but the best thing to do would be to pick up a copy of the book and read it (Sci Am recently reprinted a portion). As one of the reviewers on Amazon.com, Adam Smythe, wrote [all bold text in this post is my emphasis]:
Daniel Kahneman, the author of this exceptional book, and Amos Tversky (who died in 1996) made economics and other disciplines a lot more realistic--and tougher--for economists, researchers and students. Prior to their work, economists and others maintained classical theories and explanations that relied on certain seemingly logical assumptions about human behavior. However, people don't always behave the way logic might suggest, for a variety of reasons that Kahneman (and Tversky) explained, starting in the 1970s. Today, the subject of behavioral decision-making is one of the more exciting ones in fields like economics, finance, medicine and even law, thanks to their pioneering work. In recognition of the impact of his work in economics, Kahneman, a cognitive psychologist and professor emeritus at Princeton, won the Nobel Prize in Economics in 2002, specifically for his work on prospect theory.
The title of this book comes from Kahneman's discussion of two simple models of how people think. "System 1" thinking corresponds to fast, intuitive, emotional and almost automatic decisions, though it sometimes leaves us at the mercy of our human biases. "System 2" thinking is more slow-going and requires more intellectual effort. To nobody's surprise, we humans are more likely to rely on System 1 thinking, because it saves us effort, even if it can lead to flawed thinking.
Andrew Revkin at Dot Earth (NYT) posted the video of a recent talk by Kahneman and added:
As I noted via Twitter during the meeting, this talk and many other engaging presentations at the event illustrate the importance of adding a fresh facet to the popular notion that today’s citizens, and particularly students, would do well to improve their capacity for critical thinking:
“Critical thinking has to include assessing one’s own thinking.”
Last week, Jonah Lehrer at The New Yorker mentioned Kahneman's work and wrote about another recent study with important implications, in a post titled "Why Smart People are Stupid". Here are some relevant extracts from Lehrer's piece, but go read his entire post:
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.
Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, “I am not interested in the psychology of stupidity.”
The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.[...]
The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.
Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
And here’s the upsetting punch line: intelligence seems to make things worse.
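The bat-and-ball question Lehrer refers to is the well-known item from Shane Frederick's Cognitive Reflection Test: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much does the ball cost? System 1 blurts out "10 cents", but the algebra gives 5 cents. Here is a minimal sketch of that check (the dollar amounts are the standard ones from the test, not figures quoted above):

```python
# Bat-and-ball: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting: (ball + 1.00) + ball = 1.10  =>  2*ball = 0.10  =>  ball = 0.05.

total = 1.10        # combined price in dollars
difference = 1.00   # the bat costs this much more than the ball

ball = (total - difference) / 2
bat = ball + difference
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")   # ball = $0.05, bat = $1.05

# The intuitive answer (ball = $0.10) fails the "costs $1.00 more" condition:
intuitive_ball = 0.10
intuitive_bat = total - intuitive_ball           # $1.00
assert abs(intuitive_bat - intuitive_ball - difference) > 1e-9   # off by $0.10
```

The point, of course, is not that the arithmetic is hard; it is that the effortless System 1 answer feels right enough that System 2 never gets called in to do this check.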
The abstract of the paper "Cognitive Sophistication Does Not Attenuate the Bias Blind Spot" by West, Meserve and Stanovich says:
The so-called bias blind spot arises when people report that thinking biases are more prevalent in others than in themselves. Bias turns out to be relatively easy to recognize in the behaviors of others, but often difficult to detect in one's own judgments. Most previous research on the bias blind spot has focused on bias in the social domain. In 2 studies, we found replicable bias blind spots with respect to many of the classic cognitive biases studied in the heuristics and biases literature (e.g., Tversky & Kahneman, 1974). Further, we found that none of these bias blind spots were attenuated by measures of cognitive sophistication such as cognitive ability or thinking dispositions related to bias. If anything, a larger bias blind spot was associated with higher cognitive ability. Additional analyses indicated that being free of the bias blind spot does not help a person avoid the actual classic cognitive biases. We discuss these findings in terms of a generic dual-process theory of cognition.
Cory Doctorow at Boing Boing expands on this in his post "Smart people are especially prone to stupid mistakes" and says:
This has particularly grim implications for a society that thinks it is a meritocracy but is really an oligarchy, because the competitively educated people at the top believe (incorrectly) that they don't need to have their intuitions reviewed by lesser mortals.
That link in turn takes us to a discussion of a Chris Hayes article on how meritocracies become oligarchies ("Why Elites Fail"), with the following apt ending:
Hayes goes on to interview various Wall Street titans and hedge fund managers, and gets their own account of how they feel that they are innately superior -- the smartest guys in the room -- and how great it is that the nation takes its cues from them.
Rather timely for the era we are living in.