coherence. Contrary to what we believe about ourselves, the
majority of our cognitive processes are System 1, not System
2, in nature. And System 1 is very poor at statistical reasoning.
Indeed, Kahneman reports that his extraordinarily fruitful
research program was motivated by his experience teaching
statistics to university students in Israel. He found what he was
teaching to be very unintuitive, and wondered whether he was
alone. He wasn’t. It turns out that none of us are natural statisticians. Joan Didion was more literally correct than she perhaps
realized when she wrote the famous line, “We tell ourselves
stories in order to live.”
Cognitive Availability and the Affect Effect
Just as the various forms of “thinking fast” impair our ability to
reason in the face of uncertainty, so they also impair our ability to
accurately perceive and consistently manage risks. The availability
heuristic is once again a prime culprit. Wharton economist Howard Kunreuther has observed that immediately after disasters such
as large floods or earthquakes, people who have either witnessed
or experienced the disaster tend to be very diligent about managing these risks: They buy insurance, retrofit their homes, stock up
on emergency supplies, and so on. But as memories of the disaster
recede, so does the level of diligence in managing these risks. Immediately prior to Hurricane Katrina, for example, a majority of
New Orleans residents did not have flood insurance despite the
fact that it was government subsidized. Presumably the risk of a
flood was too distant, not sufficiently cognitively available.
Availability also sheds light on the fact that societies tend to
plan for risks that are only as bad as the worst disaster actually
experienced. Paraphrasing Kunreuther, Kahneman comments:
As long ago as pharaonic Egypt, societies have tracked the
high-water mark of rivers that periodically flood—and have
always prepared accordingly, apparently assuming that
floods will not rise higher than the existing high-water mark.
Images of a worse disaster do not come easily to mind.
A second way in which risk perceptions are distorted has to do
with emotions. Most people probably do not need psychologists
to point out that emotions play a role in clouding our judgments.
Paul Slovic, the leading figure in the study of the psychology of
risk, calls our tendency to let our likes and dislikes determine our
beliefs about the world “the affect heuristic.” In political debates,
for example, people tend not to adopt positions that are most
strongly supported by evidence and rational argument. Rather,
they search for arguments that support the positions that they
like. The moral psychologist Jonathan Haidt characterizes such
phenomena as “the emotional tail wags the rational dog.”
The emotional tail wagging the rational dog affects people's perceptions of risk in an intriguing way. Slovic and his
colleagues have observed high negative correlations between
people’s assessments of the benefits and risks of various technologies. People who appreciate the benefits of nuclear power,
for example, tend to downplay the risks it presents. And conversely, people with a keen appreciation of the risks posed by
nuclear power tend to downplay the benefits it offers. One study
had a particularly striking finding: People were presented with
a message describing the benefits of a certain technology. The
message discussed the benefits, not the risks, of the technology.
Nevertheless, by highlighting the benefits, the message prompted people in the study to lower their assessments of the risks
posed by the technology.
This is a specific instance of a general tendency that
Kahneman discusses. In many situations, we unconsciously
substitute easy questions (“do we like the benefits of this technology?”) for difficult ones (“what are the risks, and do the
benefits justify them?”). Once again, the mind is a machine
for jumping to conclusions.
What Then Must We Do?
All of this is bad news, but perhaps a small silver lining is that
the ubiquity of cognitive biases plays to the strengths of actuarial
training. Over a century ago, H.G. Wells wrote that statistical thinking would one day be as necessary for efficient citizenship
as the ability to read and write. The discoveries of Kahneman,
Tversky, and their followers only add force to this prescient