21.6: Cognitive Biases and the Misperception of Risk


    All sorts of factors make it easy to misperceive risk. The media report certain types of calamities (e.g., people killed in fires) more often than others that are, in fact, more common (e.g., drownings). Then too, the grislier cases stick in our minds. And as if that weren’t enough, there are people who have a vested interest in exaggerating certain risks (you need more insurance; you must take this special dietary supplement to avoid liver cancer). Finally, risks that are serious (like heart disease) may require big changes in our lives, so it is often tempting to downplay them.

    Many of the biases and fallacies we have studied lead us to overestimate the risk of some things and to underestimate the risk of others.

    Sample Size and Bias

    Whenever we draw inferences from small or biased samples, our conclusions will be unreliable. This is as true of conclusions about risks and remedies as it is of conclusions about anything else.
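    A quick simulation shows just how unreliable small samples are. This is a minimal Python sketch under assumed numbers (a true risk of 10%, with the sample sizes chosen purely for illustration):

        import random

        random.seed(0)
        TRUE_RISK = 0.10  # assumed true rate of some bad outcome

        def estimate(sample_size):
            """Estimate the rate from one random sample."""
            hits = sum(random.random() < TRUE_RISK for _ in range(sample_size))
            return hits / sample_size

        for n in (10, 100, 10_000):
            estimates = [f"{estimate(n):.0%}" for _ in range(5)]
            print(f"n = {n:>6}: {estimates}")

    With ten observations per sample, the estimated risk swings wildly from run to run; with ten thousand, it settles near the true 10%. Biased samples are worse still, since no amount of extra data corrects a skewed selection procedure.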

    Neglecting Base Rates

    If we neglect base rate information, our estimates of various outcomes can be highly distorted. Often, we hear figures that sound very dramatic, but they sometimes become trivial when we learn about the relevant base rates. A new drug cuts the death rate from the bubonic plague in half. But the base rate of plague is very low (fewer than four Americans got it last year). Whenever we hear about risks, our first question should always be: What is the base rate? Typically, we don’t need a very precise answer; a ballpark figure is usually enough.
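    To see how a dramatic-sounding figure can shrink against a base rate, here is a minimal sketch. The plague figure comes from the paragraph above; the population figure and the fatality rate are hypothetical round numbers chosen for illustration:

        # Hypothetical round numbers for illustration.
        US_POPULATION = 330_000_000
        PLAGUE_CASES_PER_YEAR = 4      # "fewer than four Americans" per year
        FATALITY_RATE = 0.5            # assumed: half of untreated cases are fatal

        base_rate = PLAGUE_CASES_PER_YEAR / US_POPULATION
        deaths_without_drug = PLAGUE_CASES_PER_YEAR * FATALITY_RATE
        deaths_with_drug = deaths_without_drug / 2   # the new drug halves the death rate

        print(f"Chance of contracting plague: about 1 in {round(1 / base_rate):,}")
        print(f"Expected lives saved per year: {deaths_without_drug - deaths_with_drug}")

    “Cuts the death rate in half” is perfectly true, yet under these assumptions the drug saves about one life a year; the tiny base rate does nearly all the work.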

    Availability

    If we don’t appreciate how large the gap is between the probability of having a heart attack and the probability of dying at the hands of a terrorist, it will be difficult, even after September 11, to make rational plans about diet and travel. Several hundred thousand Americans die from heart disease each year, whereas in every year but 2001, about one American in a million dies at the hands of terrorists.

    Risk information can be available for many different reasons. The media report some things more than others (e.g., fires more than drownings; plane crashes more than car wrecks). Moreover, people you know tend to talk more about some risks than others (if your uncle was recently mugged, you will hear a lot about muggings). Sometimes a particularly vivid and horrifying sort of accident comes to mind more easily simply because it is more frightening. For example, being electrocuted by the wiring in your home sounds very gruesome, but only about 200 Americans a year (fewer than one in a million) die from electrocution. By contrast, over 7,000 die from falls in the home. And of course, no one can forget the sight of the twin towers of the World Trade Center collapsing.

    Probabilities of Conjunctions and Disjunctions

    We tend to overestimate the probabilities of conjunctions and underestimate the probabilities of disjunctions. Since success typically requires that every step of a plan go right (a conjunction), while failure requires only that at least one step go wrong (a disjunction), this means we overestimate the likelihood of success and underestimate the likelihood of failure. This, in turn, can lead us to underestimate certain risks.
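    A quick calculation shows how the two errors fit together. This sketch uses made-up numbers: a plan with ten independent steps, each of which succeeds 95% of the time:

        # Ten independent steps, each with a 95% chance of going right (made-up numbers).
        P_STEP = 0.95
        N_STEPS = 10

        p_all_succeed = P_STEP ** N_STEPS       # conjunction: every step goes right
        p_some_failure = 1 - p_all_succeed      # disjunction: at least one goes wrong

        print(f"P(every step succeeds)  = {p_all_succeed:.2f}")   # about 0.60
        print(f"P(at least one failure) = {p_some_failure:.2f}")  # about 0.40

    Asked to eyeball such a plan, most people guess the overall chance of success is far higher than 60%, which is just the conjunction error at work.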

    Cumulative Effects

    We are prone to underestimate the power of cumulative effects. For example, a contraceptive device may work 99% of the time, but if we rely on it frequently over the years, there is a good chance that it will eventually let us down. Suppose, for example, that you use a brand of condom that breaks 1% of the time. Each time you use one, the chances are low that it will break. But if you use that brand of condom a couple of hundred times, the chances of a failure start to mount up. Similarly, the chances of being killed in an automobile accident each time we drive are low, but with countless trips over the years, the odds mount up.
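    The “mounting up” is just repeated multiplication. Here is a minimal sketch using the 1% figure from the example above, assuming for simplicity that the uses are independent:

        # Each use fails 1% of the time; uses are assumed independent.
        P_FAIL = 0.01
        for n_uses in (1, 50, 100, 200):
            p_at_least_one = 1 - (1 - P_FAIL) ** n_uses
            print(f"{n_uses:>3} uses: P(at least one failure) = {p_at_least_one:.0%}")

    A device that is 99% reliable per use still fails at least once in about 87% of two-hundred-use histories, which is why per-use risk and lifetime risk feel so different.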

    Coincidence

    Wilbur survives a disease that is fatal to 98% of the people who contract it. Wilbur’s case is rare, so people will talk about it; it may make the papers or TV, and we are likely to hear about it. If we focus too much on the lucky few who survive a disease despite doing everything their doctor warned them not to, we may conclude that the risk of the disease is much lower than it really is.

    Regression to the Mean

    If we overlook regression to the mean, we may think that certain measures will decrease targeted risks, even when they are ineffective and only seem useful because they happened to coincide with regression to the mean. For example, we may overestimate the power of a given policy (like increasing the number of police or enacting tougher sentencing laws) to cut down on crime. This will mean that we have an inaccurate perception of the risks of various crimes and the best ways to combat them.
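    A small simulation makes the trap vivid. This sketch assumes yearly crime counts that merely fluctuate around a fixed mean, with a “tough new policy” enacted right after the worst year:

        import random

        random.seed(0)
        TRIALS = 10_000
        drops = 0
        for _ in range(TRIALS):
            # Ten years of crime counts fluctuating around 1,000 -- no trend, no policy.
            years = [random.gauss(1000, 50) for _ in range(10)]
            worst = max(range(9), key=lambda i: years[i])   # worst of the first nine years
            if years[worst + 1] < years[worst]:             # the year after the peak
                drops += 1
        print(f"Crime fell the year after its peak in {drops / TRIALS:.0%} of runs")

    In about 99% of runs, crime “falls” the year after its peak even though nothing changed, so any policy enacted at the peak gets credit it never earned.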

    Illusory Correlation

    When we believe in an illusory correlation, we think that changes in one thing tend to accompany changes in another when in fact they do not. For example, we may think that certain jobs or occupations have a higher (or lower) correlation with various diseases than they really do. This will lead us to overestimate (or underestimate) the risks of various undertakings.
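    The standard cure is to check all four cells of the contingency table, not just the vivid cases. This sketch uses hypothetical counts in which the “exposed and sick” cell looks alarming even though there is no correlation at all:

        # Hypothetical counts: job exposure vs. disease, all four cells.
        exposed_sick, exposed_well = 20, 80
        unexposed_sick, unexposed_well = 200, 800

        rate_exposed = exposed_sick / (exposed_sick + exposed_well)
        rate_unexposed = unexposed_sick / (unexposed_sick + unexposed_well)
        print(f"Disease rate among the exposed:   {rate_exposed:.0%}")    # 20%
        print(f"Disease rate among the unexposed: {rate_unexposed:.0%}")  # 20% -- identical

    Twenty vivid cases of exposed workers falling ill invite a story, but the disease rate is 20% whether or not anyone was exposed.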

    Anchoring and Adjustment

    It is possible to set anchors at unreasonably high, or unreasonably low, probabilities for a given type of risk. Even though we frequently adjust for these anchors, we often don’t adjust enough. So, a high anchor can lead us to overestimate the likelihood of a risk and a low anchor can lead us to underestimate it.

    Wishful Thinking and Dissonance Reduction

    It is often easier to deal with a risk by convincing ourselves that it’s not as serious as other people say. When the Surgeon General’s first report on the dangers of smoking came out in 1964, only 10% of nonsmokers doubted the report, but 40% of heavy smokers did.

    It’s easy to dismiss a report of a recent study suggesting that one of our favorite foods causes cancer by saying that everything causes cancer and the experts keep changing their minds anyway. This is not an unreasonable reaction to a single study. But many of the greatest health risks, e.g., those posed by smoking and heart disease, are established beyond all reasonable doubt. Unfortunately, the remedies, while having little financial cost, can exact a huge cost in the changes of lifestyle they require. Many people who would pay a lot of money to avoid these risks won’t pay the price of lifestyle change. It is easier to downplay the risk.

    Framing Effects Revisited

    Earlier we learned that people are typically risk averse when it comes to possible gains. We prefer a certain gain (say of $10) to a 50/50 chance of getting $20 (even though these alternatives have the same expected value). In fact, many people prefer a certain gain (say of $10) to a 50/50 chance of getting $25 or even more. By contrast, people tend to be risk seekers when it comes to losses. Most of us prefer the risk of a large loss to a certain loss that is smaller. For example, most people prefer B (a 25% chance of losing $200, and a 75% chance of losing nothing) to A (a 100% chance of losing $50). How we think about risks often depends on how things are framed.
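    The expected values behind these choices are easy to check. Here is a minimal sketch of the two pairs described above:

        # Expected value = sum of probability * outcome over the possible outcomes.
        certain_gain = 1.00 * 10                 # a sure $10
        gamble_gain = 0.50 * 20 + 0.50 * 0       # a 50/50 chance of $20
        certain_loss = 1.00 * -50                # option A: lose $50 for sure
        gamble_loss = 0.25 * -200 + 0.75 * 0     # option B: 25% chance of losing $200

        print(certain_gain, gamble_gain)    # 10.0 10.0  -- yet most take the sure gain
        print(certain_loss, gamble_loss)    # -50.0 -50.0 -- yet most take the gamble

    Each pair has identical expected value, so the reversal (risk averse for gains, risk seeking for losses) comes entirely from the framing, not the arithmetic.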

    More specifically, it depends on whether the options are framed as gains (200 people will be saved) or as losses (400 people will die). When we frame a choice in terms of a certain loss, we think about it differently than we would if we framed it in terms of insurance; and when we frame a choice in terms of people being saved, we think about it differently than we would if we framed it in terms of people dying.

    Tradeoffs

    Often the only way to decrease one risk is to increase another. To take a whimsical example first, you will decrease your risk of being hit by a car or falling under a train if you stay home all day in bed. But in the process, you will have increased the risk of injury from falling out of bed, the risk of countless health problems due to lack of exercise, and the risk of being poor, since you’ll probably lose your job.

    The same point holds for risks that are a serious worry. Many illnesses are best treated with medication, and if they are serious you may be better off in a hospital. But hospitals are run by people, and people in all areas are prone to error. Research out of Johns Hopkins University suggests that medical error is the third leading cause of death in the US, contributing to around 250,000 deaths nationwide each year. Further endangering you is the possibility of adverse reactions to prescribed medications. At the end of the day, if you’re sick enough, you are better off in a hospital, but it isn’t without risk.

    The U.S. response to the Covid-19 pandemic was shaped by tradeoffs. The safest thing in terms of preventing the spread of the virus and saving lives was for everyone to shelter in place. If everyone did this, though, we would not have access to food and cleaning supplies. We needed to balance the harms that came from exposure against the harms of not having access to basic supplies. Evaluating how badly we handled these tradeoffs is a job best left for an ethics or public policy class, but the argument about how we ought to behave was really an argument about tradeoffs.

