28.4: Consider Alternatives


    We approach many situations with certain expectations or preconceptions, with a “mindset” that makes it easy to overlook the relevance of various cognitive tools and skills. We often think about a problem in the way it is presented to us, or in the way that we have gotten used to thinking about similar problems. Although there are no magic cures, there is a general strategy that can help overcome this obstacle to good reasoning.

    The basic idea is to think about a situation from several points of view or in several different ways, rather than simply jumping to the first conclusion that occurs to us. Although this involves somewhat different things in different cases, it is easier to remember if we have a single, catchy label for the general strategy, so we will refer to it as “Consider Alternatives.”

    In some cases, consideration of alternatives means considering actual alternatives or options we have overlooked; in other cases, it means considering hypothetical alternatives or options. All this sounds rather abstract, but we can bring it down to earth by considering a few concrete examples.

    Alternative Explanations

    We constantly seek to explain the world around us. When something happens that matters to us, we want to know why. What explains it? What caused it? Wilbur has been friendly to Wilma all semester but suddenly he begins insulting her. She naturally wonders why. Unfortunately, we often jump to conclusions about causes and explanations. One corrective is to ask ourselves whether there might be alternative explanations that would explain what happened better than the explanation that first occurred to us.

    Regression Effects

    We often overlook regression to the mean, the fact that more extreme outcomes or performances tend to be followed by ones closer to the mean, i.e., closer to average (21.6).

    We often do something, note what happens, and conclude that our action led to – caused – the outcome. For example, if the implementation of a new policy is followed by a decrease in something undesirable or an increase in something desirable, it is often tempting to conclude that the new measure caused the shift. Unemployment went up last year; this year the City Council lowered property taxes, and now unemployment has come back down to its normal level.

    In at least some cases like this, however, the return to normal would have occurred without the new measure, simply due to regression to the mean. In such cases, we are likely to explain the reduction in unemployment by the decrease in taxes, but we will be wrong, and the new measure will be given credit it doesn’t deserve. Here, the alternative explanation to consider is that the return to a normal level is simply a result of regression to the mean; perhaps it would have happened anyway, even without the new policy.
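
    A small simulation can make this concrete. The following sketch (in Python, with entirely invented numbers) imagines a city whose unemployment rate simply fluctuates around a long-run average; the years that follow unusually bad years are, on average, much closer to normal even though no policy changes at all.

        import random

        random.seed(1)

        # Hypothetical city: unemployment fluctuates around a long-run mean of 5%.
        years = [random.gauss(5.0, 1.0) for _ in range(10_000)]

        # Compare unusually bad years (above 7%) with the years that follow them.
        bad_years = [y for y in years[:-1] if y > 7.0]
        followers = [years[i + 1] for i in range(len(years) - 1) if years[i] > 7.0]

        print(f"average bad year:              {sum(bad_years) / len(bad_years):.2f}%")
        print(f"average year after a bad year: {sum(followers) / len(followers):.2f}%")
        # The year after an extreme year is, on average, much closer to 5% --
        # regression to the mean, with no intervention at all.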

    Illusory Correlation

    Suppose that some people recover from a given illness after taking megadoses of Vitamin C. It is tempting to conclude that the vitamin led to – caused – their recovery, that it explains why they got better. But might they have gotten better anyway? In the case of many diseases and illnesses, some people do get better without any medication, vitamins, or treatment. To know whether the Vitamin C helped, we need to compare the rate of recovery among those who took it with the rate among those who did not. In short, if we fail to consider the rate of recovery of those who don’t take Vitamin C, we may come to believe in an illusory correlation between taking the vitamin and getting better.

    In this case, belief in an illusory correlation leads us to give a faulty explanation (the people got better because they took the vitamin). Here the alternative explanation we need to consider is that they would have gotten better anyway, without the vitamin (and the alternative evidence we need to consider to test either explanation is the group of people who did not take the vitamin).
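
    Making the comparison explicit helps. Here is a minimal sketch in Python, using purely hypothetical counts, that computes the recovery rate for those who took the vitamin and for those who did not; if the two rates are about the same, the apparent correlation is illusory.

        # Hypothetical counts, invented only for illustration.
        took_vitamin = {"recovered": 80, "did_not_recover": 20}
        no_vitamin = {"recovered": 78, "did_not_recover": 22}

        def recovery_rate(group):
            total = group["recovered"] + group["did_not_recover"]
            return group["recovered"] / total

        print(f"recovery with Vitamin C:    {recovery_rate(took_vitamin):.0%}")
        print(f"recovery without Vitamin C: {recovery_rate(no_vitamin):.0%}")
        # 80% vs. 78%: roughly the same, so the vitamin gets credit it probably
        # doesn't deserve if we only look at the people who took it.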

    Fundamental Attribution Error

    The fundamental attribution error is the tendency to overestimate the role of people’s dispositions and traits, and to underestimate the importance of the context, in explaining their behavior (23.2). For example, it is easy to believe that people in a crowd who don’t help an injured person are selfish and uncaring, when in fact many kind and decent people are reluctant to help in such situations. There are various situational pressures – causes – that inhibit helping.

    In cases like this we give a wrong, or at least one-sided, explanation when we explain what others do in terms of their character traits or other internal causes. Considering the alternative explanation, that there are strong situational pressures that would lead most people (including them) to do what they did, can help us see that an explanation weighted more in terms of external causes might be more accurate.

    Gambler’s Fallacy

    In the cases thus far, we need to consider alternative explanations for events that have already occurred, but similar points can hold for events in the future. We commit the gambler’s fallacy when we treat independent events as though they were dependent (16.3). For example, even if we believe that a coin is fair, we may think that if we have flipped three heads in a row we are more likely to get a tail on the fourth flip.

    Such a belief can result from accepting a bad explanation while ignoring alternative explanations about why we should not expect a tail the fourth time around. The bad explanation is that we think a tail is more likely because it would “even things out,” or satisfy the “law of averages.”

    The good explanation we overlook is that there is nothing about the coin that would enable it to “remember” what it did on earlier flips. There is no mechanism that can change the probability of a given flip based on the outcome of previous flips. We might well see this quite easily if we paused and asked how the outcome of one flip could affect the outcome of the succeeding flip. What could explain how the coin could remember (answer: nothing—so it doesn’t)?
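
    The point is easy to check by simulation. This sketch (assuming a fair coin) counts, among sequences that start with three heads, how often the fourth flip is a tail; the answer hovers around one half, not more.

        import random

        random.seed(0)

        sequences_with_three_heads = 0
        tails_on_fourth_flip = 0

        for _ in range(200_000):
            flips = [random.choice("HT") for _ in range(4)]
            if flips[:3] == ["H", "H", "H"]:
                sequences_with_three_heads += 1
                if flips[3] == "T":
                    tails_on_fourth_flip += 1

        # Each flip is independent of the ones before it, so this prints
        # a value very close to 0.5.
        print(tails_on_fourth_flip / sequences_with_three_heads)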

    Alternative Possibilities

    Sometimes asking how things might have turned out differently helps avoid fallacies and cognitive biases. Here we consider alternative outcomes or possibilities.

    Hindsight Bias

    Hindsight bias is the tendency to overestimate the likelihood that we would have predicted an outcome once we learn that it occurred (8.6). It has been found in judgments about elections, medical diagnoses, sporting events, and many other topics. It is also the key ingredient of Monday morning quarterbacking and second guessing.

    As with many other biases and fallacies, simply warning people of the dangers of hindsight bias has little effect. But various studies suggest that we can reduce this bias by considering how past events might have turned out differently. Consider alternative scenarios: ask yourself what alternatives might have occurred and what would have made them likely to happen. For example, if we consider how easily a couple of charging fouls against the opponent’s star could have gone the other way, it may be easier to see how the opposing team could have won the game. Considering alternative outcomes can make it easier to break free of the “mindset” that the occurrence of this event would have been obvious before it occurred.

    Overconfidence

    We tend to overestimate the probability that our own judgments or predictions are correct. This means assigning high probabilities to hypotheses or claims that turn out, reasonably often, to be false. There is evidence that considering alternatives can help us make a more realistic assessment of our own accuracy.

    For example, if we imagine plausible scenarios in which the candidate we predict to win would turn out to lose, we may become less confident in our prediction. Here the key is to consider ways our views and beliefs might turn out to be wrong. This can help us break free of the belief that our belief or prediction is “obvious.”

    The Either/Or Fallacy

    We commit the either/or fallacy (the fallacy of a “false dilemma”) when we assume that there are only two alternatives when in fact there are more or, more generally, when we assume that there are fewer alternatives than there are (11.2). Here, pausing to ask if we have overlooked genuine alternatives may help us find additional, and with luck better, options.

    Alternative Evidence

    Sometimes we need to consider evidence (data, facts, information) that we have overlooked: here the strategy is to consider alternative (or, perhaps more accurately here, overlooked) evidence.

    Testing

    It often seems easy to explain how we were right, whatever the evidence turns out to be. We read earlier about the members of a doomsday cult who gave away all their possessions because they believed that the world would end on a given day (19.6.1). Their belief turned out to be false; obviously so, since the world did not end when they said it would.

    Most of us would see this as rock solid proof that they were simply mistaken about things. But when the appointed time came and went, many cult members strengthened their beliefs in the pronouncements of their leader (by concluding that their efforts had postponed the end). To take another example, the predictions of many pseudoscientists are often sufficiently fuzzy that they can be reconciled with almost anything that happens later.

    In such cases, it is useful to consider alternative ways that the evidence might have turned out. What outcomes, what possible evidence, would we have taken as disconfirming our views? How would we have reacted if the evidence had turned out differently? This can help us see whether we are basing our beliefs on evidence (which could turn out to disconfirm them), or whether we are simply holding them, “no matter what,” whatever the evidence might be.

    Alternative Frames

    Framing

    Framing effects occur when the way in which options or situations are described influences how we think about them (7.5.1). It is always a good policy to imagine alternative ways to frame things, especially frames in terms of gains (if the current frame is in terms of losses) or in terms of losses (if the current frame is in terms of gains). And when evaluating the arguments of others, it is always wise to ask yourself how the points they are making might be reframed.

    Anchors

    We discussed anchoring effects earlier in this chapter. Here, we simply note that considering alternative anchors can help us avoid insufficient adjustment to anchors that are skewed.

    Alternative Points of View

    Sometimes the alternative we need to consider is some other person’s point of view. We may need to put ourselves in their shoes.

    Straw Man

    We commit the straw man fallacy if we distort someone else’s position to make it easier to attack (10.4). Here, considering alternatives means asking how the person who defends the view we dislike would state their position. What reasons would they give to support it? The point here is just to play fair, to try to find the strongest version of the view in question and evaluate it. Consider the alternative, stronger, ways to defend it.

    Actor-Observer Asymmetry

    We tend to see other people’s behavior as internally caused, but to see our own as externally caused (23.3). We see Wilbur as donating money to the Cancer Society because he is kind and generous; by contrast, we tend to see ourselves as giving a donation because we think the cancer victims need help.

    The actor-observer asymmetry is a bias in our reasoning about people’s actions, both our own and those of others we observe. But it turns out that if an actor imagines herself in the situation of the observer, or if an observer imagines himself in the situation of the actor, this bias is reduced.

    Out-Group Homogeneity Bias

    This is the tendency to see out-groups (e.g., people of other races, religions, countries, or sexual orientations) as more homogeneous than they really are (25.5). Groups tend to see themselves as quite varied, whereas members of other groups are thought to be much more like one another. One way to consider the alternative is simply to learn more about the other group. But imagining what the group must seem like to those in the group may also make it easier to see that they probably vary as much as the members of groups with which we are more familiar. For example, people in familiar groups with quite different backgrounds, occupations, and ages often see things in different ways. So, are members of an out-group with quite different backgrounds, occupations, and ages likely to see things in the same way?

    Dissent

    We encountered several cases in Chapter 22 (Asch’s conformity studies, Milgram’s studies of obedience) where the most effective way of reducing conformity or obedience was to have at least one person who refused to go along. Often even one dissenter was enough to eliminate groupthink or conformity or mindless compliance.

    Here one way to consider alternatives is to imagine what a dissenter might say or do. It is also good to encourage the actual presentation of alternative points of view by encouraging free expression. Some group biases are less likely, for example, if group members are allowed to express disagreement, since the group will be exposed to alternative points of view. It will be easier to break the grip of a certain way of looking at the relevant issues.

    Evaluating Sources of Information

    We have no choice but to rely on others for information, but some people are better sources than others. When someone who seems like an expert makes a claim, it is useful to consider perspectives or points of view from which the claim would seem less plausible, and ones from which it would seem more plausible. Then, consider it from the source’s own point of view: would they have any reason to make this claim if they didn’t think it was true?

    Is there “Invisible Data”?

    Good reasoning requires us to stay focused on issues that are relevant to the points we are reasoning about. Many pitfalls in reasoning, including some of those discussed earlier in this chapter, involve one of two kinds of mistakes about relevance. Sometimes we overlook or ignore evidence or facts that are relevant. And sometimes we rely on evidence or facts that are irrelevant. We might think of the first sort of error as one of omission (we overlook relevant information) and the second as one of commission (we incorrectly treat irrelevant information as though it were relevant).

    We will begin with cases where we overlook or underutilize relevant data. The slogan here, the question to ask ourselves, is: “Is there invisible data?” Is there relevant data that is “invisible” to us because we overlooked it?

    Confirmation Bias

    Confirmation bias is our tendency to look for, notice, and remember confirming or positive evidence (that supports what we think) while overlooking or downplaying disconfirming or negative evidence (which suggests that what we think is wrong) (18.5). We often have a blind spot for outcomes and possibilities at odds with our beliefs and expectations.

    Relevance

    Relevance involves a relationship between one statement and another. A piece of information can be highly relevant to one conclusion but completely irrelevant to others. For example, if Wilma is convinced that people with red hair have bad tempers, she may be more likely to notice or remember cases where redheads fly off the handle and to overlook or forget cases where they don’t. Here the invisible data is obvious—once we mention it.

    Here is another example. People who make admissions decisions at universities typically receive limited feedback on how well they are making their selections. They do learn about the performance of the students they admit, but they rarely get feedback on the people they reject. How well did those applicants do at the universities they ended up attending? How well might they have done at this university, the one that rejected them? Here, half of the relevant evidence is “invisible” to the admissions officers, so it is difficult for them to form an accurate assessment of how well they are doing.

    In cases like this it is often difficult to get the invisible data, although there are a few exceptions, like football recruiting, where someone who wasn’t offered a scholarship goes on to be a star at another university. But often invisible data is easy to see – as with Wilma and redheads – once it occurs to us to look.

    Illusory Correlations

    Belief in illusory correlations often results from considering only some of the relevant cases, e.g., the people who took Vitamin C and got better. We might learn that the apparent correlation is illusory if we also considered people who did not take Vitamin C (since at least as many of them might have gotten better too). The problem is that we often overlook or ignore data or information about this group, so it remains invisible to us. To take another example (15.4), if we looked for invisible data it might also show that various superstitions are based on illusory correlations.

    Insensitivity to Base Rates

    We often ignore or underutilize information about base rates in judging probabilities and making predictions (17.4). For example, we may think that a person has a high probability of being infected with HIV after testing positive on an accurate, but not perfect, test, without accounting for the relatively low incidence of such infections in the general population. Or we may think that someone has a certain job, like being a professional football player, because they fit the profile or stereotype we associate with that job, while ignoring the low base rate of professional football players in the general population.

    When we overlook base rates, we ignore relevant information; it remains invisible to us. Pausing to ask whether information about base rates is relevant can help make that information visible. Remember that you usually do not need any precise knowledge of base rates. Just knowing that there are a lot more of one sort of thing (e.g., bankers) than another (e.g., professional football players) is often enough.
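
    A short worked example (with invented numbers) shows how much the base rate matters. Suppose 1 person in 1,000 is infected and the test is 99% accurate in both directions; Bayes’ rule then gives the probability of infection given a positive result, and it is far lower than the test’s accuracy might suggest.

        # Hypothetical figures, chosen only to illustrate the effect of a low base rate.
        base_rate = 0.001            # 1 in 1,000 people infected
        sensitivity = 0.99           # P(positive | infected)
        false_positive_rate = 0.01   # P(positive | not infected)

        # Total probability of testing positive.
        p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)

        # Bayes' rule: P(infected | positive).
        p_infected_given_positive = sensitivity * base_rate / p_positive

        print(f"{p_infected_given_positive:.1%}")   # roughly 9%, not 99%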

    How Do We Know What Might be Relevant?

    It’s easy for a book to tell us we shouldn’t overlook relevant information, but how can we tell what is relevant, especially in fields we don’t know much about? Things are typically open-ended, and we could go on collecting evidence for years. We often must act soon, however, and it’s important to know when to stop looking and act.

    There is no guideline for this, but in the real world the problem is rarely that people spend too much time looking for relevant information. The problem is that we usually don’t look enough. Indeed, often we don’t look at all. Moreover, the information we need is often obvious, once we pause to think about it. In short, in many cases fostering the habit of asking “what’s been omitted?” can help us reason more effectively.

    Is Some Data Too Visible?

    The flip side of overlooking relevant information is basing our reasoning on irrelevant information. Even worse, in some cases we have good information, but for one reason or another we don’t use it. Indeed, bad information sometimes drives out good information. In cases like this the evidence doesn’t start out invisible, but bad or irrelevant information makes it difficult to keep it in sight (as with the dilution effect (17.4)).

    We saw several cases of this sort in Chapter 10, where we examined various species of the fallacy of irrelevant reason. Later we saw that we often focus on samples that are highly available in memory or imagination, while ignoring more relevant and representative samples, even when we know about them (17.2).
