Chapter 11

Critical Thinking and Cognitive Biases

Mark Battersby and Sharon Bailin

1. Introduction

A primary aim of critical thinking research and teaching is to improve human reasoning with the intent of getting people to be more rational with respect to their beliefs and actions. For the Informal Logic/critical thinking community, this effort has largely taken the form of analysing the structure of arguments and identifying certain types of errors or problems in reasoning, in particular those commonly identified as fallacies. The focus is on exposing the nature of the error – showing why these particular arguments are fallacious. The pedagogical assumption underlying this focus is that once people are aware of these errors, they will notice them in the arguments of others and be able to resist them, and that they will avoid making these errors themselves.

Much valuable work has been done in this area, including contributions to an understanding of the nature of fallacies, the identification and characterization of a growing number of fallacies, and innumerable rich ideas and strategies for teaching critical thinking. The identification of reasoning errors, in this context, has been based largely on the work of philosophers studying arguments and not on empirical studies of reasoners. In addition, relatively little work has been done by philosophers (with some notable exceptions, e.g., Walton 2010) on trying to understand why these errors are so common and persuasive.

Since the 1970s, however, much important work on human reasoning has also been done by psychologists who have undertaken systematic empirical studies of reasoning errors and produced many insightful accounts of these errors (Wason 1966; Wason and Shapiro 1971; Tversky and Kahneman 1974; Slovic 1969; Slovic et al. 1977; Kahneman, Slovic and Tversky 1982; Stanovich 2011; Kahneman 2011). Some of these errors map onto identified informal logic fallacies, but some of them have not been previously identified by philosophers.

The critical thinking community has, however, by and large given little attention to the work of these cognitive psychologists.[1] It is our contention that this work can make a contribution both to reflection on reasoning errors and to the development of an appropriate pedagogy to instruct people in how to avoid these errors.

In this paper, we explore some of the intersections between this psychological research on reasoning and the work of critical thinking theorists, as well as the implications of this research for conceptualizing and teaching critical thinking. The paper addresses this theme in terms of the following aspects:

  • what this work can add to our understanding of reasoning errors in general, and of the reasoning errors identified by critical thinking theorists in particular
  • which reasoning errors identified by this research are not typically identified by the critical thinking community
  • the ways in which this research can inform and help to enhance critical thinking instruction.

2. Psychological Versus Philosophical Accounts

Although both philosophers and psychologists offer detailed accounts of reasoning errors, there are important differences between the accounts. Philosophical accounts are primarily normative. The work of philosophers has consisted in specifying the norms of logical reasoning as well as identifying errors of reasoning which are common in arguments and showing in what way they are logically erroneous or epistemologically deficient.

The accounts of cognitive psychologists, in contrast, are largely descriptive, and to some extent explanatory. Their work consists in conducting empirical studies of people engaged in tasks that require reasoning and critical thinking. By means of these studies, they have been able to identify errors that are commonly made, identify patterns in the types of errors made which reflect cognitive biases (errors which are systematic and predictable), amass evidence regarding the frequency and tenacity of such errors, and investigate the circumstances which tend to be correlated with their occurrence. In addition, based on the data accumulated, some cognitive psychologists have also proposed explanatory accounts of these cognitive biases in terms of their likely origins as well as a conceptual framework for understanding how they function.

3. Enhanced Understanding of Reasoning Errors

The obvious question, then, is what, if anything, can such a descriptive cum explanatory account add to our understanding that might help us in thinking about and teaching critical thinking?

The findings of the various studies conducted by cognitive psychologists detail an extensive range of cognitive errors which are common and predictable. And many of the fallacies identified by informal logic can be seen as particular instances or manifestations of certain of these cognitive biases. The fallacy of popularity, for example, is likely an instance of the bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. And the fallacy of hasty conclusion could be a result of any of: belief bias — where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion; clustering illusion — the tendency to see patterns where none actually exist; or confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions. The elucidation and detailing of various cognitive biases can give us a richer understanding of those errors in reasoning which have already been identified by informal logicians.

Many cognitive biases describe systematic errors in reasoning which are not among those traditionally highlighted by critical thinking theorists, however. Two examples are loss aversion, where the disutility associated with giving up an object is seen as greater than the utility associated with acquiring it, and recency bias, the tendency to weigh recent events more heavily than earlier events (such cognitive biases will be discussed in more detail in the next section). The cognitive bias literature can, then, add to the repertoire of reasoning errors which deserve attention by critical thinking theorists and instructors.

In addition to detailing a list of errors, what the research on cognitive biases also indicates is that these errors are systematic and predictable, but also extremely widespread and very tenacious. These are not errors that are made occasionally by people who have momentary lapses in their thinking. Nor are they necessarily the result of people’s failure to understand the relevant logical norms. The research provides convincing evidence that they are, rather, very common and extremely difficult to resist. This is an aspect of cognitive biases that needs to be taken into account in critical thinking instruction.

Another helpful aspect that arises from the research is information regarding under what conditions these errors are most likely to occur and whether there are circumstances or conditions which can mitigate them. This type of information can be useful for critical thinking instruction in providing a basis for the development of strategies to help avoid these errors.

In addition to the guidance provided by the research itself, the explanatory accounts offered by cognitive psychologists also give us a framework for attempting to understand why we make these errors. The ubiquity and tenacity of cognitive biases demonstrate that these are not simply errors in reasoning; they are errors that persuade. The theoretical accounts offer an explanation for why it may be that we are persuaded by them.

These accounts differ from those generally offered by philosophers, which tend to view the primary source of human unreason as the emotions (the explanations of reasoning errors offered in contemporary textbooks, for example, tend to be in terms of ego involvement or ethnocentrism). While not denying that emotional sources can often be a cause of irrationality, the work of cognitive scientists has shown that many reasoning errors are grounded primarily in natural reasoning processes.

What many psychologists have argued is that humans have, over time, evolved a set of quick inference tendencies which allow a rapid, almost immediate response or reaction. Some examples of these quick inferences are detecting hostility in a voice, driving a car on an empty road, understanding a simple sentence, and answering a simple math problem. Some of these fast mental activities are innate and automatic while others are based on skills and knowledge which have become automatic through prolonged practice (e.g., driving on an empty road, solving a simple math problem) (Kahneman 2011, pp.21-24). This type of thinking is referred to by Kahneman (2011) as System 1 or fast thinking.[2] This type of quick inference-making is sufficiently reliable to stand us in good stead in many circumstances, providing quick and generally appropriate initial reactions to challenges under routine conditions. But such fast thinking can also lead to cognitive biases, as these immediate, unreflective inference tendencies are not adequate to the task of dealing with more complex challenges. Tasks such as performing complex calculations, monitoring the appropriateness of one's behaviour, comparing items for overall value, or checking the validity of a complex logical argument require attention, deliberate mental effort, and conscious reasoning. This type of more deliberate, controlled, and effortful thinking is referred to by Kahneman as System 2 or slow thinking.[3] According to Kahneman, slow thinking is required in order to avoid cognitive biases.

So why are cognitive biases so persuasive? The two-systems theory would suggest that they persuade us because they arise from natural inferential tendencies. These tendencies are quick and cognitively easy and are generally the first line of attack when we are faced with cognitive challenges. Moreover, it is rational in many circumstances to rely on these tendencies; they are what allow us to function most of the time. But they can lead to errors in some circumstances and it is important in such circumstances to institute strategies to become more controlled and deliberate. The cognitive bias research suggests that this is not always easy as fast thinking occurs automatically. But it is possible.

While these theoretical accounts provide a plausible explanation of the persuasive power of cognitive biases in general, accounts of particular cognitive biases may also help us understand why particular errors are persuasive. This is an element that has been missing in most accounts of fallacies in the critical thinking literature. Fallacies are typically identified in terms of what is erroneous about them. But fallacies are not just any errors in reasoning; they are persuasive errors (Battersby and Bailin 2015; Walton 2010). It is the existence of underlying cognitive biases which makes the fallacious inferences tempting. Thus we would argue for the need to conceptualize fallacies not only in terms of the errors they exemplify, but also in terms of their persuasive power.[4] Understanding why particular fallacies persuade us provides us with a tool for helping us to resist their thrall.

For example, while philosophers have identified the error of making hasty generalizations based on anecdotal evidence, cognitive psychologists have identified the cognitive bias of the "availability heuristic": estimating what is more likely by what is more available in memory, which is biased toward vivid, emotionally charged, or easily imagined examples (e.g., a plausible story). In a famous study, Tversky and Kahneman (1983) asked which was more likely:

  1. a massive flood somewhere in North America this year, in which more than 1,000 people drown
  2. an earthquake in California sometime this year, causing a flood in which more than 1,000 people drown.

Despite the fact that what is described in statement #2 is included in statement #1, a large percentage of people found statement #2 more likely, since the latter provides a more plausible and easily imagined story. The philosophical accounts identify this reasoning as an error; the psychological accounts tell us that we tend to be persuaded by this particular error because people generally have a strong tendency to make judgments of likelihood on the basis of the ease of imagining an event, an ease which can be much facilitated by a plausible story (Kahneman 2011, pp.159-60).
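The probabilistic point at issue can be put briefly (a standard rendering of the conjunction rule, not notation used by the authors): a conjunction can never be more probable than either of its conjuncts, since

\[
P(\text{flood} \wedge \text{earthquake}) \;=\; P(\text{flood}) \cdot P(\text{earthquake} \mid \text{flood}) \;\le\; P(\text{flood}),
\]

because any conditional probability is at most 1. Judging statement #2 to be more likely than statement #1 therefore violates a basic law of probability, however much the added detail improves the story.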

Another example is provided by the fallacy of questionable cause. Critical thinking theorists have pointed out this fallacy, but the tendency to commit it can be seen as grounded in the strong tendency, identified by psychologists, to see causal relationships even between unrelated events in order to make a coherent story. This phenomenon is nicely illustrated by an experiment by Hassin, Bargh and Uleman (2002) in which participants were given the following to read:

After spending a day exploring beautiful sights in the crowded streets of New York, Jan discovered that her wallet was missing.

When asked to recall the story afterwards, participants associated the word "pickpocket" with the story more frequently than they did the word "sights," despite the fact that "sights" appeared in the story while "pickpocket" did not. The juxtaposition of the ideas of a lost wallet, New York, and crowds prompted participants to infer a coherent causal story to explain the loss of the wallet despite the lack of any evidence presented in the story to support this inference.

An important aspect of System 1 or fast thinking highlighted by cognitive psychologists is that it is coherence-seeking – it is prone to construct a coherent story out of whatever information is available, whatever its quality and however limited. A common error in reasoning which is a result of this tendency is jumping to conclusions (hasty conclusion), and a particularly troubling manifestation is the failure to look at both sides of an issue or to seek alternatives. A striking illustration of this phenomenon is provided by one study (Brenner, Koehler and Tversky 1996) in which participants had to make a decision based on one-sided evidence. All the participants were given the same scenarios providing background material to a legal case, but then one group heard only a presentation by the defence lawyer, one group heard only a presentation by the prosecutor, and one group heard both presentations (each lawyer framed the issue differently but neither presented any new information). Despite the fact that all the participants were fully aware of the setup and could easily have generated the argument for the other side, the presentation of the one-sided evidence had a significant effect on the judgments.

Moreover, the consideration of only one side of the issue also resulted in the bias of overconfidence. The participants who heard one-sided evidence were more confident of their judgments than those who heard both sides. This is not surprising as it is easier to construct a coherent story with less information. The strength of this tendency to make confident judgments based on limited evidence is a robust and significant finding of the cognitive bias research and strongly suggests the need for deliberate measures and strategies to counter this tendency.

4. Identifying Additional Errors in Reasoning

The list of errors in reasoning identified by the cognitive science research which go beyond those typically identified by Informal Logic is too lengthy to detail here. We shall, instead, focus on one of the most striking discoveries by Kahneman and Tversky, the phenomenon of anchoring — the influence of irrelevant initial information when estimating a value or making a judgment. In the standard research example, subjects are given a random number, a number which they know is random, and are then asked questions such as how many of the member states of the UN are African. Those given a larger number estimate a relatively larger number of African states, and those given a smaller number estimate a smaller number. We all recognize that when negotiating, it is common practice for the seller to price her object high and for the buyer to try to lowball. But these strategies, while they may exploit the phenomenon of anchoring, also introduce relevant considerations: they give us some idea of what price the seller or buyer is seeking. What is striking about the phenomenon of anchoring is that the anchoring numbers are known by the subjects to be irrelevant. This might seem to be merely a quirky fact about human psychology, but a number of studies have demonstrated that it is a phenomenon with profound social implications.

In one study, for example, German researchers examining the effects of anchors on judicial decision-making were able to show that even trained judges who knew that the information they were given was irrelevant were still influenced in their decision-making in a manner similar to the naïve subjects described above. The researchers ran a number of different experiments providing the judges with information of varying degrees of relevance. In one example, participants were presented with a realistic case description of an alleged rape and were told that during a court recess they had received a telephone call from a journalist who asked, "Do you think that the sentence for the defendant in this case will be higher or lower than 1 (or 3) years?" Subsequently, they were asked for their own decision and also asked how certain they felt about it. Participants who had been exposed to the high anchor chose considerably higher sentences (mean 33 months, standard deviation 9.6) than those exposed to the low anchor (mean 25 months, standard deviation 10), and participants generally felt fairly certain about their decisions. Other experiments have yielded similar, troubling results (Englich, Mussweiler and Strack 2006).
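To give a rough sense of the size of this effect, the reported means and standard deviations can be combined into a standardized difference (a conventional Cohen's d computed with a pooled standard deviation; this calculation is ours, not one reported in the study):

\[
d \;=\; \frac{33 - 25}{\sqrt{\tfrac{9.6^{2} + 10^{2}}{2}}} \;\approx\; \frac{8}{9.8} \;\approx\; 0.8 .
\]

An eight-month gap amounting to roughly four-fifths of a standard deviation is, by the usual conventions, a large effect for a piece of information the judges knew to be irrelevant.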

5. Enhancing Critical Thinking Instruction

In what ways might this research inform and help to enhance critical thinking instruction? Cognitive psychological accounts suggest that noticing that we are succumbing to the influence of a cognitive bias is actually quite difficult. As Kahneman suggests, “The best we can do is … learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high” (Kahneman 2011, p.28).

Recognizing certain inferences as errors is certainly a sine qua non for avoiding such mistakes, and critical thinking pedagogy has focused effectively on this task. It is not sufficient, however. The cognitive bias research has demonstrated just how strong and ubiquitous these tendencies are. Thus we would argue that helping students to see the naturalness and allure of cognitive biases would be important for helping them to resist their pull. In particular, we have argued for the need to teach students to identify fallacies not only in terms of the errors they commit but also in terms of their persuasive power.[5]

One of the most important points to emerge from the cognitive bias literature with implications for pedagogy is the necessity to put the brakes on our tendency to rush to inference under certain circumstances. Dealing with complex mental challenges and drawing complex inferences require the kind of deliberate, controlled, and effortful thinking characteristic of System 2 or slow thinking. Thus what is required when trying to make a judgment is a conscious attempt to make our thinking more deliberate. Strategies such as following a procedure or a set of guiding questions (Bailin and Battersby 2016, pp.26-36) and consciously monitoring our thinking process (Bailin and Battersby 2016, pp.274-275) are essential aspects of rational decision making.

In addition, it is possible to institute strategies to counter the effects of some of these quick inferential tendencies. The tendency to make confident judgments on the basis of limited evidence seems to be particularly strong, and one manifestation of this tendency is the failure to look at both sides of an issue or to seek alternatives (sometimes called "myside bias" by cognitive psychologists). The common habit among philosophers of seeking counterexamples to any claim is a crucial antidote to this tendency. The strategies of actively seeking out counter-evidence to one's views, looking for and seriously considering the arguments on various sides of an issue, and deliberately considering alternative positions when making a judgment can go a long way toward countering this tendency to rush to judgment. The development of the habit of considering counterexamples and alternatives is a crucial aspect of critical thinking instruction and is necessary in order to frustrate the natural tendency to leap to conclusions.

The cognitive bias research has also served to highlight the power of the framing effect – the tendency to draw different conclusions from the same information depending on how that information is presented (for example, people are more likely to accept a risk if they are told that there is a 10% chance of winning rather than a 90% chance of losing). Deliberately attempting to reframe or change the way one views a situation may be helpful in countering this tendency. For example, one can attempt to view marijuana use as a harm issue rather than as a crime issue and see what effect this has on one's judgment about the legalization of marijuana. The question then becomes: how do the harms resulting from illegality compare to any reasonably anticipated harms to health? When engaging in argumentation, one can try to view the enterprise in terms of making the best judgment rather than in terms of winning or losing. And trying to identify with being reasonable rather than with a particular view can be a helpful strategy for developing open-mindedness and fair-mindedness in inquiry (Bailin and Battersby 2016, p.274).
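The two descriptions in the risk example are, of course, informationally equivalent; in the simplest terms,

\[
P(\text{winning}) = 0.10 \quad\Longleftrightarrow\quad P(\text{losing}) = 1 - 0.10 = 0.90,
\]

so any difference in people's willingness to accept the risk must be attributed to the wording rather than to the underlying probabilities.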

The bias of overconfidence – the tendency to have more confidence in one's judgment than is warranted by the weight of evidence – is another common cognitive bias which may be somewhat mitigated through deliberate efforts. The strategies outlined above for promoting an examination of the full range of arguments on all sides of an issue are necessary in order to make a judgment with the appropriate degree of confidence, as is making students aware of the need to give explicit consideration to how much weight various arguments carry in making an overall judgment (Battersby and Bailin 2011, pp.152-157; Bailin and Battersby 2016, pp.239-244).

An important concept which runs through the cognitive bias literature is that of mental effort. Fast thinking is quick and easy, virtually effortless, but slower, more deliberate thinking requires more mental effort. Kahneman and others have suggested that our minds have a tendency to go for the easier route much of the time (Kahneman 2011, pp.39-49). For example, the research has shown repeatedly that people have a strong tendency to see an erroneous answer to a simple math problem as correct, or an invalid syllogism as valid when the conclusion is believable (the belief bias error) (Evans 2008). The intuitive answer suggests itself immediately and people generally do not bother to check the reasoning. These are cases where the reasoning could be checked without too much difficulty. Nonetheless, overriding the intuitive response requires some mental work, and most people do not appear to be initially inclined to put in this effort.
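A standard illustration of such a problem is Kahneman's bat-and-ball puzzle (Kahneman 2011), which is not spelled out in the passage above but makes the point vividly: a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents comes to mind immediately, but a brief check shows it is wrong:

\[
\text{ball} + \text{bat} = 1.10, \qquad \text{bat} = \text{ball} + 1.00
\;\Rightarrow\; 2 \cdot \text{ball} = 0.10
\;\Rightarrow\; \text{ball} = 0.05 .
\]

The ball costs 5 cents; the intuitive answer would make the total $1.20. The check takes only a moment, yet, as the research cited above suggests, many respondents never perform it.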

An important idea for our pedagogical purposes is Kahneman's argument that this failure is due at least in part to insufficient motivation (2011, p.46). Indeed, the fact that many people willingly put considerable mental effort into certain activities (e.g., Sudoku) when they find them interesting and engaging suggests that a task can elicit mental energy when it is seen as being worth the effort. Thus one of our challenges as educators is to help students to see thinking critically as being worth the mental effort.[6]

References

Bailin, S. and M. Battersby. 2016. Reason in the Balance: An Inquiry Approach to Critical Thinking, 2nd Edition. Cambridge, Mass.: Hackett. (1st Edition, 2010, McGraw-Hill Ryerson.)

______. 2018. “Developing an Evidence-Based Mode of Believing in an Age of ‘Alternative Facts.’” Proceedings of 9th ISSA Conference. Amsterdam.

Battersby, M. and S. Bailin. 2011. “Guidelines for Reaching a Reasoned Judgment.” In Conductive Argument: An Overlooked Type of Defeasible Reasoning, 145–157, edited by J.A. Blair and R.H. Johnson. London: College Publications.

______. 2015. “Fallacy Identification in a Dialectical Approach to Teaching Critical Thinking.” Inquiry: Critical Thinking Across the Disciplines 30, 1: 9-16.

______. 2017. "Reasoning Together: Fostering Rationality Through Group Deliberation." Proceedings of the 2nd European Conference on Argumentation, Fribourg, 2017.

Brenner, L., D. Koehler, and A. Tversky. 1996. “On the Evaluation of One-sided Evidence.” Journal of Behavioral Decision Making 9: 59-70.

Englich, B., T. Mussweiler and F. Strack. 2006. "Playing Dice with Criminal Sentences: The Influence of Irrelevant Anchors on Experts' Judicial Decision Making." Personality and Social Psychology Bulletin 32: 188-200.

Evans, J. 2008. “Dual Processing Accounts of Reasoning, Judgments and Social Cognition.” Annual Review of Psychology 59: 255-278.

Hassin, R., J. Bargh and J. Uleman. 2002. "Spontaneous Causal Inferences." Journal of Experimental Social Psychology 38: 515-22.

Kahneman, D. 2011. Thinking, Fast and Slow. London: Penguin.

Kahneman, D., P. Slovic and A. Tversky, eds. 1982. Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.

Kenyon, T. 2014. “False Polarization: Debiasing as Applied Social Epistemology.” Synthese 191, 11: 2529–47. https://doi.org/10.1007/s11229-014-0438-x.

Kenyon, T. and G. Beaulac. 2014. “Critical Thinking Education and Debiasing.” Informal Logic 34, 4: 341–63. https://doi.org/10.22329/il.v34i4.4203.

Kuhn, D. 2015. “Thinking Together and Alone.” Educational Researcher 44, 1: 46-53.

Maynes, J. 2015. “Critical thinking and Cognitive Bias.” Informal Logic 35, 2: 183–203.

______. 2017. “Steering into the Skid: On the Norms of Critical Thinking.” Informal Logic 37, 2: 114–128.

Mercier, H. and D. Sperber. 2017. The Enigma of Reason. Cambridge, Mass: Harvard University Press.

Slovic, P. 1969. “Analyzing the Expert Judge: A Descriptive Study of a Stockbroker’s Decision Processes.” Journal of Applied Psychology 53: 255-263.

Slovic, P., B. Fischhoff and S. Lichtenstein. 1977. “Behavioral Decision Theory.” Annual Review of Psychology 28: 1–39.

Stanovich, K. 2011. Rationality and the Reflective Mind. Oxford: Oxford University Press.

Tversky, A. and D. Kahneman. 1973. “Availability: A Heuristic for Judging Frequency and Probability.” Cognitive Psychology 5: 207-232.

______. 1974. "Judgment Under Uncertainty: Heuristics and Biases." Science 185: 1124-1131.

______. 1983. "Extensional Versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment." Psychological Review 90: 293-315.

Walton, D. 2010. “Why Fallacies Appear to be Better Arguments Than They Are.” Informal Logic 30, 2: 159-184.

Wason, P.C. and D. Shapiro. 1971. “Natural and Contrived Experience in a Reasoning Problem.” Quarterly Journal of Experimental Psychology 23: 63–71.

Wason, P.C. 1966. “Reasoning.” In New Horizons in Psychology, edited by B.M. Foss. Harmondsworth: Penguin.


  1. For recent work on understanding cognitive biases and their significance to critical thinking, see Kenyon 2014; Kenyon and Beaulac 2014; Maynes 2015, 2017; and Mercier and Sperber 2017.
  2. This type of thinking has been referred to variously as automatic, experiential, heuristic, implicit, associative, intuitive, and/or impulsive (Evans 2008).
  3. This type of thinking has been referred to variously as controlled, rational, systematic, explicit, analytic, conscious, and/or reflective (Evans 2008). See Evans for an overview of a number of dual-systems theories of reasoning and cognition.
  4. In Reason in the Balance (Bailin and Battersby 2016), we define a fallacy as an argument pattern whose persuasive power greatly exceeds its probative value (i.e., evidential worth). We then describe each fallacy in terms of two aspects: 1. “logical error” – an explanation of why the argument has limited or no probative value, and 2. “persuasive effect”– an explanation of why the argument has a tendency to be persuasive.
  5. See note #4.
  6. We have further explored the issue of bias and pedagogical strategies for addressing bias in subsequent work. We have argued that the confrontation of conflicting views inherent in our inquiry approach (and in particular the use of the dialogical arguments table) can help mitigate confirmation bias (Bailin and Battersby 2016, 2018) as can group deliberation involving the confrontation of diverse views (Bailin and Battersby 2018; Battersby and Bailin 2017). Battersby’s paper “Enhancing Rationality: Heuristics, Biases, and the Critical Thinking Project” (in this volume) criticizes how the term “bias” is used by behavioural economists in relation to economic decision-making, arguing that it is based on an ideological notion of rationality as purely instrumental and self-interested. Thus these alleged biases should not be included in critical thinking instruction.

License

Inquiry: A New Paradigm for Critical Thinking Copyright © 2018 by Windsor Studies in Argumentation & The Authors is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.