Introduction to Philosophy

2.2 Overcoming Cognitive Biases and Engaging in Critical Reflection


Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we took some time in the previous section to recognize why we fall prey to them. Now we need to understand how to replace easy, automatic, and error-prone thinking with more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

Connections

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.

Watch the video to orient yourself before reading the text that follows.

Video

Cognitive Biases 101, with Peter Baumann

Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. He found that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit for testing hypotheses and is an effective way to avoid confirmation bias.
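The logic of Wason’s task can be sketched in a few lines of Python. This is an illustrative sketch, not a reproduction of the experiment: the function names and the particular test triples are invented for this example. It shows why confirming tests are uninformative while a single falsifying test exposes a too-narrow hypothesis.

```python
def secret_rule(triple):
    """The experimenter's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def subjects_hypothesis(triple):
    """A typical subject's guess from seeing 2, 4, 6: numbers increasing by two."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirming tests: triples the subject's hypothesis already predicts are positive.
confirming = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]

# Every confirming test passes under BOTH rules, so it cannot tell them apart.
for t in confirming:
    assert secret_rule(t) and subjects_hypothesis(t)

# A falsifying test separates them: the secret rule accepts (1, 2, 3),
# but the subject's hypothesis rejects it, revealing the hypothesis is too narrow.
print(secret_rule((1, 2, 3)), subjects_hypothesis((1, 2, 3)))  # True False
```

Only the falsifying test carries new information, which is why subjects who generate nothing but confirming examples never discover the real rule.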

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Tribalism

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true. Watch the video below to improve your “tribal literacy” and understand the dangers of this type of thinking.

Video

The Dangers of Tribalism, Kevin deLaplante

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is reasoning that attaches greater value to things in which you have already invested resources than those things actually have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to invest irrationally in something that has lost its worth because of emotional attachment to the failed enterprise. People engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
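The contrast between sunk-cost reasoning and rational, forward-looking reasoning can be made concrete with a toy decision rule. This is an illustrative sketch only: the function names and dollar figures are invented for this example, not drawn from any real analysis.

```python
def sunk_cost_decision(invested_so_far, future_value, future_cost):
    """Fallacious rule: past investment leaks into the perceived value."""
    perceived_value = future_value + invested_so_far  # sunk costs counted
    return perceived_value > future_cost

def rational_decision(invested_so_far, future_value, future_cost):
    """Rational rule: only prospective value and prospective cost matter."""
    return future_value > future_cost  # invested_so_far deliberately ignored

# A failing venture: $50,000 already spent, worth only $10,000 going forward,
# but costing another $20,000 to keep running.
print(sunk_cost_decision(50_000, 10_000, 20_000))  # True  (keep investing)
print(rational_decision(50_000, 10_000, 20_000))   # False (cut your losses)
```

The two rules disagree precisely because the fallacious one lets an unrecoverable past expense inflate the venture’s perceived present value.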

A similar type of faulty reasoning leads to the gambler’s fallacy, in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
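The independence claim above can be checked with a quick Monte Carlo simulation. This is an illustrative sketch; the streak length, trial count, and seed are arbitrary choices. Whatever streak of heads we condition on, the estimated probability of tails on the next flip stays near one half.

```python
import random

def estimated_tails_probability(streak_len=3, trials=200_000, seed=0):
    """Estimate P(tails on next flip | previous streak_len flips were all heads)."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(trials)]  # True = heads
    outcomes_after_streak = [
        not flips[i]  # tails immediately after a streak of heads
        for i in range(streak_len, trials)
        if all(flips[i - streak_len:i])  # previous streak_len flips were all heads
    ]
    return sum(outcomes_after_streak) / len(outcomes_after_streak)

# Independence means the estimate hovers near 0.5 regardless of streak length.
print(round(estimated_tails_probability(streak_len=1), 2))
print(round(estimated_tails_probability(streak_len=5), 2))
```

A gambler reasoning correctly would conclude that the coin has no memory: conditioning on a longer streak changes how often the condition occurs, not the odds of the next flip.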

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

  • Confirmation bias: The tendency to search for, interpret, favor, and recall information that confirms or supports prior beliefs. Example: As part of their morning routine, a person scans news headlines on the internet and chooses to read only those stories that confirm views they already hold.
  • Anchoring bias: The tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. Example: When supplied with a random number and then asked to provide a number estimate in response to a question, people supply a number close to the random number they were initially given.
  • Availability heuristic: The tendency to evaluate new information based on the most recent or most easily recalled examples. Example: People in the United States overestimate the probability of dying in a criminal attack, since these types of stories are easy to vividly recall.
  • Tribalism: The tendency for human beings to align themselves with groups with whom they share values and practices. Example: People with a strong commitment to one political party often struggle to objectively evaluate the political positions of members of the opposing party.
  • Bandwagon fallacy: The tendency to do something or believe something because many other people do or believe the same thing. Example: Advertisers often rely on the bandwagon fallacy, attempting to create the impression that “everyone” is buying a new product, in order to inspire others to buy it.
  • Sunk cost fallacy: The tendency to attach a value to things in which resources have been invested that is greater than the value those things actually have. Example: A businessperson continues to invest money in a failing venture, “throwing good money after bad.”
  • Gambler’s fallacy: The tendency to reason that future chance events will be more likely if they have not happened recently. Example: Someone who regularly buys lottery tickets reasons that they are “due to win,” since they haven’t won once in twenty years.

Table 2.1 Common Cognitive Biases

Think Like a Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


Citation/Attribution

Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Attribution information
  • If you are redistributing all or part of this book in a print format, then you must include on every physical page the following attribution:
    Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • If you are redistributing all or part of this book in a digital format, then you must include on every digital page view the following attribution:
    Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
Citation information

© Jun 21, 2022 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.