Introduction to Philosophy

2.3 Developing Good Habits of Mind


Learning Objectives

By the end of this section, you will be able to:

  • Define epistemic humility and the Dunning-Kruger effect.
  • Identify three strategies to increase the ability to think objectively.
  • Analyze emotional responses to information.

One of the ways to respond to cognitive biases is to develop good habits of mind. There are no quick fixes or easy solutions to cognitive biases. Remember, these biases are a result of the way the brain works. Nevertheless, metacognition and critical reflection, along with good mental habits, can help combat these natural tendencies in thought that otherwise lead us astray. The strategies outlined below can help you become a better philosopher. You should compare them with the methods philosophers use to arrive at truth, covered in Chapter 1.

Connections

See the introduction to philosophy chapter to learn more about how philosophers arrive at the truth.

Strive for Objectivity

We are all prone to assuming that our own experience or perspective is generally true for others. To be more objective in thinking about issues, problems, or values, we should actively engage in strategies that remove us from our naturally subjective mindset. In this section, we will explore several strategies for approaching philosophical problems with less subjective bias.

Abstract from Specific Circumstances

Most people’s point of view is based on generalizing from their specific circumstances and experiences. However, if your view of morality, consciousness, or free will is tied to notions that come from a specific time or location, then your view is not likely to be objective. Your personal experience has limitations when it comes to understanding what is going on in the world at large. To arrive at more general and representative notions, use your imagination to separate the specific properties of your experience from your worldview. This process of abstraction can make the concept appropriately general. For instance, if you wish to imagine a governing arrangement among citizens, you will probably default to the governmental organizations you are familiar with in your community, state, or nation. But these institutions differ from the way government works in other countries or in different eras of history. So when you think about justice in political organizations, it is important to imagine those not limited by your personal experience, moment in history, or location.

In some cases, however, the specific features of your experience are indispensable to the philosophical position you wish to take. In such instances, your specific experience provides critical information that needs to be preserved. For example, the prevailing views in philosophy as well as any other subject are biased in that they reflect the views of the dominant cultural group who wrote the texts. If you are a person who belongs to a nondominant or minority group or a group that has been historically marginalized, your personal experience may shed new light on a problem. In such cases, specific experience can help you, as well as others, reshape the general view so that it is more comprehensive and inclusive. In these cases, abstracting from the particular circumstances may not be useful.

Promote Alternative Points of View

Actively considering points of view contrary to your own is most useful in political or ethical areas of philosophy. But a similar strategy may also be useful in metaphysics or epistemology. For instance, when considering issues in metaphysics, you may believe that parts of experience—like consciousness, God, or free will—cannot be explained by the natural sciences. Or, conversely, you may think there is a scientific explanation for everything. When considering these views philosophically, try to actively promote the alternative point of view. Sometimes this strategy is called steelmanning the opposing argument. When you steelman an argument, you make the strongest possible case in favor of it. This is the opposite of strawmanning an argument, in which you construct a weaker version of the argument to easily defeat it. You may be tempted to strawman arguments you naturally disagree with, but you will become a better philosopher when you steelman those arguments instead.

Connections

Learn more about the strawman fallacy in the chapter on logic and reasoning.

Identify Counterexamples

Generating counterexamples is an effective way to test your own or others’ claims. A counterexample is an instance that shows a general claim to be false; applied to an argument, it is a case in which all the premises are true yet the conclusion is false, demonstrating that the argument is invalid. Suppose someone wants to argue that the only legitimate way to know something is to have direct experience of it. To produce a counterexample to this claim, we must imagine something that everyone knows is true but that would be impossible to experience directly. Here is an example: I know my mother was born. Clearly, given that I was born, I had a mother, and she, too, must have been born to have given birth to me. My mother’s birth necessarily preceded my birth by many years, so it would be impossible for me to have any direct experience of my mother’s birth. And yet, just as surely as I know I was born, I know that my mother was born. Counterexamples are powerful tools to use in evaluating philosophical arguments. If you practice using this tool, you will become a better critical thinker.
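To make the structure of this counterexample explicit, it can help to restate the claim in the notation of first-order logic. The following is a minimal sketch: the predicates K ("is known") and E ("is directly experienced") and the constant m are introduced here purely for illustration and do not appear in the original argument.

    Claim:           \forall x \, (K(x) \rightarrow E(x))
    Counterexample:  K(m) \land \neg E(m), where m stands for my mother's birth

Because m is known but not directly experienced, the universal claim is false, and any argument that relies on it as a premise is unsound.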

Connections

See the section on counterexamples in the chapter on logic and reasoning for more discussion of this topic.

Maintain Skepticism of Strong Emotions

While emotions play an important role in thinking, they can also cloud judgment. Strong reactions to claims made by philosophers, other students, your professor, or anyone else may prevent you from considering the argument objectively. You should be wary of any strong attachment or aversion you feel toward a philosophical claim. Emotions can guide us, but they may threaten our ability to objectively consider the arguments being made.

To respond to strong emotions, use the tools of metacognition to reflect on the source of those emotions and attempt to manage them. There may be good reasons for your emotions, but recognize that those reasons, not the emotions themselves, are philosophically relevant. Manage emotions by taking a step back from your personal investment in the issue and considering it from another perspective. Sometimes a short break can allow the immediate emotional reaction to subside. Sometimes imaginative strategies can help; for example, substitute more neutral features for the features of the problem that trigger strong emotions. This advice is not meant to suggest that emotions are harmful or have no place in philosophical thinking. Instead, the purpose of this strategy is to remind you that the way to derive meaning and guidance from your emotions is to reflect on them and think through their causes, origins, or reasons.

Adopt Epistemic Humility

A final component of becoming a better critical thinker is adopting a stance of epistemic humility. As we have already seen, our thinking can be clouded by cognitive biases. Additionally, our perspective on the world is always colored by our own experience and rooted in the particular place and time in which we live. Finally, even our best scientific knowledge of the universe explains only a fraction of it, and perhaps even less of our own experience. As a result, we should recognize these limitations of human knowledge and rein in our epistemic confidence. We should recognize that the knowledge we do possess is fragile, historical, and conditioned by a number of social and biological processes.

Figure 2.5 The principle of epistemic humility calls upon us to recognize that the knowledge we possess is fragile, fallible, and colored by our own experiences. (credit: “Life is a long lesson in humility.” by e.r.w.i.n./Flickr, CC BY 2.0)

Question Yourself: Do I Really Know What I Think I Know?

We retain all sorts of beliefs from many different sources: memory, testimony, sense perception, and imagination. Some of these sources may be reliable, while others may not. Often, however, we forget the source of our beliefs and claim to “know” something simply because we have believed it for a long time. We may become very confident in believing something that never happened or did not happen in the way we remember it. In other cases, we may have been told something repeatedly, but the ultimate source of that information was unreliable. For instance, most people recommend wearing warm clothes outside when the temperature drops so that you do not “catch a cold.” This is the sort of wisdom that may have been passed down through generations, but it makes little sense from a medical standpoint. Getting a chill or even a drop in body temperature is unlikely, by itself, to cause a respiratory infection: colds are caused by viruses, not by cold weather. Without thinking through the source of the belief that “if you get cold, you may catch a cold,” you end up believing something that is not true.

Be Aware of the Dunning-Kruger Effect

An even more pernicious form of epistemic overconfidence is revealed in the psychological phenomenon known as the Dunning-Kruger effect. David Dunning and Justin Kruger demonstrated a widespread illusion in which incompetent people or novices rate their own knowledge of a subject more highly than they ought to, while highly competent people or experts rate their knowledge slightly lower than they ought to. These findings do not mean that the experts considered themselves to be less competent than novices. In fact, experts are fairly accurate in rating their own knowledge. However, they tend to assume that everyone else has a similar level of expertise. By contrast, novices consider themselves to be far more competent than others and fail to recognize their own incompetence, which can be dangerous in many situations.

The lesson from the Dunning-Kruger effect is that you should be extremely wary when assessing your expertise about anything, but especially about something that is a new area of learning for you. The reality is that your intuitive sense of your own knowledge is likely to be inaccurate. It takes time to build expertise in a subject area, and the expert is more capable of assessing their own knowledge accurately.
