Introduction to Philosophy

2.1 The Brain Is an Inference Machine

Learning Objectives

By the end of this section, you will be able to:

  • Describe the role of emotion in thought.
  • Explain how cognitive systems produce inferences without conscious thought.

One of the first steps to becoming a more critical and reflective thinker is to understand how and why you are prone to making mistakes in thinking. These mistakes are not the result of a lack of intelligence but are a function of the way our minds work and how they naturally lead us astray.

From a biological perspective, we have been shaped by hundreds of thousands of years of evolution, which have primed our brains to become extremely effective inference machines. An inference is the mental process that allows us to draw conclusions from evidence. While we tend to think of inference as a deliberative and conscious process, we infer all kinds of things unconsciously, effortlessly, and immediately; in fact, most of sense perception is a kind of inference. Inference making has been crucial to human survival, but our conclusions are not always correct. By becoming aware of how our brains function to ward off threats and provide us with “cognitive ease,” or a feeling of well-being and comfort, we can begin to correct for and guard against faulty thinking.

The Brain’s Adaptive Ability to Plan Ahead

One insight of evolutionary biology is that every cell and organ in our body is adapted to its local environment for the purpose of making it more likely that our genes will survive into the next generation. Consequently, it’s helpful to think about the brain’s role in propagating our genes. Our brains facilitate our survival and promote our ability to find a partner and reproduce by using thought, calculation, prediction, and inference. For this reason, our natural and genetically primed ways of thinking do not necessarily serve the goals of philosophy, science, or truth.

Silhouette of seated human figure, with the brain outlined within the skull. A thought bubble rises from the figure’s head.
Figure 2.2 The “mind-brain” problem points to the unclear relationship between our thoughts, feelings, and perceptions, and the neurological and electrochemical interactions that take place in the brain. (attribution: Copyright Rice University, OpenStax, under CC BY 4.0 license)

Philosophical Caveats about “Brain Talk”

Before we go much further, it is important to be cautious when we talk about brains and minds, which are distinct concepts. In fact, the relationship between mind and brain is one of the central problems of metaphysics, known as the “mind-body problem,” which might just as well be called the “mind-brain problem.” Briefly stated, the mind-body problem is the problem of understanding the relationship between the organic gray and white matter in our skulls (the brain) and the range of conscious awareness (the mind). We know that the brain and central nervous system provide the physical basis for our thoughts, perceptions, emotions, imagination, and desires—in short, our entire mental life. But biology does not tell us what the relationship is between our private mental life and the neurological, electrochemical interactions that take place in the brain.

Is the relationship of the mind to the brain like the relationship between lightning and electrical discharge, or a rainbow and the refraction of light through water droplets? In other words, is “the mind” just the term we use to label certain kinds of brain activity? Some philosophers think so. However, mental activity is not easily associated with any specific brain activity. Additionally, there seems to be something about the subjective experience of our mental life that is lost when we attempt to explain it fully in terms of brain activity. So other philosophers maintain that the mind is something different from the brain. Nonetheless, the mind and the brain are closely and somewhat mysteriously connected.

As a result, it can be helpful to use the resources of psychology and cognitive science (the study of the brain’s processes) to help us understand how to become better thinkers. We can think of psychology and cognitive science as providing a description of how the brain actually behaves. By contrast, when we study critical thinking, we are interested in how we ought to think. Being aware of how we do think may help us devise effective strategies for how we ought to think, but we should understand that the descriptions provided by psychology are not determinative. In this chapter, we explore psychological findings that can help you become more reflective about the ways your thinking can go wrong.

Connections

Read more about the nature of the mind and the mind-body problem in the chapter on metaphysics.

Representation as Projection

While you may consider thinking to be made up of ideas or thoughts, philosophers and cognitive scientists use the term representation to describe the basic elements of thinking. Representations are information-bearing units of thought. This notion of representation can be traced back to Aristotle and has played a significant role in the history of philosophy, but in contemporary philosophy the term representation is more precise. When we think about things, whether through perception, imagination, memory, or desire, we represent those things. What is represented may be something present and real, or it may be fictitious, imagined in the future, or remembered from the past. Representations may even be unconscious. That is, the mind may have some defined content that is directed toward an object without the person being aware that they have produced such a representation.

During the process of representation, even in a relatively simple case of visual perception, the brain makes a complex set of inferences. For instance, consider the checkerboard below. You might imagine that when you perceive something like a checkerboard, your brain passively takes a mental picture of the grid. In this analogy, the eye functions like the lens of a camera, and the brain develops the picture to present to the mind. But there are several problems with this model. First, where is the picture in your brain? Who is viewing the picture in your head? Further problems with the camera analogy are revealed when we examine optical illusions. Look at the checkered set of squares in Figure 2.3. Are the horizontal lines parallel?

A black-and-white checkered board with squares that do not align directly under one another creates an illusion that the squares are not the same size.
Figure 2.3 The horizontal lines on this grid are parallel, but unless you look at the image from the side, it is impossible to “see” this. This is one of many examples of common perceptual illusions. (credit: “Optical Illusion” by Selena N. B. H., CC BY 2.0)

In fact, the horizontal lines are parallel, but unless you look at the image from the side, it is impossible to visualize this. There are countless examples of these types of perceptual illusions. We represent the world outside as a stable picture that is completely filled in, in full focus, and uniformly colored. In reality, our visual field is limited and hazy around the edges, and colors change dramatically depending on lighting conditions, distance, movement, and a host of other factors. In fact, your brain is not passively capturing the world, like a camera, but is actively projecting the world so that it makes sense to you.

Neuroscientist David Eagleman (2011) uses the analogy of the front page of a newspaper to describe how perception works. The front page is a representation of the world’s events for a given day. Of course, it does not present a full or complete picture of the world, but a summary intended to highlight the events of consequence, those that have changed, and those that we are most likely to care about. Like a newspaper editor, your brain is working overtime to project an image of the world based on what is relevant to your survival. You unconsciously adjust the images you perceive to give you the impression that they are far away, nearby, moving, and so forth. Instead of the fully formed, three-dimensional image of the world we seem to see, we actually perceive a kind of sketch, highlighting what we need to know to navigate safely in our environment and obtain what we need. You probably think that sense perception is the clearest and most certain way you can know the world around you. As the adage says, “Seeing is believing.” To become a better critical thinker, however, you will need to become skeptical of some of your basic beliefs. There are times when you absolutely should not believe your lying eyes.

Emotions and Reason: Homeostasis and Allostasis

In addition to the editorial license of mental representation, thinking is not always as rational as we imagine. The neuroscientist Antonio Damasio (1994) was one of the first to popularize the notion that rational thought is tempered by emotions. He is critical of what he perceives as the philosophical bias against emotion in the history of philosophy. In Descartes’ Error, he says modern philosophers have neglected the role of emotions in thought, imagining that the goal of rational thinking is to eliminate the influence of emotions. Instead, his years of clinical work with patients revealed to him that emotions cannot be separated from reason. Our most rational thoughts are, in fact, guided, informed, and influenced by emotions. According to Damasio, reasoning and intelligence function best when we care about something. Without feelings, says Damasio, we are less rational, not more rational.

Damasio (1994) explains that emotions serve to maintain homeostasis in the brain through the chemical messengers known as neurotransmitters. Homeostasis is the biological tendency to find a neutral state of equilibrium (the word stasis means “standing still,” and homeo means “same or similar”). This process relies on a feedback loop in which current bodily states are monitored and then altered to bring the body back into balance. Most homeostatic processes in the body are unconscious, but emotions are linked to conscious awareness. For instance, when your blood sugar is low and your body needs calories, a series of chemical processes gives rise to the feeling of hunger. This is a conscious signal that you need to eat; it promotes behavior that ensures survival. Similarly, a rustling sound in the bushes at night will trigger a series of physiological responses (heightened senses, increased heart rate, pupil dilation, etc.) that correspond to the feeling of fear and promote behaviors, such as fight or flight, that are necessary for survival. What Damasio demonstrates is that emotions have their own feedback mechanism, so that an idea or image can generate physiological responses even in the absence of an external stimulus. Because emotional responses and conscious thought are closely linked, decision-making can be influenced by this emotional-physiological feedback mechanism. Our thinking can go astray because we are afraid of bad outcomes, and that fear dominates a more rational calculation about which course of action is most beneficial (1994, 172–175).

In addition to maintaining equilibrium, the brain also anticipates future events and circumstances by projecting likely scenarios based on a catalog of past experiences and concepts generated through social norms and social interactions. The process of regulation that prepares the body to anticipate future needs before they arise is called allostasis (allo means “other or different”). Psychologist Lisa Feldman Barrett (2017) explains that the brain stores neural pathways that are triggered by external or internal stimuli to provide the closest match to the current situation. These neural pathways form a kind of template for action, promoting responses such as increased heart rate, pupil dilation, or movement. Feelings are a goal-oriented response to certain situations: they prepare us to behave and react in ways that promote what is beneficial to the body, and they sharpen and shape our awareness of the world.
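
To make the contrast between these two forms of regulation concrete, the sketch below models homeostasis as a feedback correction toward a setpoint and allostasis as an additional, anticipatory adjustment for a predicted demand. It is a minimal illustration only: the setpoint, gain, and “predicted demand” values are invented for the example and are not drawn from Damasio’s or Barrett’s accounts.

```python
# Toy illustration only: the setpoint, gain, and "predicted demand" are
# invented numbers, not values taken from Damasio's or Barrett's work.

def homeostatic_step(level: float, setpoint: float, gain: float = 0.5) -> float:
    """Feedback: correct the current deviation from the setpoint after it occurs."""
    return level + gain * (setpoint - level)


def allostatic_step(level: float, setpoint: float,
                    predicted_demand: float, gain: float = 0.5) -> float:
    """Feedforward: also pre-adjust for a demand the system predicts is coming."""
    return level + gain * (setpoint - level) + predicted_demand


if __name__ == "__main__":
    level = 80.0             # a toy "blood sugar" reading, currently below the setpoint
    setpoint = 100.0         # the equilibrium the body tries to maintain
    predicted_demand = 10.0  # an anticipated drain, e.g., an upcoming burst of activity

    print("homeostatic correction:", homeostatic_step(level, setpoint))                   # 90.0
    print("allostatic correction: ", allostatic_step(level, setpoint, predicted_demand))  # 100.0
```

The only point of the sketch is the structural difference: the homeostatic step responds after a deviation has already occurred, while the allostatic step builds in an adjustment before the anticipated need arrives.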

In summary, the brain makes inferences about the world through perceptions, emotions, and concepts that are largely unconscious and deeply ingrained in our psyches. This process allows us to navigate fluidly and accurately through a world filled with many and varied stimuli. Our reactions to stimuli are partially homeostatic, meaning that the body tends to bring itself back into an optimal state of equilibrium, and partially allostatic, meaning that the body prepares for and anticipates future situations. Together, these impulses construct a picture of the world that we experience seamlessly and dynamically. The process behind our experience is far more complicated than the crude mental model we imagine: we are projecting and constructing the world we experience as much as we are recording and viewing it. And that fact has important consequences for the kind of reflective and critical thought we ought to engage in when we try to think clearly about the world.

The Evolutionary Advantage of Shortcuts

Human beings have evolved to navigate the world most effectively and efficiently by engaging conscious awareness only when necessary. For that reason, you can walk through the grocery store while thinking about what you are going to cook for dinner. You do not have to consciously think about where to go, how to slow down to make way for other people, or how hard to push the shopping cart so that it maintains momentum in front of you even as its weight changes as you add groceries to the basket. All that biomechanical activity can be outsourced to unconscious mechanisms as you scan your shopping list. The brain is quite good at engaging in habitual activities without the assistance of conscious thought. And that is a good thing because conscious thought is expensive in energy terms. Consider the picture that follows.

A standing woman looks pensively into the distance.
Figure 2.4 Many inferences can be made about this woman’s inner experience based on her expression and posture. While such inferences can be made quickly, they cannot be verified without further investigation, and they are highly susceptible to error, bias, and stereotyping. (credit: “CL Society 226: Woman with mobile phone” by Francisco Osorio/Flickr, CC BY 2.0)

You are probably immediately able to provide complex inferences about this picture, such as the woman is worried, concerned, or anxious about something. The inferences you make about this image are easy, fast, and complex. They are driven by the kind of emotional and conceptual thought processes that are unconscious and efficient. While these inferences are quick and easy, you may also be aware that they are provisional without more information. Given more data about the circumstances surrounding this picture, you might revise your perception about what is going on. This is exactly the sort of thinking that drives the emotional projections discussed in the previous section.

A different type of thinking is required to solve a math problem. The following example comes from psychologist Daniel Kahneman’s book Thinking, Fast and Slow (2013). Try to solve the following in your head:

24 × 14 =

Do you know the answer? For most people, multiplying two-digit numbers without pen and paper (or a calculator) is quite difficult. You might need 10 or 20 seconds of effortful thinking to solve the problem in your head since you do not have unconscious mechanisms that do it automatically. Long-term social and evolutionary pressures have shaped our brains to find efficient solutions to complex questions about facial expressions. The same cannot be said for math problems. Knowing the solution to a math problem may be useful, but it is not the sort of thing generally required for survival and reproduction. On the other hand, quickly reading other people’s emotions is at times vital for survival. There are other interesting differences between these two kinds of thinking. While it is difficult to solve the math problem, once you solve it, you can be 100 percent certain the answer is correct. By contrast, it is easy to generate a story about facial expressions, but this story is highly susceptible to error, bias, and stereotyping. As a result, critical thinkers should be careful not to jump to the first, most obvious solution.
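
If you want to check your answer, one deliberate route is 24 × 14 = (24 × 10) + (24 × 4) = 240 + 96 = 336. Notice that each intermediate step has to be held in working memory, which is part of what makes the calculation feel effortful.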

Energy Demands on Deliberate Thinking

Solving a math problem requires rational thought and effort. When we engage in rational thought, our brains use up precious energy stores that may be required for the maintenance of the body. Because evolutionary pressures seek to keep us alive long enough to pass our genes to the next generation, we have a biological tendency to avoid effortful thinking. In a sense, it is evolutionarily wise to be lazy.

The resources demanded by conscious thought can be understood in terms of the familiar notion of “attention.” When a task requires significant attention, it places increased energy demands on the brain. Periods of high-attention activity can be stressful, as the body increases blood flow to the brain, delivering more glucose and oxygen for increased mental activity. Additionally, attention is limited and focused on specific tasks. Consider the “selective attention test” developed by Daniel Simons and Christopher Chabris. Watch the video below and see how you perform on this test.

Video: Selective Attention Test

How many passes did you count? Did you miss anything in the process? When our attention is focused on a novel and complex task, we become less aware of other stimuli outside the specific area of focus. Additionally, we may become fatigued, stressed, or anxious while engaged in paying close attention. Not surprisingly, our brains prefer automated shortcuts.

Heuristics and Learning

Kahneman (2013) calls these mental shortcuts heuristics, or rules of thumb for drawing inferences. Problem-solving with heuristics is largely unconscious, automated, effortless, and efficient, but it is not always correct. Rational thinking or computation requires conscious attention and effort and may not even be possible without some practice. We are forced to engage in effortful thinking when confronted with something new and possibly dangerous—or even with something slightly outside of our normal routine. For example, you have probably driven home from work or school along a familiar route on “autopilot,” preoccupied with your thoughts. Maybe you have even gotten home and felt as if you cannot remember how you got there. By contrast, you have probably experienced the stress of navigating a new, unfamiliar city. In the first case, navigation can be carried out using easy, largely automatic processing, whereas in the second case, navigation requires the intense resources of active attention and rational calculation.

Sometimes complex activities can become effortless, but unlike when we are on “automatic pilot,” such activities feel pleasant and fulfilling. When you become fully immersed in a complex activity to the point at which it becomes effortless, you have entered the state of “flow” (Csikszentmihalyi 2008).

Flow states are possible only for someone who has achieved some level of proficiency at a task. They are characterized by intense concentration and awareness as well as a sense of personal control or agency, but they are pleasurable because the challenge of engaging in the task is commensurate with your ability. By contrast, a novice may find the same tasks stressful and frustrating. This phenomenon can be illustrated using the notion of the “learning curve” that describes how a novice grows in proficiency.

What this means is that a person may be able to rely on intuitions, gut reactions, and other automatic responses in a field in which they are an expert, but the novice should be skeptical of these methods of thinking. As a novice, your mental heuristics are frequently faulty, so you are susceptible to prejudice, implicit bias, and error.

Consider the case of buying a car. Someone who is deeply familiar with the automobile market as either a buyer or a seller may be able to estimate the true value of a car easily, but the average person would need to do a great deal of research to arrive at a reliable estimate. Because of the effort required for nonexperts to appraise a car’s value, they are easily influenced by dealer incentives, marked-up list prices, financing options, and other tricks of the trade. Given that we are all susceptible to these types of errors, it seems like a good idea to try to become more self-aware and critical and not to rely exclusively on gut reactions or intuitions when encountering new material. Since you are probably a novice in philosophy if you are reading this textbook, you ought to be suspicious of your gut reactions to and intuitions about philosophical questions. Keep an open mind, and don’t assume you already understand the philosophical problems you will encounter in the chapters that follow. Being open to new ideas and allowing yourself to admit some degree of ignorance are important first steps in becoming a better thinker.

Heuristics and Substitution in Decision-Making

The cognitive biases that we will examine in the next section are based on a more fundamental “substitution heuristic.” This term describes our tendency to answer a difficult question or problem by substituting an easier question for it. While substitution often results in an incorrect or inappropriate response, it gives us a sense of satisfaction or “cognitive ease” in thinking we have solved a problem. For instance, when you are asked to evaluate something complex and uncertain, like the future value of an investment or the political prospects of a politician, you are likely to replace that complex calculation with an easier one. In particular, you may substitute your positive or negative feelings toward the politician or the investment product. But your feelings are likely to be guided by your preconceptions.

When the brain defaults to heuristics that produce a less-than-optimal result or even an incorrect decision, it is operating with a cognitive bias. A cognitive bias is a pattern of “quick” thinking based on a rule of thumb. A person operating under a cognitive bias does not use logic or careful reasoning to arrive at a conclusion. Cognitive biases are like perceptual illusions: just like perceptual illusions, they are the result of the natural and ordinarily efficient operation of the brain. Although mental heuristics often work perfectly well, giving us a serviceable estimate of reality without the mental effort required to build a more comprehensive picture, cognitive biases result when the patterns this process produces are misleading or faulty.
