Introduction to Philosophy

5.5 Informal Fallacies

Learning Objectives

By the end of this section, you will be able to:

  • Explain the four general categories of informal fallacies.
  • Classify fallacies by general category.
  • Identify fallacies in ordinary language.

Reasoning can go wrong in many ways. When the form of an argument is problematic, it is called a formal fallacy. Mistakes in reasoning are not usually caused by the structure of the argument. Rather, there is usually a problem in the relationship between the evidence given in the premises and the conclusion. Take the following example:

I don’t think Ms. Timmons will make a good mayor. I’ve got a bad feeling about her. And I’ve heard she’s not a Christian. Furthermore, the last time we had a female mayor, the city nearly went bankrupt. Don’t vote for Ms. Timmons.

Notice that to assess the above argument, you must think about whether the reasons offered function as evidence for the conclusion that Ms. Timmons would be a bad mayor. This assessment requires background knowledge about the world. Does belonging to a specific religion have any bearing on one’s qualifications for mayor? Is there any credible connection between a mayor’s gender and the likelihood that person will cause a bankruptcy? If the reasons are not adequate support for the conclusion, then the reasoner commits an informal fallacy. In the above argument, none of the reasons offered supports the conclusion. In fact, each reason commits a different fallacy. The first reason is based on an appeal to emotion, which is not relevant. The second reason points to a characteristic (religion) that is irrelevant in judging competency, and the third reason creates a spurious connection between the candidate and a previous female mayor, placing them both in the same failed category based solely on the fact that they share a gender.

There are many specific types of informal fallacies, but most can be sorted into four general categories according to how the reasoning fails. These categories show how reasoning can go wrong and serve as warnings for what to watch out for in arguments. They are (1) fallacies of relevance, (2) fallacies of weak induction, (3) fallacies of unwarranted assumption, and (4) fallacies of diversion.

Connections

See the chapter on critical thinking, research, reading, and writing to learn more about overcoming biases.

Fallacies of Relevance

In fallacies of relevance, the arguer presents evidence that is not relevant for logically establishing their conclusion. Such fallacies persist because the evidence often seems relevant: it feels relevant. Fallacies of relevance prey on our likes and dislikes. Indeed, the first fallacy of relevance considered here is called “appeal to emotion.”

Appeal to Emotion

Emotional appeals can target any number of emotions—from fear to pity and from love and compassion to hate and aversion. For the most part, appeals to emotion of any kind are not relevant for establishing the conclusion. Here’s an example:

I know the allegations against the governor seem serious. However, he’s in his 80s now, and he fought for our country in the Korean War, earning a Purple Heart. We don’t want to put an elderly veteran through the ordeal of a trial. I urge you to drop the charges.

In this example, the arguer appeals to our feelings of pity and compassion and to our positive feelings about the governor. We might admire the governor for his military service and feel sympathy for his advanced age. But are our feelings relevant in making the decision about whether to drop criminal charges? Notice that the arguer says nothing about the content of the charges or about whether the governor is innocent or guilty. Indeed, the arguer says absolutely nothing that’s relevant to the conclusion. How we feel about somebody is not a logical determinant to use in judging guilt or innocence.

Ad Hominem Attacks

The ad hominem attack is most often committed by a person who is arguing against some other person’s position. “Ad hominem” in Latin means “toward the man.” It is so named because when someone commits this fallacy, the reasons they give for their conclusion concern the characteristics of the person they are arguing against rather than that person’s position. For example, the arguer may verbally attack the person by making fun of their appearance, intelligence, or character; they can highlight something about the person’s circumstances like their job or past; or they can insinuate that the person is a hypocrite.

You may wonder why such arguments are effective, and one reason is sloppy associative reasoning, wherein we problematically assume that characteristics held by an arguer will be transferred to their argument. Another related reason is that too often we allow ourselves to be ruled by emotion rather than reason. If we are made to feel negatively toward a person, those feelings can cloud assessment of their arguments. Consider the following example:

My fellow councilwoman has argued for the city solar project. But what she failed to mention was that she has been arrested twice—once for protesting during the Vietnam War and another time for protesting the 2003 invasion of Iraq. She’s a traitor and a liar. Any project she espouses is bad for the city.

This is clearly an ad hominem attack. The arguer wants to undermine the councilwoman’s position by making us feel negatively toward her. The fact that a person engaged in protests in the past has no bearing on their arguments for an energy project. Furthermore, the arguer goes on to call the councilwoman a traitor and a liar and offers no evidence. Attaching negative labels to people is one way to manipulate an audience’s emotions.

There are other types of ad hominem attacks, and the most successful is probably the one called tu quoque, which means “you too” in Latin. When someone commits a tu quoque ad hominem fallacy, they attempt to undermine a person’s argument by pointing to real or perceived hypocrisy on the part of the person. They assert or imply that their opponent, in the past or currently, has done or said things that are inconsistent with their current argument. Often tu quoque is used as a defensive maneuver. Take the example of a teenager whose father just caught her smoking cigarettes and reprimanded her. If she knows that her father smoked when he was her age, her defensive response will be “You did it too!” She is likely to think he is a hypocrite who should not be heeded. However, the daughter reasons poorly. First, a person’s actions have no bearing on the strength of their arguments or the truth of their claims (unless, of course, the person’s arguments are about their own actions). That her father smoked in the past (or smokes currently) has no bearing on whether smoking is in fact dangerous. Smoking does not suddenly cease to be dangerous because the person explaining the dangers of smoking is a smoker.

You might think, however, that we should not trust the reasoning of hypocrites because hypocrisy is a sign of untrustworthiness, and untrustworthy people often say false things. But remember that there is a difference between a truth analysis and a logical analysis. If smoking has bad consequences for health and development, then that counts as a good reason for the father to not allow his daughter to smoke. But interestingly, some cases of perceived hypocrisy make the supposed hypocrite more trustworthy rather than less. And the smoking example is one such case. Of all the people who might be able to speak of the dangers of picking up a smoking habit at a young age, the father, who became addicted to cigarettes in his teenage years, is a good source. He speaks from experience, which is a second reason the daughter reasons incorrectly in thinking she should not listen to him because he was or is a smoker.

Let’s take a different scenario. Suppose a married person argues that it is immoral to cheat on one’s spouse, but you know he has a mistress. As much as you may hate it, his status as a cheater is not relevant to assessing his argument. You might infer from his hypocrisy that he does not believe his own arguments or perhaps that he suffers guilt about his actions but cannot control his cheating behavior. Nonetheless, whatever the cheater believes or feels is simply not relevant to determining whether his argument is good. To think that whether a person believes an argument affects the truth of that argument is tantamount to thinking that if you believe X, the belief itself is more likely to make X happen or make X true. But such an approach is magical thinking, not logic or reason.

Fallacies of Weak Induction

The fallacies of weak induction are mistakes in reasoning in which a person’s evidence or reasons are too weak to firmly establish a conclusion. The reasoner uses relevant premises, but the evidence contained therein is weak or defective in some way. These errors are errors of induction. When we inductively reason, we gather evidence using our experience in the world and draw conclusions based on that experience. Earlier in the chapter I used a generalization about the return of the red-winged blackbirds in March. But what if I based my generalization on just two years of experience? Now my conclusion—that the blackbirds return every mid-March—seems much weaker. In such cases, the reasoner uses induction properly by using relevant evidence, but her evidence is simply too weak to support the generalization she makes. An inductive inference may also be weak because it too narrowly focuses on one type of evidence, or the inference may apply to a generalization in the wrong way.

Hasty Generalization

A hasty generalization is a fallacy of weak induction in which a person draws a conclusion using too little evidence to support the conclusion. A hasty generalization was made in the red-winged blackbird case above. Here is another example:

Don’t eat at the restaurant. It’s bad. I had lunch there once, and it was awful. Another time I had dinner, and the portions were too small.

This person draws the conclusion that the restaurant is bad from two instances of eating there. But two instances are not enough to support such a robust conclusion. Consider another example:

Sixty-five percent of a random poll of 50 registered voters in the state said they would vote for the amendment. We conclude that the state amendment will pass.

Fifty voters is not a large enough sample size to draw predictive conclusions about an election. So to say the amendment will pass based on such limited evidence is a hasty generalization. Just how much evidence we need to support a generalization depends upon the conclusion being made. If we already have good reason to believe that the entities in the class we are generalizing about are all very similar, then we will not need a very large sample size to make a reliable generalization. For instance, physics tells us that electrons are very similar, so a study drawn from observing just a few electrons may be reasonable. Humans (particularly their political beliefs and behaviors) are not the same, so a much larger sample size is needed to determine political behavior. The fallacy of hasty generalization highlights the empirical nature of induction—we need a basic understanding of the world to know exactly how much evidence is needed to support many of our claims.
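The intuition that 50 voters is too few can be made concrete with the standard normal-approximation margin of error for a sample proportion. This is a statistical aside, not part of the text; the formula z · √(p(1 − p)/n) with z = 1.96 is the conventional 95 percent approximation:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The poll from the text: 65% of 50 registered voters favor the amendment.
m50 = margin_of_error(0.65, 50)      # roughly +/- 13 percentage points
m1000 = margin_of_error(0.65, 1000)  # roughly +/- 3 percentage points
print(f"n=50:   +/- {m50:.3f}")
print(f"n=1000: +/- {m1000:.3f}")
```

With only 50 respondents, the poll's uncertainty spans more than 13 points in either direction, which is why the generalization to the whole electorate is hasty; the same poll with 1,000 respondents would shrink the margin to about 3 points.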

Biased Sample

A biased sample has some things in common with a hasty generalization. Consider the following:

Don’t eat dinner at that restaurant. It’s bad. My book club has met there once a week for breakfast for the past year, and they overcook their eggs.

This seems much better than the restaurant example offered above. If the book club has gone to the restaurant once per week for a year, the arguer has more than 50 instances as data. However, notice that the arguer’s evidence concerns breakfast, not dinner, and focuses on the eggs. Suppose the restaurant has an entirely different, more extensive dinner menu; then we cannot draw reliable conclusions about the restaurant’s success at dinner. This is an example of a biased sample. With a hasty generalization, the problem is that not enough evidence is used. In a biased sample, the problem is that the evidence used is biased in some way.

Appeal to Ignorance

Appeal to ignorance is another type of fallacy of weak induction. Consider the following line of reasoning:

In my philosophy class, we reviewed all the traditional arguments for the existence of God. All of them have problems. Because no one can prove that God exists, we can only conclude that God doesn’t exist.

Notice that the arguer wants to conclude that because we do not have evidence or sufficient arguments for God’s existence, then God cannot exist. In an appeal to ignorance, the reasoner relies on the lack of knowledge or evidence for a thing (our ignorance of it) to draw a definite conclusion about that thing. But in many cases, this simply does not work. The same reasoning can be used to assert that God must exist:

In my philosophy class, we reviewed different arguments against the existence of God. All of them have problems. Because no one can prove that God doesn’t exist, we can only conclude that God exists.

Any form of reasoning that allows you to draw contradictory conclusions ought to be suspect. Appeals to ignorance ignore the idea that absence of evidence is not evidence of absence. The fact that we lack evidence for X should not always function as evidence that X is false or does not exist.

False Cause Attribution

The fallacy of false cause occurs when a causal relation is assumed to exist between two events or things when it is unlikely that such a causal relationship exists. People often make this mistake when the two events occur together. The phrase “correlation does not equal causation” captures a common critique of this form of false cause reasoning. For example, a person may think that swimsuits cause sunburns because people often get sunburned when wearing swimsuits. There is a correlation between sunburn and swimsuits, but the suits are not a cause of sunburns.

False cause fallacies also occur when a person believes that just because one event occurs after another, the first event is the cause of the second one. This poor form of reasoning, in tandem with confirmation bias, leads to many superstitious beliefs. Confirmation bias is the natural tendency to look for, interpret, or recall information that confirms already-established beliefs or values. For example, some sports fans may notice that their team won sometimes on days when they were wearing a specific item of clothing. They may come to believe that this clothing item is “lucky.” Furthermore, because of confirmation bias, they may remember only instances when the team won when they were wearing that item (and not remember when the team lost when they were also wearing the item). The resulting superstition amounts to believing that wearing a special team jersey somehow causes the team to win.

Figure 5.7 Correlation Is Not the Same as Causation (attribution: Copyright Rice University, OpenStax, under CC BY 4.0 license)

In short, as emphasized by Figure 5.7, just because two things are often correlated (connected in that they occur together in time or place) does not mean that a cause-and-effect relationship exists between them.

Connections

See the chapter on critical thinking, research, reading, and writing to learn more about confirmation bias.

Fallacies of Unwarranted Assumption

Fallacies of unwarranted assumption occur when an argument relies on a piece of information or belief that requires further justification. The category gets its name from the fact that a person assumes something unwarranted to draw their conclusion. Often the unjustified assumption is only implicit, which can make these types of fallacies difficult to identify.

False Dichotomy

False dichotomy, or “false dilemma,” occurs in an argument when a limited number of possibilities are assumed to be the only available options. In the classic variation, the arguer offers two possibilities, shows that the one cannot be true, and then deduces that the other possibility must be true. Here is the form:

  1. Either A or B must be true.
  2. A is not true.
  3. Therefore, B is true.

The form itself looks like a good argument—a form of disjunctive syllogism. But a false dichotomy is an informal fallacy, and such errors depend upon the content of arguments (their meaning and relation to the world) rather than the form. The problematic assumption occurs in premise 1, where it is assumed that A and B are the only options. Here is a concrete example:

A citizen of the United States either loves their country, or they are a traitor. Since you don’t love your country, you are a traitor.

The above argument assumes that loving the United States or being a traitor are the only two possible options for American citizens. The argument assumes these options are mutually exclusive (you cannot be both) and jointly exhaustive (you must be one or the other). But this position requires justification. For example, a person can have mixed emotions about their country and not be a traitor. False dichotomy is poor reasoning because it artificially limits the available options and then uses this artificial limitation to attempt to prove some conclusion. A false dichotomy may include more than two options. The important thing to remember is a false dichotomy limits options in an argument without justification when there is reason to think there are more options.
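As an illustration (not part of the original text), the validity of the disjunctive syllogism form in premises 1–3 above can be checked mechanically by enumerating every truth assignment and looking for a counterexample, i.e., a case where all premises are true but the conclusion is false. This Python sketch is one way to do that check:

```python
from itertools import product

def counterexamples(premises, conclusion):
    """Truth assignments (A, B) where every premise holds but the conclusion fails."""
    return [
        (A, B)
        for A, B in product([True, False], repeat=2)
        if all(p(A, B) for p in premises) and not conclusion(A, B)
    ]

# Disjunctive syllogism: (A or B), not A, therefore B.
valid_form = counterexamples(
    [lambda A, B: A or B, lambda A, B: not A],
    lambda A, B: B,
)
print(valid_form)  # [] -- no counterexample, so the form is valid
```

The empty result confirms what the text says: the form itself is good logic. The false dichotomy arises not from the form but from premise 1, which asserts without justification that A and B are the only options.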

Begging the Question

Begging the question occurs when an arguer either assumes the truth of the conclusion they aim to prove in the course of trying to prove it or when an arguer assumes the truth of a contentious claim in their argument. When the former happens, it is sometimes called circular reasoning. Here is an example:

  1. The Bible states that God exists.
  2. The Bible is true because it is divinely inspired.
  3. Therefore, God exists.

The problematic assumption occurs in premise 2. To say the Bible is “divinely inspired” is to say that it is the word of God. But the argument aims to prove that God exists. So premise 2 assumes that God exists in order to prove God exists. This is patently circular reasoning. The name “begging the question” is confusing to some students. One way to think about this fallacy is that the question is whatever is at issue in a debate or argument. Here the question is “Does God exist?” To “beg” the question means to assume you already know the answer. The above argument assumes the answer to the question it is supposed to answer.

The name “begging the question” makes more sense for the second form of the fallacy. When a person begs the question in the second sense, they assume the truth of something controversial while trying to prove their conclusion. Here is an example you might be familiar with:

  1. The intentional killing of an innocent person is murder.
  2. Abortion is the intentional killing of an innocent person.
  3. Therefore, abortion is murder.

This is a valid argument. Structurally, it uses good logic. However, the argument is an example of begging the question because of premise 2. Much of the debate over abortion revolves around the question of whether a fetus is a person. But premise 2 simply assumes that a fetus is a person, so the argument begs the question “Is a fetus a person?”

Fallacies of Diversion

The final category of informal fallacies is fallacies of diversion, which usually occur in contexts where there is an opponent or an audience. In these fallacies, the arguer attempts to distract the attention of the audience from the argument at hand. Clearly, the tactic of diverting attention implies that there is someone whose attention can be diverted: an audience, an opponent, or both.

Strawman

Men made of straw can easily be knocked over. Hence, a strawman occurs when an arguer presents a weaker version of the position they are arguing against to make the position easier to defeat. The arguer takes their opponent’s argument, repackages it, and defeats this new version of the argument rather than their opponent’s actual position. If the audience listening to or reading the argument is not careful, they won’t notice this move and will believe that the opponent’s original position has been defeated. Usually when a strawman is created, the misrepresented position is made more extreme. Here is an example:

Senator: It is important that the path to citizenship be governed by established legal procedure. Granting citizenship to undocumented immigrants who came to this country illegally sets up a dangerous and unfair precedent. It could encourage others to illegally enter the country in hopes that they too can be granted clemency at a later date. We must award the status of citizenship only to those who followed the laws in coming here.

Opponent: Clearly, we can reject the Senator’s position, which is obviously anti-immigrant. If he had it his way, we’d never allow any immigration into the country. We are a nation of immigrants, and disallowing people from other countries to join our nation is against everything this nation has stood for historically.

The opponent misrepresents the senator as being wholly anti-immigration and then argues against that manufactured position—a classic strawman move. The senator’s original argument focuses narrowly on the question of whether to create a pathway to citizenship for people already in the country who came here illegally. The repackaged argument is much easier to defeat than the senator’s actual argument since few people are in favor of not allowing any immigration into the country.

Red Herring

A red herring fallacy is like a strawman, except the arguer completely ignores their opponent’s position and simply changes the subject. The arguer diverts the attention of the audience to a new subject. A red herring is a pungent smoked fish that was once dragged along a path to train hunting dogs to track smells. The fallacy gets its name because the arguer means to lure the audience down a different path of reasoning than the one at hand. You may wonder how a person can get away with simply changing the subject. Successful use of the red herring usually involves shifting the subject to something tangentially related. Here is an example:

My daughter wants me to exercise more. She said she is worried about my health. She showed me research about cardiovascular fitness and its impact on quality of life for people my age and older. She suggested I start biking with her. But bicycles are expensive. And it is dangerous to ride bicycles on a busy road. Furthermore, I do not have a place to store a bicycle.

This arguer first summarizes the daughter’s position that they ought to exercise more. But then they take the suggestion of bicycling and veer away from the topic (getting more exercise) to the feasibility of cycling instead. The comments on bicycling in no way address the daughter’s general conclusion that the arguer needs to exercise more. Because the argument changes the subject, it is a red herring.

Table 5.3 summarizes these many types of informal fallacies.

Fallacies of relevance: rely on evidence that is not relevant for logically establishing a conclusion
  • Appeal to emotion: appeals to feelings (whether positive or negative) rather than discussing the merits of an idea or proposal
  • Ad hominem attack: argues against someone’s idea or suggestion by attacking the individual personally, rather than pointing out problems with the idea or suggestion
Fallacies of weak induction: rely on evidence or reasons that are too weak to firmly establish a conclusion
  • Hasty generalization: draws a conclusion using too little evidence to support the conclusion
  • Biased sample: draws a conclusion using evidence that is biased in some way
  • Appeal to ignorance: relies on the lack of knowledge or evidence for a thing (our ignorance of it) to draw a definite conclusion about that thing
  • False cause attribution: assumes a causal relation between two events or things that are not causally connected (“correlation does not equal causation”)
Fallacies of unwarranted assumption: rely on information or beliefs that require further justification
  • False dichotomy: assumes that a limited number of possibilities are the only available options
  • Begging the question: either assumes the truth of a conclusion in the course of trying to prove it or assumes the truth of a contentious claim
Fallacies of diversion: attempt to distract the attention of the audience away from the argument at hand
  • Strawman: argues against a weaker version of the opponent’s position in order to make it easier to defeat
  • Red herring: ignores the opponent’s position and simply changes the subject

Table 5.3 Types of Informal Fallacies

Citation/Attribution

Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Attribution information
  • If you are redistributing all or part of this book in a print format, then you must include on every physical page the following attribution:
    Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • If you are redistributing all or part of this book in a digital format, then you must include on every digital page view the following attribution:
    Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
Citation information

© Dec 19, 2023 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.