58 logical fallacies and cognitive biases

The fascinating science of how to be stupid.

Our brains are prone to logical fallacies and cognitive biases.

I know that I’m stupid sometimes — most of us are.

Still, this shouldn’t in any way stop any of us from wanting to improve. This is one major reason why I’ve been so fascinated with logical fallacies and cognitive biases.

Learning about human behaviour is helpful in public relations, where we deal with communication challenges on a daily basis.

The list of fallacies below is inspired by YouTuber Jill Bearup, who found her inspiration in the free online course Logical and Critical Thinking by the University of Auckland. I’ve also added a number of typical biases from Psychology Today, behavioraleconomics.com, and a few basic media theories often covered on this blog before.

1. Fallacy of composition

Just because a whole has great or bad parts, those parts in themselves don’t make the whole great or bad by default. For maximum dramatic effect, journalists typically decide whether something should be put on a pedestal or torn down, and they will therefore be biased towards stories that fit the reigning narrative — a sort of amplification of the agenda-setting theory.

“We’ve got the best player in the world on our team, so we must also have the best team in the world.”

2. Fallacy of division

Just because something is true for the whole, that doesn’t also make it true for its parts. This is the opposite of the fallacy of composition. For instance, just because a religion as a whole advocates peace and kindness, that doesn’t mean that significant parts of the religion can’t be violent.

“We’ve got the best team in the world, so naturally we must also have the best individual players in the world.”

3. The gambler’s fallacy

The gambler’s fallacy is connected with our tendency to believe that there are such things as streaks of good or bad luck. In retrospect, we can detect streaks in results produced by chance, but there’s no statistical mechanism operating behind such streaks; independent events have no memory.

“I should place another bet, because I’m on a roll right now!”
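To see why the streak carries no information, here’s a minimal simulation in Python (the streak length of five and the sample size are arbitrary choices for illustration):

```python
import random

random.seed(42)  # fixed seed for a reproducible run

# One million flips of a fair coin; True represents heads.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the flip that immediately follows every run of five heads.
after_streak = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if all(flips[i:i + 5])
]

share = sum(after_streak) / len(after_streak)
print(f"Flips following a 5-heads streak: {len(after_streak)}")
print(f"Share of heads on the next flip:  {share:.3f}")
```

The share comes out at roughly 0.5: the coin has no memory, and neither does a roulette wheel.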

4. Tu quoque (who are you to talk?)

We can sometimes tell ourselves that a piece of advice is factually incorrect based on who we originally heard the advice from. Just because someone fails to follow their own advice, or makes a statement outside their expertise, this doesn’t make the actual statement untrue by default. You should be mindful of the fallacious appeal to authority, of course, but you shouldn’t jump to conclusions.

“My friend said it’s great to go to the gym, but my friend is out of shape, so that can’t be good advice.”

5. Strawman

A strawman argument is when your opponent grossly misrepresents your original meaning or position. Strawman arguments seem to be increasingly popular on social media, where you often have to describe your opponent’s position yourself before attacking it — and it might be tempting to misrepresent that position. The opposite of the strawman argument is the steelman argument, where you actually improve on your opponent’s argument before countering it.

“Atheists don’t believe in anything so they don’t believe it matters if one does bad things.”

6. Ad hominem

Instead of attacking the arguments, you attack the person. This is typically an effort to undermine any arguments by ignoring them and instead attacking the person’s good name or character.

“Our prime minister’s wife left him, and if he can’t keep a wife happy, how could he run our country?”

7. Genetic fallacy (fallacy of origin or fallacy of virtue)

To assume that something is correct or incorrect based solely on the credibility of the source.

“My priest wouldn’t lie to me and my priest is telling me that God exists, so therefore God must exist.”

8. Fallacious appeal to authority

Just because someone’s an expert in one area, this doesn’t automatically make them an expert in other areas as well. The news media very much favour inviting trusted experts back to comment on areas further and further away from their actual field of expertise.

“I’m not an epidemiologist, however, I am a doctor, and I think we must take measures to ensure herd immunity.”

9. Red herring

Offering a piece of information that is irrelevant to the main point, with the intention to mislead or distract.

“Many people are campaigning for the environment, but I think that we must tackle unemployment and make sure that we take better care of our homeless.”

10. Appeal to emotion

An attempt to distract you by making you feel bad for someone, even though emotional states are irrelevant to the argument at hand. As human beings, we often find it hard to defend ourselves against people or groups claiming to be sad, hurt, or scared.

“Are you sure that you want to blame these immigrants when they’re clearly suffering from their past experiences?”

11. Appeal to popularity (the bandwagon effect)

The false belief that a lot of people can’t be wrong about the same thing at the same time. Both logic and history have taught us that majorities can be objectively wrong, but it’s socially comforting to remind your opponent that you have the majority position on your side — as if that made your position correct by default.

“One billion flies can’t be wrong — eat shit.”

12. Appeal to tradition

The false belief that a lot of people can’t be wrong about something for a very long time. Just because human beings have been religious throughout our history, this fact in and of itself doesn’t prove the existence of supernatural powers.

“This simply can’t be wrong since this has been common practice forever.”

13. Appeal to nature

The misleading notion that science is at odds with nature and that you therefore shouldn’t trust science. Just because we’ve evolved alongside nature without vaccines, this fact doesn’t automatically make vaccines dangerous.

“We trust in safekeeping our natural immune systems and therefore we don’t trust in vaccines.”

14. Appeal to ignorance

The misconception that if you can’t prove something to be false, then it must be true.

“Since God can’t be disproven, God must exist.”

15. Begging the question

To use a circular argument; any form of argument where the conclusion is assumed in one of the premises.

“God exists because it says so in the Bible.”

16. Equivocation

To use words of multiple meanings interchangeably to gain the upper hand in an argument.

“I have the right to believe in God; therefore, believing in God is right.”

17. False dichotomy (black or white)

Assuming that something must be either A or B, when it could be both or neither. Media logic often dictates that narratives must be simplified and amplified to be quickly understood; as soon as something is two or more things simultaneously, we tend to find it hard to wrap our heads around. A person can be good and bad at the same time — or neither.

“You’re either with us (on everything), or you’re against us (on everything).”

18. Middle ground fallacy

The false notion that the right answer must lie somewhere in between two extreme positions. If someone says that the water is cold and someone else says it’s hot, it’s easy to assume that the temperature is likely to be somewhere in between. But it could still be really hot — or really cold.

“Someone said that this is really dangerous and someone said that this is perfectly safe, so as long as I’m cautious, I should be fine.”

19. Decision point fallacy (Sorites paradox)

The false assumption that a claim can’t be correct or incorrect because there’s no precise cut-off between two points. We’ve been seeing this fallacy in the news media all the time in connection with the coronavirus pandemic; since we can’t say which precise number of deaths is good or bad, many argue that we can’t say anything about any number.

“Other countries have clearly had lots of pandemic deaths, and since Sweden hasn’t had as many as those countries, Sweden clearly hasn’t had lots of deaths.”

20. Slippery slope fallacy

The false assumption that one step in a certain direction is bound to lead to many steps in that same direction. Sure, eating a piece of candy could lead to an increased consumption of sugar which could lead to a serious disease which ultimately could cost you your life. But eating a piece of candy won’t lead directly to your death.

“If you eat that sandwich you’re going to get fat.”

21. Hasty generalisations (anecdotes)

When someone draws very general conclusions from a very small and subjective sample. We are often quick to draw generalised conclusions based on anecdotal evidence, especially when the anecdote is our own experience.

“I got mugged in the street yesterday, and I hate the fact that society is becoming increasingly unsafe.”

22. Faulty analogy

The attempt to disqualify an argument by making a point using an irrelevant analogy. Analogies can be powerful communication tools, but it’s also easy to make comparisons that simply aren’t fair or accurate.

“Abortion is murder.”

23. Burden of proof

The attempt to push the opponent to disprove your claims. It’s the person making a claim who carries the burden of proof, but a popular technique is to push that responsibility onto the person you’re arguing against.

“I believe in God and you must prove me wrong to change my mind.”

24. Affirming the consequent

An if-then statement guarantees the consequent when the antecedent holds, not the other way around. Just because “if P, then Q” is true and Q is observed, you can’t conclude that P is true; Q may have come about in other ways. (The logical forms are written out after the next section.)

“A cat meows, so everything that meows is a cat.”

25. Denying the antecedent (fallacy of the inverse)

The reverse mistake: just because “if P, then Q” is true, the falsity of P doesn’t establish the falsity of Q; Q might hold for entirely different reasons.

“Cats meow, so if it isn’t a cat, then it doesn’t meow.”
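Writing the forms out side by side makes the mistakes in sections 24 and 25 easy to spot. The first two schemas below are valid; the last two are the fallacies:

```latex
\begin{align*}
\textbf{Valid:}   \quad & P \to Q, \; P      \;\therefore\; Q      && \text{(modus ponens)} \\
                  \quad & P \to Q, \; \neg Q \;\therefore\; \neg P && \text{(modus tollens)} \\
\textbf{Invalid:} \quad & P \to Q, \; Q      \;\therefore\; P      && \text{(affirming the consequent)} \\
                  \quad & P \to Q, \; \neg P \;\therefore\; \neg Q && \text{(denying the antecedent)}
\end{align*}
```

Note that modus tollens is exactly the contrapositive: from “cats meow” you may validly conclude that whatever doesn’t meow isn’t a cat, but not that whatever isn’t a cat doesn’t meow.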

26. Moving the goalposts

Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.

“Yes, there might be some innocent people in jail, but I was only talking about the ones who actually committed their crimes.”

27. No true Scotsman

To disqualify someone or something based on a false or biased ideal.

“All real men have beards, so if you don’t have a beard, then you can’t be a real man.”

28. Personal incredulity

Just because you find something hard to believe or imagine, that in itself doesn’t make it untrue.

“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”

29. False Causality

The false assumption that correlation equals causation.

“Crime rates went up when the price of gas went up, so for the sake of everyone’s safety, we must lower our taxes on fossil fuels.”
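Correlation is also easy to manufacture by accident. The Python sketch below (the variable names are purely illustrative, and the exact figure varies with the seed) computes the correlation between two independently generated random walks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two *independent* random walks: by construction, neither causes the other.
steps = 500
gas_price = np.cumsum(rng.normal(size=steps))
crime_rate = np.cumsum(rng.normal(size=steps))

r = np.corrcoef(gas_price, crime_rate)[0, 1]
print(f"Correlation between two unrelated series: {r:+.2f}")
```

Any two series that happen to trend over the same period will often correlate strongly without any causal link, which is why correlation alone settles nothing.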

30. Texas sharpshooter

To decide on your position first and then go out and find only data that supports that position — and no other. This fallacy is especially prominent in this digital age of ours, when it’s possible to go online and find arguments defending almost any imaginable position.

“I’ve found numerous studies supporting my position and I have no idea if there are any studies supporting your position as well.”

31. Loaded question

To ask a question with an assumption already built into the question.

“Have you stopped beating your wife?”

32. Chesterton’s Fence

If we don’t understand or see the reason for something, we might be inclined to do away with it. However, most things have been put in place for a reason, even if we don’t understand it at first. We should therefore leave the fence in place until we understand its purpose.

“There’s a fence here, but I can’t really see what it’s good for, so let’s do away with it.”

33. Survivorship bias

We tend to draw conclusions from the people or things that made it through a selection process, forgetting that the failures are no longer around to be counted. The classic example is the statistician Abraham Wald, who noted that the bullet holes on returning warplanes marked the spots where a plane could be hit and still make it home; the armour belonged where the survivors were untouched. I’ve also written about survivorship bias and Abraham Wald.

“All returning warplanes with damages were shot in the wings, so we should reinforce the wings to make them safer.”
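A tiny simulation makes the selection effect visible. In the sketch below, the regions and the fatality assumption are invented for illustration: hits land uniformly at random, but planes hit in the engine never make it home, so they never show up in the sample we inspect:

```python
import random

random.seed(7)

REGIONS = ["wings", "fuselage", "engine"]
FATAL = {"engine"}  # assumption: an engine hit downs the plane

returning_hits = []
for _ in range(10_000):
    hit = random.choice(REGIONS)  # each region equally likely to be hit
    if hit not in FATAL:          # only survivors are available to study
        returning_hits.append(hit)

for region in REGIONS:
    share = returning_hits.count(region) / len(returning_hits)
    print(f"{region:8}: {share:.0%} of observed damage")
```

The engines look untouched in the returning sample, not because they’re never hit, but because the planes hit there never return. Wald’s insight was to armour the places where the survivors show no damage.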

34. The Dunning-Kruger effect

A cognitive bias in which people with limited knowledge of a field tend to overestimate their own competence, often right after making quick initial progress.

“I’ve just started learning about this and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m something of an expert on this subject now.”

35. Confirmation bias

Most of us have a tendency to recall or interpret information in a way that reinforces our existing cognitive schemas.

“I refused to take my medicine and I got well, so I always do my best to stay away from medicine.”

36. Heuristic anchoring

When faced with an initial number, we often tend to compare any subsequent numbers to that initial anchor.

“The third house shown to us by the broker was over our budget, but still a bargain compared to the first two houses she showed us.”

37. The curse of knowledge

We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.

“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”

38. Optimism/pessimism bias

We find it easier to believe that negative things can happen to others than to ourselves. But some people have a tendency to be biased in the opposite way; they overestimate the likelihood of negative events.

“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us — only worse.”

39. The sunk cost fallacy

Sometimes we stick to a behaviour simply because we’ve already invested time, money, and other types of resources into it. To abandon such an investment would force us to face an irreversible failure.

“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”

40. Negativity bias

As human beings, we tend to react more strongly to negative impacts than to positive impacts of similar or equal weight.

“Our daughter graduated with honors from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”

41. Declinism

We tend to think that everything will decline, especially in light of new developments. This might be due to a form of cognitive laziness; we don’t wish to change the way we think in tandem with the times. In relation to media science, I’ve written about declinism in The curious case of Cambridge Analytica and the big data techlash.

“Things are continually just getting worse.”

42. The backfire effect

When challenged, the outcome might be that we cling even firmer to our beliefs — instead of questioning ourselves. I’ve also written about this in How to fight populism.

“People really seem to hate our ideology, but this only proves that we’re right about everything.”

43. The fundamental attribution error

When someone else makes a mistake, we tend to attribute this mistake to their character or disposition, but when we ourselves make mistakes, we tend to attribute them to contextual circumstances.

“People are idiots in traffic; especially when I’m stressed.”

44. In-group bias

We have evolved to be partial to people who belong to our own social group. This isn’t necessarily a bad behaviour per se, but we must watch out for situations that put us in a position where we simply can’t be expected to be fair and objective.

“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”

45. The Forer effect (the Barnum effect)

We have a tendency to fill any gaps in the information given to us using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that vague statements might apply not just to ourselves but to a large number of other people, too.

“I read my horoscope yesterday and the information was uncannily accurate, so I’m convinced that there are things about the cosmos that influence our lives in ways we can’t yet understand.”

46. Cognitive dissonance

We have a tendency to sort information based on our existing cognitive schemas (see also the Forer effect). One outcome of this is that we tend to disregard any information that sits badly with what we already believe while absorbing anything that confirms our existing beliefs with ease.

“The Earth is flat and I haven’t seen any credible evidence to the contrary.”

47. The hostile media effect

This can be seen as media science’s equivalent of the psychological backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased in favour of their opposition. The effect will be even stronger if the individual believes that there’s a silent majority out there who are particularly susceptible to erroneous or misleading media coverage.

“I know that the media is telling me that I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”

48. Cherry picking (the fallacy of incomplete evidence)

This fallacy is closely related to the Texas sharpshooter and the fallacy of division. Especially online, it fuels much of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make almost any case seem plausible. As an example, see this fragmented yet powerful attempt by the author Rupert Furneaux to cast doubt on the existence of Napoleon Buonaparte, one of the most famous characters in history:

1. The name Napoleon is just a variation of Apoleon or Apollo, and as God of the Sun, he was named Buonaparte, which means “the good part of the day” (when the sun shines).

2. Just as Apollo was born on the Mediterranean island of Delos, Napoleon was born on the Mediterranean island of Corsica. Napoleon’s mother Letitia can be identified as Leto, Apollo’s mother. Both names mean joy and happiness, signaling the sun keeping the night at bay.

3. Letitia had three daughters — as did Leto, Apollo’s mother.

4. Napoleon’s four brothers represent the four seasons. Three of his brothers became kings, except for one brother who became Prince of Canino (derived from ‘cani,’ white, winter, aging).

5. Napoleon was driven out of France by Northern armies, just as Apollo, the Sun God, was driven away by the North Wind.

6. Napoleon had two wives, as did Apollo. They represent the Earth and the Moon. Apollo never had any children with the Moon, but the Earth gave him a son, representing the fertilization of all green plants on Earth. Napoleon’s son was allegedly born on the 21st of March, the spring equinox, when the plane of Earth’s equator passes through the centre of the Sun.

7. Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls).

8. Napoleon’s twelve generals are symbols of the twelve signs of the zodiac, and four of them represent North, West, South, and East.

9. Napoleon, the Sun Myth, always conquered in the South but was always defeated by the cold winds of the North. Like the Sun, Napoleon rose in the East (he was born on Corsica) and set in the West (he died on St. Helena).

49. The spiral of silence

Most social animals harbour an instinctive fear of isolation, and in-groups maintain their cultural stability partially by excluding individuals with non-conforming opinions or behaviours. This can create a culture in which group members self-censor both their opinions and their behaviours by going silent. I’ve written about this effect in Woke in business: How to navigate the moral war of political correctness.

“My opinions are perceived as wrong and it’s better for everyone if I stay silent.”

50. The yes ladder

This is a marketing exploit where the persuader aims to get you to say yes to something substantial (“big ask”) by methodically getting you to say yes to something smaller first (“small ask”).

“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter and via the newsletter I downloaded a free photobook with pink umbrellas — and now I own five pink umbrellas.”

51. Bystander effect

People are socially less inclined to offer support or aid if there are lots of other people around who could do it instead. I’ve written about this very special psychological effect in A web of lies: The death of Kitty Genovese and Conversion theory: the disproportionate influence of minorities.

“Everyone cares deeply about personal safety, so everyone will download our new CSR-app to help each other.”

52. Reciprocation effect

If someone is nice or generous towards us, we often tend to feel a sort of obligation to reciprocate. While this, in itself, is a beautiful and normal part of human behaviour, it’s something that special interests can take advantage of.

“I can’t believe the car broke down so fast — the guy I bought it from threw in so many extra features.”

53. Commitment and consistency

Once we commit to something, we invest a part of ourselves in that decision. This makes it harder for many of us to abandon such commitments, because it would mean giving up on a part of ourselves. This bias is, of course, closely related to yes ladders, declinism, appeal to tradition, and sunk cost fallacy.

“I’ve made my decision and therefore I’m sticking with it.”

54. The fallacy of social proof

This fallacy is the commercial extension of the bandwagon effect; when shown social proof, we take comfort in decisions made by others. Ideally, we should always make sure that reviews and displays of engagement are relevant (and real) before making any decisions for ourselves, but this doesn’t always happen.

“Their product seems to have lots of happy users, so the risk of getting scammed is low.”

55. Liking and likeness

“We most prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion. This bias is closely related to the in-group fallacy described above. According to Social Media Examiner, a few extensions of this principle are:

We like people who are similar to us.

We like people who compliment us.

We like things that are familiar to us.

Cooperation toward joint efforts inspires increased liking.

An innocent association with either bad or good things will influence how people feel about us.

Physical attractiveness creates a halo effect and typically invokes the principle of liking.

“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust everything he says?”

56. The appeal to authority

It’s generally difficult for us to distinguish between perceived authority and real authority. Many internet companies use testimonials by people with fancy titles that their audiences have never heard of — and it works. This is closely related to the fallacious appeal to authority.

“This product was recommended by several leading doctors and that’s true because the ad said so.”

57. The principle of scarcity

Most of us are notoriously scared of missing out (FOMO, fear of missing out). This makes us perceive things as more valuable the rarer they are. I’ve also written about this in Craving more — creating an illusion of scarcity.

“I’m so happy that I managed to snag that pink unicorn umbrella before the discount ran out!”

58. Loss aversion

The pain of losing can be psychologically twice as powerful as the joy of winning. This often leads us to take disproportionate risks to avoid losing, compared to the risks we’re prepared to take to win. This is, of course, closely related to commitment and consistency and the sunk cost fallacy.

“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
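Tversky and Kahneman later formalised this asymmetry in prospect theory with an S-shaped value function. As a sketch, using commonly cited parameter estimates (α ≈ β ≈ 0.88 and a loss-aversion coefficient λ ≈ 2.25, from their later experimental work rather than the 1974 paper in the reference list):

```latex
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \quad \text{(gains)} \\
-\lambda \, (-x)^{\beta} & \text{if } x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```

With λ ≈ 2.25, the displeasure of losing 100 is roughly 2.25 times the pleasure of winning 100, which is why we fight so hard to avoid realising a loss.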

Cover photo by Jerry Silfwer

Reference List

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk costs. Organizational Behavior and Human Decision Processes, 35, 124-140.

Cialdini, R. (2006). Influence: The Psychology of Persuasion (Revised Edition). New York: Harper Business.

Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.

Forer, B. R. (1949). The Fallacy of Personal Validation: A classroom Demonstration of Gullibility. Journal of Abnormal Psychology, 44, 118-121.

Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121–1134.

Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49-68.

Simon, H. A. (1957). Models of man. New York: Wiley.

Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178-181.

Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183-206.

Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.

West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930–941.
