Interactive

Common logical fallacies

This interactive image map provides an overview and examples of some common logical fallacies that are used to undermine and distort facts, information and debates. Understanding logical fallacies will help students recognise them in news media, social media and other information sources. Avoiding fallacies can also improve a student’s ability to produce sound arguments of their own.

Click on the labels for a definition of each logical fallacy, ways to spot it and examples.

There are wraparound resources to support the information in this interactive. The article Countering false information provides pedagogical suggestions and links to the New Zealand Curriculum. The article Misinformation, disinformation and bad science looks at how intentional disinformation undermines facts. Examples of bad science and countering false information provides links to New Zealand and international sources – some of which are briefly described in this interactive.

Background illustration by Megan Salole/Big Picture Thinking for The Workshop.

Note: Some identifying information has been redacted from the images in the pop-up boxes. However, all copy remains verbatim (as it was written).

Transcript

False dichotomies

A false dichotomy – also known as the either/or fallacy – occurs when a limited number of choices are presented as mutually exclusive when, in reality, more options are available. This type of argument often reduces complex issues to a binary decision.

An example is this statement in response to vaccination calls during the COVID-19 pandemic: “If the COVID vaccines work, why are vaccinated people getting COVID?” This implies that vaccines must either provide complete immunity or be considered ineffective.

How to spot a false dichotomy

  • Ask if there are other options. Are there alternatives being ignored?

  • Challenge the exclusivity. Do the presented options truly cover all possibilities?

  • Consider the context. Does the complexity of the situation suggest that more options are likely to be present?

  • Look for exaggeration. Are the two presented options overly extreme?

The picture from this article suggests a binary perspective: either genetically modified (GM) crops are a complete solution to climate change or they are of no value at all. In reality, GM crops may contribute to climate change mitigation as part of a broader strategy that includes various agricultural and technological approaches.

Image: ARC2020

Scapegoating

Scapegoating is when an individual or group is selected to take the blame for a problem.

An example might be blaming a single elected official – like a mayor – or an organisation for an uncontrolled wildfire.

Complex problems usually have complex causes. A single person or organisation may have contributed to a situation but can’t be held accountable for every factor behind a problem.

How to spot scapegoating

  • Watch for a complex problem being blamed on one specific cause or entity without sufficient evidence.

  • Is the information overlooking broader context? Ask if the argument ignores a larger, more complex web of factors contributing to the issue.

  • Is there a lack of supporting evidence? Do the claims use anecdotes, stereotypes or generalisations rather than verifiable data?

  • Look for attempts to divide or distract. Does the argument create an us versus them narrative, diverting attention from factors actually responsible?

In the case of a wildfire, multiple factors will often contribute to the fire and to the work to control it. These include prior fire mitigation work and planning, fire behaviour, weather and climate conditions, and the building materials and vegetation in the area on fire.

Look at the pictured headline from this article. What does it suggest?

Now look at this alternative article. What additional information does it provide? Why might the two articles and headlines differ so much?

Image: Breitbart News

Ad hominem attacks

Ad hominem attacks are when the person, group, organisation or business making an argument is attacked rather than their argument. An ad hominem attack can use past history, bring up associations with others (a ‘guilt by association’ type of argument) or attack the personality or personal attributes of an individual.

Ad hominem attacks are popular in politics and widespread in media ‘comments’ sections.

How to spot an ad hominem attack

  • Look for personal criticism instead of refutation of the argument/issue. Is an individual’s character, behaviour or motives the focus rather than the argument they’ve made?

  • Watch for irrelevant insults. Does the response include name-calling, mocking or derogatory language aimed at an individual rather than the issue?

  • Are there association-based attacks? Does the counterargument criticise someone based on their group, affiliation, personal life or associates rather than their argument?

  • Does the counterargument bring up unrelated past behaviour or actions to undermine the person’s argument?

Pictured is an example of the ‘name-calling’ common in ad hominem attacks. The video and transcript can be viewed here. Note that the video discussion also has multiple examples of logical fallacies at play.

The news network apologised for their guest’s comment.

Image: Media Matters for America

Straw man arguments

In a straw man argument, the position being refuted is not actually the argument or issue under discussion.

One way to do this is to distort or oversimplify an argument, presenting it as weaker or more extreme than it actually is. This weakened version is then criticised or attacked, allowing the person using the straw man argument to appear victorious without addressing the actual argument.

The ‘straw man’ refers to a scarecrow stuffed with straw. The term may have its roots in medieval combat training, where fighters practised against straw-filled dummies. These straw men were easy targets, much like the straw man argument creates an oversimplified or distorted version of an argument that is easier to defeat.

How to spot a straw man argument

  • Clearly identify the argument/debate, then ask if it has been changed.

  • Is the argument oversimplified?

  • Is the argument blown out of proportion, making it seem unreasonable or absurd?

  • Does the argument overgeneralise the issue, making it easier to attack?

  • Instead of addressing the main point, does the response target a minor or irrelevant part of the argument?

An example of a straw man argument is this response to those who support clean energy: “People who advocate for renewable energy want to shut down all fossil fuel plants tomorrow.” This is a straw man argument because it suggests supporters of renewable energy are unreasonable, when most typically argue for a gradual transition rather than an immediate shutdown.

Pictured is a late American politician who attempted to refute global warming with a snowball as evidence. This argument misrepresents climate science. Climate scientists do not claim that global warming means it will never snow or be cold. Instead, they explain that climate change refers to long-term global temperature trends, not short-term weather fluctuations.

Image: Public domain

Oversimplification

This is when issues or arguments are made to appear simpler by ignoring any complexities involved. Sometimes oversimplification occurs because people making arguments are simply not aware of the related complexities.

How to spot an oversimplification

  • Is there missing context? Does the argument ignore critical details or broader contexts that make the issue more complex?

  • Look for red flag phrases like “the only reason”, “all you need to do” or “if we just”.

  • Is the argument or solution particularly simplistic? Are unintended consequences of the proposed argument or solution ignored?

  • If this solution is so simple, why has it not been proposed or done before? Look for evidence of the solution having been tried. What were the results?

  • Interrogate the source. Is the person or organisation suggesting a solution or argument an expert in the area under discussion?

When you see any single-fix solution for climate change, you are seeing oversimplification in action. No single option – solar power, electric vehicles or a geoengineering method – will alone solve this complex problem. Climate change solutions will involve many different technologies, tools and changes in human behaviour.

The article for the pictured headline can be found here.

Image: Wired

Misrepresentations

This is when someone twists, alters or presents a false or misleading version of another’s argument, position or evidence to distort understanding of the issue.  

How to spot a misrepresentation

  • Look for differences between what is being said (or criticised) and the original position or evidence. 

  • Identify the issue and then check if the counterargument has been oversimplified.

  • Is there any misquoting or selective use of evidence? 

  • Has any of the detail been lost – for example, have certain quantifying words like “in some cases” or “possible” been removed? 

  • Has the argument been reframed so it could mislead an audience?  

For example, the image refers to a headline about an industry report that disputes the estimated number of Archey’s frogs in the Coromandel. The Department of Conservation administers the Native Frog Recovery Plan. It estimates a population of 5,000 to 20,000. The report’s author used modelling ‘based on sparse data’ (his words) to predict population numbers of more than 50 million in a pilot study. Read more about the data and how it was presented in this Stuff article.

Image: Newsroom

Red herring

A red herring is an argument that aims to divert attention by bringing in irrelevant distractions or unrelated arguments. The term’s origin is thought to come from the use of strong-smelling fish to train scent hounds. The smell would divert the dog’s attention from the original scent it was meant to track. 

Politicians across the globe are particularly fond of using red herrings to distract from difficult issues. For example, a politician or political group may complain that they’ve been badly insulted or subjected to discrimination to distract from an unpopular policy decision, or they may bring up an opponent’s historical scandal when they’re embroiled in their own scandal. 

How to spot a red herring

  • Determine the main argument. What is the issue being discussed? Are the responses about the central argument? 

  • Assess relevance. Does the diversion relate directly to the issue or provide meaningful evidence? 

For example, in a debate about climate change, a politician might avoid addressing carbon emissions by talking about how “job creation is more important right now”. This is a red herring because it shifts focus from the topic (climate action) to a different issue (employment). 

A similar example might be a political debate on employment where another politician says: “We cannot consider employment issues when the world is on fire and there will be no planet for people to be employed on.” 

Image: annaorlova23/123RF Ltd

Single cause fallacy

The single cause fallacy is when one thing is blamed for an issue that has many contributing factors.  

How to spot a single cause fallacy

  • Consider the complexity of the issue. Is the issue multifaceted, involving various interconnected factors?

  • Like scapegoating, watch for a complex problem being reduced to one specific cause or blamed on one entity without sufficient evidence.

  • Is the information overlooking broader context? Does the argument ignore the larger, more complex web of factors contributing to the issue?

This statement is an example of a single cause fallacy: “Obesity is caused solely by poor personal choices.” It ignores that obesity is influenced by many factors, including genetics, socioeconomic status, access to healthy food, psychological issues and lifestyle habits.

In the image, a person on social media argues that obesity is caused by poor self-control. What is the other common logical fallacy they use in the first part of their argument? 

Image: The University of Waikato Te Whare Wānanga o Waikato

Cherry picking

Cherry picking is the selective reporting of research or data. This can range from a scientist not using all their data to inform their conclusions to individuals and organisations selecting only the data and research that supports their point of view.

Another example of cherry picking is the use of anecdotes. Anecdotes are usually personal experiences and often represent only a single example. In this case, cherry picking is also called the anecdotal fallacy.

How to spot cherry picking

  • Is there a source for the data or research that is being referred to? Does the geographical region or the date make sense for the context?

  • If you can, check the original data and scientists’ conclusions. Were the scientists’ conclusions informed by all the data?

  • Has the person writing/reporting on the research used all the data?

  • What is the bigger picture? Are there any other factors being ignored?

  • Is any evidence missing? Search out contrary claims.

Cherry picking/the anecdotal fallacy is at work if you hear someone dismiss the dangers of cigarette smoking with a statement like this: “Smoking dangers are overstated. My Dad was a two-pack-a-day bloke and he lived until he was 99.”

In the pictured article, British American Tobacco New Zealand signals its responsibility and commitment to stopping black market tobacco. It states: “Globally, illegal tobacco is a growing trade – some research indicates this account[s] for 11.2% of global tobacco consumption.”

New Zealand data suggests the actual use of illicit tobacco across 2012–2022 ranged from 2.9% to 17.51%. The Ministry of Health report states: “Historically, there is little evidence that significant increases in the illicit tobacco trade in NZ have taken place in response to past tobacco control measures.”

See more links and information on cherry-picked data about illegal tobacco trade in Aotearoa in the download Examples of bad science and countering false information.

Some questions to consider:

  • Why do you think the tobacco company chose to use global data and not New Zealand-specific data about illegal tobacco?

  • Why do you think the article then uses data about the seizure of illicit tobacco at the border in New Zealand?

  • Why does the New Zealand report detail the methods it used and the caveats around the methods and data?

Image: BAT New Zealand

Jumping to conclusions

This means forming a general conclusion without considering all the variables involved.

A well-known example is the controversy that erupted from the publication of a case study linking the measles, mumps and rubella (MMR) vaccine to autistic behaviours in young children.

The paper was retracted by the publisher, and the authors were found guilty of deliberate fraud (they cherry picked data that suited their case and falsified facts).

However, media and parents jumped to the conclusion that vaccines cause autism, which resulted in a fall in children being vaccinated against childhood diseases. Read more in the article Fraudulent study: MMR vaccine controversy.

How to spot jumping to conclusions

  • Ensure that findings are based on robust evidence and replicated in independent studies.

  • Consider alternative explanations and maintain scepticism about extraordinary claims.

  • Look to see if the results are overstated.

The pictured headline from this news article jumps to the conclusion that the AstraZeneca vaccine was withdrawn due to dangerous side effects. Try some lateral reading – such as this BBC news article, which states the withdrawal was a commercial decision due to lack of demand. Further reading shows that the AstraZeneca vaccine was not the only one to cause blood clots and that research indicates this side effect was “extremely rare”.

Image: The Australian

Impossible expectations

This occurs when someone dismisses an argument, claim or solution by setting unrealistic or unattainable standards of proof or perfection. It undermines valid ideas by demanding evidence or results that cannot reasonably be achieved, even when there is sufficient evidence or practical merit.

How to spot an impossible expectation

  • Unreasonable demands. Look for arguments dismissing claims because they don’t meet impractical or perfectionist standards.

  • Does the argument ignore that most conclusions or solutions are based on probabilities, trends or practical trade-offs – not absolutes?

  • Moving goalposts. Does the required proof or standard of success keep changing to avoid accepting the argument?

The pictured post on a social media account asks climate modellers to “predict one stock accurately for one year” to prove their climate models can be believed. The demand that a climate modeller prove their expertise by modelling an area outside their field is unrealistic.

Image: Twitter (now ‘X’) user collection, all rights reserved

Rights: The University of Waikato Te Whare Wānanga o Waikato
Published: 30 June 2025