
Countering false information – key terms

This resource provides explanations of some of the key concepts and terms encountered when learning to counter false information. Some of these terms have different meanings in different contexts, so they may require explicit teaching. 

Supporting resources 

The following resources provide additional information and real-life examples of the terms and concepts listed below. 

  • Countering false information – article

  • Online algorithms, biases and incorrect information – article

  • Recognising false information online – article

  • Misinformation, disinformation and bad science – article

  • Making sense of our information environment – article

  • Common logical fallacies – interactive

  • Spotting misinformation – student activity

  • Manipulation tactics – create an inoculation campaign – student activity

  • Examples of bad science and countering false information – download

  • Algorithms 

  • Bias 

  • Clickbait 

  • Confirmation bias 

  • Debunking 

  • Disinformation 

  • Echo chamber 

  • Filter bubble 

  • Inoculation theory 

  • Logical fallacies 

  • Malinformation 

  • Misinformation 

  • Op-ed 

  • Prebunking 

  • Rabbit hole 

  • Selection bias

  • Sticky

Algorithms

An algorithm is a set of rules used for calculation or problem solving, especially by a computer. In the context of false information, the algorithms that matter are those used by search engines and social media platforms – AI-enhanced algorithms that analyse our habits and interests. They determine what we see and don’t see, which can be used to manipulate how we feel and how we think. Learn more about algorithms in the article Online algorithms, biases and incorrect information.
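The idea above – an algorithm ranking content by a user’s recorded interests – can be illustrated with a toy sketch. This is not any real platform’s algorithm; all the names and weights here are invented for illustration.

```python
# Toy sketch (hypothetical, not any real platform's code): rank posts so
# those matching a user's recorded interests appear first in the feed.

def rank_feed(posts, interests):
    """Order posts by how strongly their topics match the user's interests."""
    def score(post):
        # Sum the user's engagement score for each topic the post covers.
        return sum(interests.get(topic, 0) for topic in post["topics"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Cat video", "topics": ["cats"]},
    {"title": "Election news", "topics": ["politics"]},
    {"title": "Kitten rescue", "topics": ["cats", "news"]},
]
# Interest scores built up from past clicks and likes (invented values).
interests = {"cats": 5, "news": 1}

for post in rank_feed(posts, interests):
    print(post["title"])
```

Even this tiny example shows the key point: content the user has not engaged with before ("Election news") sinks to the bottom of the feed, which is how personalisation can narrow what we see.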

Bias

An opinion, feeling or trend that is preconceived or unreasoned. Bias can be directed at individuals, groups, projects or actions. Confirmation bias and selection bias are forms of bias. Learn more about bias in the article Misinformation, disinformation and bad science.

Clickbait

Eye-catching and often inflated headlines and statements that encourage people to click on a weblink.

Confirmation bias

The tendency to search for, interpret and remember information in a way that confirms pre-existing beliefs while ignoring or discounting evidence that contradicts them.

Debunking

Exposing a false or exaggerated claim. Debunking can be a conundrum: it draws attention to the false claim, which algorithms can then amplify. Debunking is an important task, but we need to consider the tactics we use.

Disinformation

False information that is intentionally created and shared in print or online media to harm someone or something. This is usually done to influence people to think a certain way.

Image: Facebook post of a cat with a love heart emoji (the image has been altered to remove personal information).

User behaviour analytics

The user behaviour analytics (UBA) process involves social media sites monitoring and collecting data on how users interact with the platform. This can include users’ actions, clicks and engagement levels. It enables the platform to personalise content for the user. If you like and engage with a lot of posts about cats, you’ll see a lot more of them in your personalised feed.

Rights: The University of Waikato Te Whare Wānanga o Waikato
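The user behaviour analytics process described above can be sketched as a toy engagement counter. This is a hypothetical illustration – the class, action names and weights are invented, not how any real platform stores data.

```python
# Hypothetical sketch of user behaviour analytics: tally each interaction,
# then use the tallies to decide what topic to show the user more of.
from collections import Counter

class BehaviourTracker:
    def __init__(self):
        self.engagement = Counter()

    def record(self, topic, action):
        # Weight stronger signals (likes, shares) more than plain clicks.
        weights = {"click": 1, "like": 2, "share": 3}
        self.engagement[topic] += weights.get(action, 1)

    def top_interest(self):
        """Return the topic with the highest engagement tally."""
        return self.engagement.most_common(1)[0][0]

tracker = BehaviourTracker()
tracker.record("cats", "like")      # cats: 2
tracker.record("cats", "share")     # cats: 5
tracker.record("politics", "click") # politics: 1
print(tracker.top_interest())
```

A few likes and shares quickly dominate the tally, so the feed tilts towards that topic – the cat-post effect described in the caption above.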

Echo chamber

An online, social or media environment in which the same opinions are repeatedly promoted in a way that does not expose those reading or watching to opposing views. The term echo chamber was first used to describe an enclosed space that amplifies and reflects sounds.

Filter bubble

A filter bubble is one’s ‘own little world’ online, where like-minded people echo each other. Based on what individuals like, share and engage with online, they are selectively shown information that matches their existing views.

Inoculation theory

The inoculation theory is analogous to a medical inoculation – the way in which a vaccine uses a weak dose to trigger an immune response to fight a future infection. Using inoculation theory as a strategy to recognise misinformation ‘immunises’ or helps us build resistance to deceptive or manipulative influences or information. 

Logical fallacies

Logical fallacies are intentional or unintentional mistakes in reasoning that undermine the validity of an argument. They often involve flawed logic, misleading tactics or irrelevant points. They can make an argument appear convincing even though it isn’t. Such fallacies can be disproven through reasoning. 

Malinformation

Information that is intentionally shared in print or online media to harm someone or something but that, unlike disinformation, is genuine rather than false.

Illustration: Navigating false information – cars, roads and signs. “When it comes to navigating false information on the internet, it’s like everyone’s driving cars without a licence.” Illustration by Gavin Mouldey.

Rights: Crown 2019

Misinformation

Information that is incorrect but is shared in print or online media without the intent to harm someone or something. Misinformation is different from disinformation and malinformation.

Op-ed

Opinion-based article. ‘Op-ed’ refers to ‘opposite the editorial page’ as the traditional placement of op-eds was opposite the editor’s column in a printed newspaper. 

Prebunking

Prebunking is a form of inoculation theory: it exposes people to weakened, harmless examples of misinformation before they encounter the real thing. It is the opposite of debunking – exposing false claims after they spread. Debunking draws attention to false claims, which digital algorithms can then help spread widely. Algorithms also mean debunked, fact-checked information is unlikely to reach the people most likely to believe the misinformation.

Rabbit hole

A term used when spending a lot of time researching, watching or reading online. Websites and social media platforms are designed to keep users engaged. Used positively, the term denotes the tenacity to follow the twists and turns of a difficult topic. Used negatively, it denotes distraction or spending too much time on or becoming too involved in something. Its origin is the idiom ‘down the rabbit hole’ from Alice’s Adventures in Wonderland, in which Alice falls down a rabbit hole into an unreal world.

Selection bias

When individuals or researchers focus on specific data or examples that support a particular conclusion while ignoring or excluding other relevant information.

Sticky

When information becomes more and more familiar. When we hear it from multiple sources, we become less sceptical of it, and it becomes sticky. Confirmation bias also helps information or ideas stick.

Related content

The Connected article Fake facts defines misinformation, malinformation and disinformation – how they are used in online media, with examples of each. It also delves into the human brain and how it deals with information and fake news.

Connected also features the article Amazing algorithms and their behind-the-scenes work.

Bots vs Beings – the impacts of AI on life and work has text and video of a panel discussion of the question: how will AI impact your life and work?

In the article Fraudulent study: MMR vaccine controversy, learn about a real-life example of how an unethical and later retracted scientific paper has had ongoing ramifications for accurate information on vaccinations.

Useful links 

Netsafe has a useful resource Misinformation and disinformation that explains the different types of false information and why we can be misled by it.

The Disinformation Project operated between 2020 and 2024 as New Zealand’s only independent research group providing best-practice monitoring, research and consulting on disinformation and its impacts. Several of its plain-language reports have been archived here.


Published: 30 June 2025



The Science Learning Hub Pokapū Akoranga Pūtaiao is funded through the Ministry of Business, Innovation and Employment's Science in Society Initiative.

Science Learning Hub Pokapū Akoranga Pūtaiao © 2007-2025 The University of Waikato Te Whare Wānanga o Waikato