Making sense of our information environment
We all deserve a healthy information environment – one where the best knowledge we have from science, mātauranga Māori and lived experiences is easily available, easily understood and trusted, so that all people, not just some, can use it to improve their lives in ways that matter to them. False information gets in the way of a healthy information environment, making it harder to trust the information we are exposed to.

False information isn’t new
Fake news and sensationalised stories are not new.
Look at this cartoon from 1894. What do you think the cartoon is saying about media at that time? How do you think things have changed?
Cartoon illustration by Frederick Burr Opper (1857–1937)
False information isn’t new, but digital media has made it easier for false information to spread faster. To understand why this is, we first have to learn about how humans process information.
Many of us assume that we process information through facts and logic. This is the idea that, when given the right information and facts, people will absorb them, apply logic and come to the right decision.
What cognitive science tells us is something quite different. How people process, understand and respond to information is determined mostly by these things:
Fast-thinking brains – unconscious cognitive processes. These are processes your brain undertakes automatically, like understanding language or recognising faces, without you even realising it’s happening.
Shared or cultural mental models – our shared beliefs about our society and how things work.
Framing – the ways in which information is presented to people that affects how they respond.
Fast-thinking brains
Psychologist Daniel Kahneman describes how we process information in his book Thinking, Fast and Slow. He outlines how, over millions of years, we have evolved two systems of thinking:
A fast-thinking system, which is in use most of the time and helps us process vast amounts of information quickly.
A slower-thinking system, which works more slowly and less reactively. We don’t use our slow-thinking system often because it takes a huge amount of our energy to use.

Fast-thinking brain
Our fast-thinking brain prioritises values and beliefs over fact and logic.
This can be an immense challenge for having a productive public conversation about complex social, environmental and scientific public policy issues.
Illustration by Megan Salole/Big Picture Thinking from How mindset and narrative shifts can enable change: a briefing paper.
Our fast-thinking system means most of the time we take shortcuts to assess new information. We first filter new information through these things:
Our existing understanding and beliefs – what we already know and believe to be true.
Our values – the things that deeply matter to us and that motivate our actions.
Our feelings – how the information makes us feel both in emotions and body feelings (like butterflies in our tummy or a sinking feeling).
Other mental shortcuts (sometimes described as biases) – we have lots of shortcuts that help us make quick sense of the world. For example, if we know and trust someone or an institution (like a government department or community leader), we’re more likely to accept the information they share, or if a piece of information is familiar to us, we’re less likely to question it.
Once information has been through these filters, we judge the ‘truth’ and relevance of that information, then we make a decision about it. We will then backfill our decision with logic and facts that fit.
This process happens in the blink of an eye, and we’re mostly not even aware it is happening. This fast-thinking process is also really practical – imagine if we had to think carefully about everything we see and experience in a day. That would be exhausting and overwhelming.
This information-processing system explains why just giving people ‘correct facts’ isn’t enough to shift how they think about an issue. This is especially true if the facts we give are in conflict with the false information they currently believe. Their fast-thinking brain will reject facts that don’t feel right to them or align with their existing ways of thinking and reasoning about the issue.

The fast-thinking brain takes shortcuts
The fast-thinking system helps us to take shortcuts and process information quickly. This is very useful for dealing with the vast amounts of information we have to process each day. However, when we use our fast-thinking brain, we are prone to reject facts that don’t feel right to us or don’t align with our existing ways of thinking and reasoning about an issue.
Illustration by Megan Salole/Big Picture Thinking for The Workshop.
Cultural or shared mental models
Mental models are deep, unconscious frameworks we use to make sense of the world. They are sometimes called mindsets. We get them from our culture, our social environment and from the information we consume. Shared mental models are those that appear and reappear in patterns across our communications. Shared mental models provide us with explanations about how a problem has happened, who caused it and what the solutions should be.
When shared mental models are strong – for example, when they are constantly repeated and reflected in our lives and culture – they become harder to shift or change. Repetition creates neural pathways in our brains that make fast thinking along these shared mental models smoother and more comfortable. They work like muscles – the more we use them, the stronger they become.
For example, there is a strong shared mental model that, when we are asked to think about ‘transport’, the things that pop most easily to mind are roads and cars. That is because, across our own experiences, in the media and in our culture, cars and roads are understood to be at the centre of ‘transport’. This mental model, combined with our fast-thinking brains, makes it hard to think about other ways we could move around our towns and cities, like walking, riding or taking public transport. It makes it even harder for people to consider changing how we use our road spaces to allow for these different ways of moving.

Mental models
Mindsets or mental models are deep, unconscious models we use to make sense of the world. Shared mindsets provide us with implicit explanations about how a problem has happened, who caused it and what the solutions should be.
This means we need to work with them if we want people to understand and support shifts to systems, structures, policies and practices – transforming the way people currently do things.
Illustration by Megan Salole/Big Picture Thinking for The Workshop.
Framing
Framing is the conscious and unconscious ways we present information. It is the last ingredient in how we process information. Like a window, a frame gives people a particular view on an issue. It helps people focus on the things we want them to think about. It obscures or blocks ways of thinking that are outside the frame.
For example, if we ask someone to think of the African desert, they think of everything associated with those words in both our collective thinking and their individual experience of it. They are unlikely to think about a polar bear as it doesn’t fit the frame.
We cannot avoid framing. It is present in all communications because of how our minds function. How an issue is framed can spark different thinking and reasoning in our fast-thinking brains. It is one way we can influence how people see an issue.
What makes false information sticky
False information has always been somewhat sticky because it often:
taps into our existing shared mental models, making it seem reasonable and familiar
uses repetition to make it stronger and more dominant in the information environment.
With the spread of digital communication platforms, false information has become even more sticky and easy to spread:
Social media algorithms are designed to keep our attention and will continue to feed us more information that is like what we’ve paid attention to in the past, creating a repetition loop.
Digital media content is designed to create stronger emotions that draw us in to engage, spend longer on the site and share more widely. This taps into our fast-thinking system, often without us knowing.
Sharing information is as easy as a click of a button.
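The repetition loop described above can be sketched in code. This is a deliberately simplified toy model, not how any real platform works – real recommendation systems are proprietary and far more complex. Here, a hypothetical `rank_feed` function simply orders posts by how often the user has engaged with each topic before, so a single early click compounds over time:

```python
from collections import Counter

TOPICS = ["sport", "politics", "health", "music", "conspiracy"]

def rank_feed(history: Counter, candidates: list[str]) -> list[str]:
    """Rank candidate posts so topics the user engaged with most come first."""
    return sorted(candidates, key=lambda topic: -history[topic])

def simulate(rounds: int = 20) -> Counter:
    """Simulate a user whose attention is captured by the top of the feed."""
    history = Counter()
    history["conspiracy"] += 1  # one early click on a misleading post
    for _ in range(rounds):
        feed = rank_feed(history, TOPICS.copy())
        history[feed[0]] += 1  # the user tends to click whatever is shown first
    return history

history = simulate()
# The single early click compounds: the feed keeps surfacing the same topic.
print(history.most_common(1)[0][0])  # -> conspiracy
```

Because the feed ranks by past engagement and the user mostly clicks what is ranked first, the loop reinforces itself – the same dynamic that makes repeated false information feel increasingly familiar and believable.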
Supporting resources
The following resources provide additional information and learning around countering false information online:
Countering false information – article
Countering false information – key terms – article
Online algorithms, biases and incorrect information – article
Recognising false information online – article
Common logical fallacies – interactive
Examples of bad science and countering false information – download
Activities
In this activity, students are presented with statements containing logical fallacies. Through discussion or discovery, they work through the statements, identify specific vocabulary or characteristics and match the statement with a common logical fallacy technique.
In Manipulation tactics – create an inoculation campaign, students watch videos and use a template to analyse the inoculation messages the videos explain. Students then use the template to plan and create their own inoculation campaigns.
Related content
In the article Fraudulent study: MMR vaccine controversy, learn about a real-life example of how an unethical and later retracted scientific paper has had ongoing ramifications for accurate information on vaccinations.
The Connected article Fake facts defines misinformation, malinformation and disinformation – how they are used in online media, with examples of each. It also delves into the human brain and how it deals with information and fake news.
Climate change, science and controversy looks at fake facts – from Galileo to the present.
Information on wicked problems – those that are incredibly complicated to solve – can be undermined by fake facts. Use one of these as a context for exploring misinformation, malinformation and disinformation:
1080 – a wicked problem – this article discusses how to use the science capabilities to check the objectivity and/or accuracy of information.
The ClimateViz citizen science project needs help interpreting climate change graphics to help counter misinformation and support scientific communication.
The Hub has an Ethics thinking toolkit and there are several related articles designed specifically to support teachers in exploring ethical thinking with their students. These include Frameworks for ethical analysis and Teaching ethics.
Use the article The ‘Participating and contributing’ strand to find more examples of socio-scientific issues/resources and how to include them into a science programme.
Useful links
The Workshop has a large repository of plain-language resources that look at cognitive science and communicating difficult topics. Their toolkits give examples of how we can reframe language to help people sort fact from false information.
Dr Jess Berentson-Shaw, Director of Narrative Research and Strategy at The Workshop, has published A matter of fact: Talking truth in a post-truth world. The book is stocked at selected bookstores or can be purchased online from Bridget Williams Books.
Psychologist Professor Daniel Kahneman studied how people make choices and decisions and why we think in ways that aren’t always logical. He is credited with revolutionising thinking on cognitive biases and behavioural economics. His bestseller Thinking, Fast and Slow unpacks ideas on thinking. It is stocked in bookstores and available at libraries. Listen to Dr Kahneman speaking about fast and slow thinking in this lecture.
Acknowledgement
This resource was written by Julie Fairfield, Senior Narrative Advisor at The Workshop. The Workshop are experts in framing – the conscious and unconscious choices people make about how to present an issue. They conduct research and draw on data and insights from various disciplines, including psychology, linguistics and oral storytelling. Their work on false information draws specifically on the work of Dr Jess Berentson-Shaw in her book A matter of fact: Talking truth in a post-truth world.
The Workshop shares their work under a Creative Commons Attribution Non-Commercial Share Alike International Licence, encouraging people to pick up and use it for non-commercial purposes.
