The illusion of truth
Cognitive biases
Hey all, today we’re continuing our ‘From the archives’ series with our fourth article from the month of April: illusory truth effect.
Welcome to Human Nature, the illustrated psychology newsletter. If you’re enjoying this publication, please help it reach more people by liking, commenting, sharing or subscribing.
Illusory truth effect
What it is: The illusory truth effect, or the illusion of truth, is the cognitive bias whereby we come to believe false information simply because we have been exposed to it multiple times, sometimes even when we initially know it to be false.
For instance, you might have heard many times that moss grows on the north side of trees. Later, you learn that this is in fact a myth, but when one day you get lost in a forest you still use this information to orient yourself, because you have a feeling that there might be some truth to it.
See also: mere exposure effect or familiarity effect (we end up liking something simply because we have been exposed to it repeatedly).
The illusory truth effect is especially relevant in the age of social media, where repeated exposure to misinformation can shape our opinions without us being aware of it.
How it was discovered: In a 1977 experiment, Hasher, Goldstein and Toppino gave college students a list of statements, both true and false, and asked them to assess how accurate each statement was. The statements were about topics of which the students had no prior knowledge. The students came to the lab three times, with two weeks between each visit. Crucially, some of the statements were repeated at each visit while others weren’t. They found that students were more likely to rate a statement as accurate when they had seen it repeatedly.
Later, this experiment was repeated with statements participants did have prior knowledge about. They still demonstrated the illusory truth effect.
Recent studies have shown the role the illusory truth effect plays in people believing false information circulating on social media.
How it works: By some estimates, the average human makes 35,000 decisions every day. Since we can’t consciously process this level of information, we rely on heuristics to make decisions. These are mental shortcuts we use in order to navigate the world around us. Psychologist Daniel Kahneman popularised the idea that we have two levels of thinking: System 1 and System 2. System 1 refers to the fast, automatic thinking that occurs unconsciously, whereas System 2 refers to the slower, more deliberate and conscious type of thinking. Our brains prefer to rely on System 1 whenever possible to lighten our cognitive load.
The illusory truth effect is thought to be related to ease of processing, also called processing fluency. Repeated information becomes easier to process and therefore feels like it must be true. The effect is so strong that false information ends up overriding true information if it’s repeated to us enough times.
We can overcome the illusory truth effect by being aware of it and fact-checking new information as it is presented to us.
Thank you for reading, see you next time!
Sources:
Exposure and affect: A field experiment. (Zajonc & Rajecki, 1969).
Frequency and the Conference of Referential Validity. (Hasher, Goldstein & Toppino, 1977).
Why do we believe misinformation more easily when it’s repeated many times? The Decision Lab.
Cognitive biases are systematic errors in thinking that we all make and that affect our judgement and behaviours. We humans like to believe we’re rational thinkers, but our brains aren’t quite wired that way. Cognitive biases are just one of the ways in which our thinking is flawed, so it’s good to be aware of them.