Facts are useful, but not enough to actually fix the issue

Saturday, March 26th, 2022

David Epstein talks to Lisa Fazio, a Vanderbilt psychologist who studies misinformation, about the illusory truth effect:

This is a term we use for the finding that when you hear something multiple times, you’re more likely to believe that it’s true. So, for example, in studies, say that you know that the short, pleated skirt that men wear in Scotland is called a “kilt,” but then you see something that says it’s a “sari.” You’re likely to think that’s definitely false. If you see it twice, most people still think it’s false, but they give it a slightly higher likelihood of being true. The illusory truth effect is simply that repetition of these statements leads to familiarity and also to this feeling of truth.


We’ve done studies where we get people to pause and tell us how they know that the statement is true or false. And when people do that, they seem to be less likely to rely on repetition.


We’ve seen the illusory truth effect from five-year-olds to Vanderbilt undergrads, and other adults. I think one of the big takeaways from all of the research we’ve done on misinformation is that we all like to believe that this is something that only happens to other people. But, in reality, just given the way our brains work, we’re all vulnerable to these effects.


Facts are useful, but not enough to actually fix the issue. You have to address the false information directly. So in a truth sandwich, you start with true information, then discuss the false information and why it’s wrong — and who might have motivation for spreading it — and come back to the true information. It’s especially useful when people are deliberately misinforming the public.


People have already created this causal story in their mind of how something happened. So in a lot of the experiments, there’s a story about how a warehouse fire happens. And initially people are provided with some evidence that it was arson — there were gas cans found on the scene of the crime. And then in one case you just tell people, “Oh, oops, sorry, that was wrong. There were no gas cans found there.” Versus in another you give them an alternative story to replace it — that there weren’t any gas cans at all; instead, it turns out that there was a faulty electrical switch that caused the fire. If you only tell people the gas cans weren’t there, they still think it’s arson. They just are like, “Oh, yeah. The gas cans weren’t there, but it was still arson, of course.” Whereas in the second story, they’ll actually revise the story they had in mind and now remember it was actually accidental.


Yeah, and with false information you can make it really engaging, really catchy, really easy to believe. And the truth is often complicated and nuanced and much more complex. So it can be really hard to come up with easy ways of describing complicated information in a way that makes it as easy to believe as the false information.


  1. Harry Jones says:

    Illusory truth is conventional wisdom is peer pressure acting upon cognition.

    It’s how NPCs are programmed.

  2. Bruce Purcell says:

    Establishment media should have a seal at the bottom of the screen to show it’s free of misinformation, like ‘Truth’, only in a foreign language to be classy: PRAVDA.

  3. Felix says:

    Oddly enough, we learn what we sense. And what we sense can be wrong. Horrors!

    Wait. Did I write, “we”? Nah. Others sense wrongly. Not me.

    Anyway, the solution is simple. Don’t sense anything wrongly. And, if that seems hard, simply don’t sense anything.
    You can use the technique advised by this Vanderbilt psychic. Start with knowing the truth. Then wrap it around each falsehood to protect yourself from ever being wrong.

  4. Wang Wei Lin says:

    Beware the leaven of the Pharisees. The techniques for discerning truth are not new unless you’re a Vanderbilt psychologist.

  5. Bomag says:

    Been looking for an excuse to use this link.
