Persistence of Myths Could Alter Public Policy Approach

Sunday, December 30th, 2007

Persistence of Myths Could Alter Public Policy Approach — because people forget silly details like which points are true and which are false:

The federal Centers for Disease Control and Prevention recently issued a flier to combat myths about the flu vaccine. It recited various commonly held views and labeled them either “true” or “false.” Among those identified as false were statements such as “The side effects are worse than the flu” and “Only older people need flu vaccine.”

When University of Michigan social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that within 30 minutes, older people misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.

Younger people did better at first, but three days later they made as many errors as older people did after 30 minutes. Most troubling was that people of all ages now felt that the source of their false beliefs was the respected CDC.

The psychological insights yielded by the research, which has been confirmed in a number of peer-reviewed laboratory experiments, have broad implications for public policy. The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.

People, of course, believe all kinds of crazy things — that Saddam was behind the 9/11 attacks, that it was an inside job, etc. — and that has been a big concern for years:

As early as 1945, psychologists Floyd Allport and Milton Lepkin found that the more often people heard false wartime rumors, the more likely they were to believe them.

The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious “rules of thumb” that can bias it into thinking that false information is true. Clever manipulators can take advantage of this tendency.

The experiments also highlight the difference between asking people whether they still believe a falsehood immediately after giving them the correct information, and asking them a few days later. Long-term memories matter most in public health campaigns or political ones, and they are the most susceptible to the bias of thinking that well-recalled false information is true.

You have to wonder how humanity got this far:

Furthermore, a new experiment by Kimberlee Weaver at Virginia Polytechnic Institute and others shows that hearing the same thing over and over again from one source can have the same effect as hearing that thing from many different people — the brain gets tricked into thinking it has heard a piece of information from multiple, independent sources, even when it has not. Weaver’s study was published this year in the Journal of Personality and Social Psychology.

The experiments by Weaver, Schwarz and others illustrate another basic property of the mind — it is not good at remembering when and where a person first learned something. People are not good at keeping track of which information came from credible sources and which came from less trustworthy ones, or even remembering that some information came from the same untrustworthy source over and over again. Even if a person recognizes which sources are credible and which are not, repeated assertions and denials can have the effect of making the information more accessible in memory and thereby making it feel true, said Schwarz.

Experiments by Ruth Mayo, a cognitive social psychologist at Hebrew University in Jerusalem, also found that for a substantial chunk of people, the “negation tag” of a denial falls off with time. Mayo’s findings were published in the Journal of Experimental Social Psychology in 2004.

“If someone says, ‘I did not harass her,’ I associate the idea of harassment with this person,” said Mayo, explaining why people who are accused of something but are later proved innocent find their reputations remain tarnished. “Even if he is innocent, this is what is activated when I hear this person’s name again.

“If you think 9/11 and Iraq, this is your association, this is what comes in your mind,” she added. “Even if you say it is not true, you will eventually have this connection with Saddam Hussein and 9/11.”

Mayo found that rather than deny a false claim, it is better to make a completely new assertion that makes no reference to the original myth. Rather than say, as Sen. Mary Landrieu (D-La.) recently did during a marathon congressional debate, that “Saddam Hussein did not attack the United States; Osama bin Laden did,” Mayo said it would be better to say something like, “Osama bin Laden was the only person responsible for the Sept. 11 attacks” — and not mention Hussein at all.

The psychologist acknowledged that such a statement might not be entirely accurate — issuing a denial or keeping silent is sometimes the only real option.

So is silence the best way to deal with myths? Unfortunately, the answer to that question also seems to be no.

Another recent study found that when accusations or assertions are met with silence, they are more likely to feel true, said Peter Kim, an organizational psychologist at the University of Southern California. He published his study in the Journal of Applied Psychology.

Myth-busters, in other words, have the odds against them.

The rumor research I was already aware of is the basic law of rumor, which dates back to WWII:

Allport and Postman called their most far-reaching assertion “the basic law of rumor.” It declared that rumor strength (R) will vary with the importance of the subject to the individual concerned (i) times the ambiguity of the evidence pertaining to the topic at hand (a), or R ≈ i × a.
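The multiplicative form is the interesting part: if either factor is zero, rumor strength is zero. A minimal sketch of the relationship, using hypothetical 0–10 scales (Allport and Postman stated only proportionality, not any particular units):

```python
def rumor_strength(importance, ambiguity):
    """Basic law of rumor: R ≈ i × a.

    `importance` and `ambiguity` are hypothetical 0-10 ratings;
    the original law specifies only that R varies with their product.
    """
    return importance * ambiguity

# A topic that matters greatly but has unambiguous evidence
# generates no rumor, and vice versa.
print(rumor_strength(10, 0))  # important but fully documented -> 0
print(rumor_strength(0, 10))  # murky but nobody cares -> 0
print(rumor_strength(8, 7))   # important and murky -> strong rumor
```

The multiplication, rather than addition, captures the claim that both conditions must be present at once for a rumor to thrive.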
