As a new review of past research concludes, “mud” sticks — and, worse, attempts to correct erroneous beliefs can backfire, reinforcing the very misrepresentations they aim to erase. The main problem, the research reveals, is that rejecting information takes more mental effort than accepting it, and given our social environment, most people’s default position is to believe what others tell them. “For better or worse,” the authors write, “the acceptance of information as true is favored by tacit norms of everyday conversational conduct.” Indeed, some research even suggests that in order for the brain to process incoming information at all, it must initially assume that the information is correct.

In multiple studies included in the new review, researchers presented people with a fictitious news report about a warehouse fire initially attributed to the negligent storage of gas cylinders and oil paints. Participants were then given an explicit retraction of the claim about the fire’s cause, yet even after reading the correction, only about half of them reported that the initial account was wrong. The finding suggests that the original false belief may persist about half the time, despite a correction.

The problem becomes even more extreme when political beliefs are involved. People have a tendency to believe information that supports their existing worldview and to reject data that threatens it.

So, what can be done to solve this problem?

Presenting the correct information as part of a coherent story seems to help, because it fills the explanatory gap left by simply negating a statement — for instance, in the warehouse-fire scenario, offering an alternative cause of the fire rather than merely saying the original explanation was wrong.

In cases involving political misinformation, providing new data that is congruent with someone’s preexisting beliefs also helps […]

Similarly, giving correct information while making people feel good about themselves through self-affirmation also helps them cope with new information that would otherwise threaten their identity […]

It also helps to simply spend more time debunking myths in detail: unlike brief debunking, this doesn’t backfire. A study found that a psychology course aimed at correcting misconceptions about the field was more successful when it directly refuted myths in depth than when it simply presented accurate information without addressing common untruths.

Finally, three more tactics can help keep misinformation from sticking: presenting correct information coherently and as simply as accuracy allows, strengthening the message through repetition, and, where possible, warning people in advance that misinformation is about to be presented. Still, it may be hard to maintain constant vigilance over one’s data diet.

The authors conclude: “Widespread awareness of the fact that people may ‘throw mud’ because they know it will ‘stick’ is an important aspect of developing a healthy sense of public skepticism that will contribute to a well-informed populace.”