Is ‘Fake News’ Creeping Into Your Belief System?


If you’re like most people, you make thousands of decisions each day.

Sure, most of those decisions are unconscious and inconsequential—like whether you put on your left shoe or your right shoe first.

Of course, many decisions do have varying degrees of consequence. Among all the products and services on the market, which ones will you buy? Which relationships receive your focused attention? Of all the priorities in your life, where do you invest the most time and energy? And when it comes to your core philosophy—your views on faith and on social and political issues—how do you navigate the sometimes-bewildering maze of conflicting perspectives?

In this voting season, we’re reminded that such navigation is often complicated by political candidates and pontificators who insist on playing fast and loose with the facts.

That conundrum is the subject of a thoughtful piece in the MIT Sloan Management Review, titled “The Cognitive Shortcut That Clouds Decision-Making.” It’s based on research by an international team of academics who specialize in how people reach conclusions that have varying degrees of influence on how they live—and enjoy—their lives.

One of those researchers is Dr. Nadia M. Brashier, a Purdue University professor who focuses on ways people come to believe things that are not true, from fake news to common superstitions.

Rodger Dean Duncan: What is it about repeated misinformation that enables it to come across as true—even months later?

Nadia Brashier: As marketers seem to realize, information feels easier to process each time we encounter it. We misinterpret that feeling of ease as evidence of truth. Just a single exposure can be powerful, but perceived accuracy continues to increase with additional exposures. This is a powerful illusion that occurs among smart people, despite conflicting advice from trustworthy sources, and even when we “know better.” Repeating a claim like “China has the largest economy in the world” several times could persuade you, even though you know that the U.S., not China, has the largest GDP.

In one recent study, I found that even paying participants for correct answers doesn’t stop them from relying on feelings of ease.

Duncan: In explaining how the mind takes shortcuts, you use the term “processing fluency.” Tell us how that works.

Brashier: Our brains are sensitive to how much difficulty we experience while processing information. Repetition creates a feeling of ease that inflates all kinds of judgments. In what is known as the false fame effect, seeing non-famous names just once leads people to mistakenly assume that those names belong to celebrities.

As another example, encountering a product over and over again makes it fluent, and people like it more as a result. Similarly, any strategy that makes a claim easy to digest, whether that be repeating it or presenting it in a high-contrast font, also makes it seem truer.

Duncan: We live at a time when one person’s “fake news” is another person’s infallible truth. What is it that causes (or enables) people to cling so tenaciously to their views, even in the face of contradictory data?

Brashier: People sometimes rely on misinformation even after receiving explicit debunking messages. This continued influence effect can occur because people initially refuse to accept the new information. But another key problem is that, even if we accept a correction, our brains don’t “overwrite” false beliefs. We concurrently store both the original myth and its correction, and the latter often fades from memory. So it might help initially to give people a detailed explanation of why antibiotics don’t cure COVID—but just a few weeks later, those same people might ask their doctors for a prescription.

Duncan: You say that overcoming the bias blind spot is a good strategy for avoiding the “illusory truth effect.” Please give us an example of that.

Brashier: The bias blind spot refers to the fact that we detect others’ biases more readily than our own. For example, people might assume that coworkers work hard for external incentives like money, while claiming that they personally are motivated by internal incentives like pride in a job well done. Nearly everyone believes that they are “less biased than the average person.” This can have negative consequences in the workplace, as people with a high bias blind spot take useful advice less often.

Duncan: An “accuracy mindset,” you say, can help short-circuit the illusory truth effect. How does that work?

Brashier: People don’t always spontaneously consider whether incoming information is true or false. They might focus instead on how interesting or entertaining a claim is. Fortunately, we can shift people’s attention toward accuracy. Asking people to “behave like fact-checkers” when they first encounter a falsehood protects them from illusory truth later. They draw on their own knowledge instead of relying on the feelings of ease they experience when reading repeated misinformation.

Duncan: Some people live in an echo chamber in which they’re exposed only to viewpoints they already embrace. What can leaders do to foster a good cross-pollination of perspectives among team members?

Brashier: Diverse teams generate more original and useful ideas, but those brainstorming gains do not always translate into implementation.

We each assume that our own beliefs and attitudes are typical and can perceive consensus where none exists. Hearing one coworker repeatedly suggest a competitive advertising campaign, for example, can create the false impression that everyone agrees on that strategy.

In addition, exposure to opposing viewpoints can unfortunately increase polarization by emphasizing the differences between groups. Strategies to minimize these biases include ensuring that teammates have equal opportunities to speak and asking employees to take each other’s perspectives.

Duncan: Is illusory truth always a bad thing?

Brashier: We learn from experience that there’s a correlation between repetition and truth. On average, we encounter the true version of a statement more often than any one of its many possible falsifications. For example, you are more likely to hear the correct claim that “Jeff Bezos is the founder of Amazon” than the false claims that Elon Musk, Bill Gates, or Mark Zuckerberg founded the company.

Relying on fluency leads us astray sometimes, but it’s generally an adaptive rule of thumb. Moreover, we can leverage this cognitive shortcut to make accurate information feel fluent—repeating verified facts about coronavirus, for example, increases belief in them.
