
Why our brains fall for false expertise, and how to stop it

Once we are aware of the shortcuts our minds take when deciding who to listen to, we can take steps to block them.

At the beginning of every meeting, a question hangs in the air: Who will be heard? The answer has huge implications not only for decision making, but for the levels of diversity and inclusion throughout the organization. Being heard is a matter of whose ideas get included — and who, therefore, reaps the accompanying career benefits — and whose ideas get left behind.

Yet instead of relying on subject matter experts, people often pay closest attention to the person who talks most frequently, or has the most impressive title, or comes from the CEO’s hometown. And that’s because of how our brains are built.

Rather than aligning with actual competence, the group decision-making process habitually falls for what University of Utah management professor Bryan Bonner calls messy proxies of expertise. Essentially, when our brains are left to their own devices, attention is drawn to shortcuts, such as focusing on the loudest or tallest person in the room. Over time, letting false expertise run the show carries real costs.

“The expert isn’t heard, and then the expert leaves,” Bonner said in an interview with the NeuroLeadership Institute, where I head the diversity and inclusion practice. “They want to realize their potential. [If] people can’t shine when they should be shining, there’s a huge human cost.”

If the people who offer the most valuable contributions to your organization aren’t appropriately recognized for it, they won’t stay long. Or, possibly worse, they will stay and stop trying. As my mother was fond of reminding me when I got my first management role: “When people can’t contribute, they either quit and leave or they quit and stay.”

One of the most important assets a group can have is the expertise of its members. But research indicates that even when everyone in a group recognizes who the subject matter expert is, members defer to that expert just 62 percent of the time; the rest of the time, they listen to the most extroverted person. Another experiment found that “airtime” — the amount of time people spend talking — is a stronger predictor of perceived influence than actual expertise. Our brains also form subtle preferences for people we have met over those we haven’t, and assume that people who are good at one thing are also good at other, unrelated things. These biases inevitably end up excluding people and their ideas.

In recruiting, management scholars have found that without systematic evaluation, hiring managers favor and advocate for candidates who remind them of themselves. The same dynamic plays out in meetings, where these messy proxies can undermine diversity goals because they disadvantage particular groups: Height gives an edge to men and to people from nations whose populations tend to be taller, and loudness disadvantages introverts and people from cultural backgrounds that prize soft-spokenness. This applies to both psychological and demographic diversity.

People are not naturally skilled at figuring out who they should be listening to. But by combining organizational and social psychology with neuroscience, we can get a clearer picture of why we’re so habitually and mistakenly deferential, and then understand how we can work to prevent that from happening.

How Proxies Play Out in the Brain

The brain uses shortcuts to manage the vast amounts of information that it processes every minute in any given social situation. These shortcuts allow our nonconscious brain to deal with sorting the large volume of data while freeing up capacity in our conscious brain for dealing with whatever cognitive decision making is at hand. This process serves us well in many circumstances, such as having the reflex to, say, duck when someone throws a bottle at our head. But it can be harmful in other circumstances, such as when shortcuts lead us to fall for false expertise.

At a cognitive level, the biases that lead us to believe false expertise are similarity (“People like me are better than people who aren’t like me”); experience (“My perceptions of the world must be accurate”); and expedience (“If it feels right, it must be true”). These shortcuts cause us to evaluate people on the basis of proxies — characteristics such as height, extroversion, and gender that don’t actually signal expertise, rather than more meaningful ones.

Although we humans may have biased brains, we also have the capacity to nudge ourselves toward more rational thinking.

The behavioral account of this pattern was first captured by breakthrough research from Daniel Kahneman and the late Amos Tversky, work that eventually earned Kahneman the Nobel Prize in Economic Sciences and informed his bestseller Thinking, Fast and Slow. Their distinction between so-called System 1 thinking, a “hot” form of cognition involving instinct, quick reactions, and automatic responses, and System 2 thinking, “cool” cognition built on careful reflection and analysis, is central here. System 1 thinking can be seen as a sort of autopilot. It’s helpful in situations involving obvious, straightforward decisions — such as the ducking-the-bottle example. But in more complicated decision-making contexts, it can cause more harm than good — for instance, by letting the person with the highest rank in the meeting decide the best way forward, rather than the person with the best idea.

Taking Steps to Combat Your Own Decision-Making Bias

Given the extent to which Western business culture puts a premium on individualism and fast decision making, it’s understandable that so many people have been trained to go their own way as quickly and confidently as possible. The good news is that with the right systems in place, people can be trained to approach problem solving in a different, less bias-ridden way.

Although we cannot block a biased assumption of which we are unaware, we can consciously make an effort to direct our attention to the specific information we need to evaluate, and to weigh it consciously. Just about any sort of decision can get hijacked by mental shortcuts, so it’s useful to have a few tools to nudge yourself and others toward more reflective, rigorous, and objective thinking.

Set up “if-then” plans. To guide attention back from these proxies of expertise, you can formulate “if-then” plans, which help the anterior cingulate cortex — a brain region that allows us to detect errors and flag conflicting information — find differences between our actual behavior and our preferred behavior. By incorporating this type of bias-mitigation plan before we enter into a situation where we know a decision will be made, we increase our chances of making optimal decisions.

For example, you can say to yourself: “If I catch myself agreeing with everything a dominant, charismatic person is saying in a meeting, then I will privately ask a third person (not the presenter or the loudest person) to repeat the information, shortly after the meeting, to see if I still agree.”

Get explicit, and get it in writing. One fairly easy intervention is to have employees get in the habit of laying out, in writing, the precise steps that led to a given decision. You can also write out the process behind your own decisions.

For example, narratives in the form of “We observed X, which led us to conclude Y, which is why we’re going with strategy Z” bring transparency and clarity to the decision-making process and serve as a record that can be referenced later to evaluate which aspects of the process worked and which didn’t.

Incentivize awareness. Along those same lines, managers should reward employees who detect flaws in their thinking and correct course. At the NeuroLeadership Institute, we have a “mistake of the month” section in our monthly work-in-progress meetings to help model and celebrate this kind of admission.

To use a sports example, New England Patriots quarterback Tom Brady reportedly pays his defense if they can intercept his passes in practice. (It must help. He’s one of two players in NFL history to win five Super Bowls.) The takeaway: By making error detection a team sport, you destigmatize the situation, highlight the learning opportunities, and increase the likelihood of making better decisions in the future.

Set up buffers. Taking your decision making from “hot” to “cool” often requires a conscious commitment to create a buffer between when you receive information and when you make a decision on how to move forward.

For example, before a big decision is officially made, everyone involved should be encouraged to spend 10 minutes relaxing or going for a walk before reconvening one last time to discuss any potential issues that haven’t yet come up. This is a way of “cooling off” and making sure things have been thought through calmly. Another way to accomplish this is to engage in a “pre-mortem” — imagining a given decision went poorly and then working backward to try to understand why. Doing so can help identify biases that might otherwise go undetected.

Cut the cues. The most common and best-researched approach involves giving decision makers access to fewer of the cues that can trigger expedience biases. Blind selection is a classic example. In the 1970s and 1980s, top orchestras instituted blind auditions in which the identity of applicants was concealed from the hiring committee, often by literally hiding the player behind a screen while he or she performed. As a result, the share of female musicians in the top five U.S. symphony orchestras rose from 5 percent in 1970 to more than 25 percent in 1996.

Bonner, the Utah management professor, says to “take the humanity out” when you can. “Set up situations where people exchange information with as little noise as possible,” he says. If you’re brainstorming, have everyone write down their ideas on index cards or in shared documents, then review the ideas anonymously — that way, the strength of the idea, rather than the status of its source, carries the most weight.

Technology can help here, too. For example, the “merit-based matching” app Blendoor strips an applicant’s name, gender, and photo from a recruiter’s view. Talent Sonar uses predictive analytics to shape job listings that appeal to both male and female candidates and performs blind resume reviews, which the company says leads to a 30 percent larger hiring pool.

Biases are human — a function of our brains — and falling for them doesn’t make us malicious. We have the capacity to nudge ourselves toward more rational thinking, to identify and correct the errors we make as a result of bias, and to build institutions that promote good, clear thinking and decision making. With the right systems, tools, and awareness in place, we can better cultivate the best ideas from the most well-suited minds. It just takes a bit of effort, and in the long run pays off in big ways. The best ideas get a chance to be heard — and implemented — and your best thinkers are recognized and keep on thinking.

Author profile:

  • Khalil Smith heads the diversity and inclusion practice at the NeuroLeadership Institute. He has 20-plus years of experience in leadership, strategy, and HR, including more than 14 years at Apple Inc.