Alternative Facts and Unconscious Bias: How We Are Less Rational Than We Think

This is really not the time for alternative facts. Serious global challenges—climate change, war and refugees, growing authoritarianism—require thoughtful responses based on data and evidence. But truth seems to be malleable, and alternative facts compete with actual facts.

And people with conflicting facts find it hard to speak to one another. Instead, we retreat to our own bubbles, making collaborative solutions all but impossible. How and why did we get here, and what can we do to correct course?


Why Our Decisions Are Based Less on Rational Calculation Than We Think

First, the facts sometimes conflict with our values. And in the battle between facts and values, values often win.

Second, people operate through unconscious bias. No matter whether we’re Republican or Democrat, a Bernie, Hillary, or Trump supporter, unconscious processes push us to react to situations in a near automatic fashion based on gut feelings. As political psychologists Milton Lodge and Charles Taber have written in The Rationalizing Voter, “feelings enter the decision stream before any cognitive considerations come to mind.”

Most of the time we don’t even realize we’re acting in this way—that our own thinking can be so prone to biased decisions. Because of this, we are easily, incessantly, and unknowingly manipulated.

There are many biases that impede rational decision making. Attribute substitution is one example: we unconsciously swap a difficult question for an easier one and then answer the easier question. Imagine someone asked you, “How well would Tim Kaine or Mike Pence do as vice president?” That’s a tough question. You have to know what the VP job entails and what kinds of experience are useful for the job. Then you have to know something about the candidates. Rather than answering those questions, you’re more likely to substitute an easier one, such as “How much do I like Mike Pence or Tim Kaine?” And if you have a good feeling about the candidate, you’ll probably say, “Sure, he’d make a great VP.”

Once you have arrived at a firm assessment, new information may not change your mind. Political scientist David Redlawsk and colleagues conducted experiments to see how negative information affects a person’s attitude toward their candidate. They found that “voters may become even more positive about a candidate they like after learning something negative about that candidate.”

Why Are We So Resistant to Changing Our Minds?

Studies have demonstrated that the brain remains loyal to an idea once it has embraced that idea. Fascinating research by University of Colorado graduate student Scott Schafer finds that the placebo effect for pain relief still works even after patients are told the treatment is fake. The key here is that the placebo has provided relief several times before the patient is told the truth. According to Schafer, “…experiences make the brain learn to respond to the treatment as a real event. After the learning has occurred, your brain can still respond to the placebo even if you no longer believe in it.”

Psychology professors Stephan Lewandowsky and Ullrich Ecker and their colleagues argue that people are more likely to accept information that feels right. And information feels right when it supports the values they already hold. But what happens when you hear something that feels right but is later retracted? Lewandowsky and Ecker write that you may not believe the new information even if it is backed up by irrefutable evidence. Take the case of Iraq. Colin Powell, then US Secretary of State, made a compelling case that Iraq harbored weapons of mass destruction. As we know, this turned out to be false. But even after a bipartisan commission reported that no WMD were found, 20 to 30 percent of Americans continued to believe they existed.

Lewandowsky and Ecker give a few reasons for this. First, the more a message is repeated, the more likely it is to stick. Anything told repeatedly has a staying power that pushes us away from conscious deliberation. In a classic 1945 study on rumors, the authors found that “the strongest predictor of belief in wartime rumors was simple repetition.” Acknowledgment of this fact seems to be behind news outlets’ growing assertiveness in immediately pushing back against what they see as alternative facts. A January 2017 New York Times headline, for example, read “White House Pushes ‘Alternative Facts.’ Here Are the Real Ones.”

Misinformation can also work through our group identities. It is particularly powerful when a person feels, “this is what my people believe.” This is called the “false consensus effect.” The problem is we often misjudge what “our people” believe. In an extreme example, Australians who held strongly negative attitudes toward Aboriginals thought 79 percent of all non-Aboriginal Australians felt the same way. The number, however, was less than 2 percent.

And, lest you think biases are stronger in people with less education, it’s actually the reverse: the more educated you are, the less likely you are to change your opinion in the face of facts. The push against vaccines for children emerged among the well-educated left. The movement persists even after the study linking vaccination with autism was thoroughly debunked as an “elaborate fraud.” The study was so shoddy that Britain’s General Medical Council stripped the author of his medical license.

Perhaps the most troubling bias was uncovered by psychology professor and Nobel Prize winner Daniel Kahneman and his colleague Jonathan Renshon. In a Foreign Policy article, “Why Hawks Win,” they argue that policymakers more often follow the advice of hawkish advisors. Dovish or moderate advisors who call for negotiation are seen as less credible. They write, “a bias in favor of hawkish beliefs and preferences is built into the fabric of the human mind.” Hawkish bias can lead decision makers to incorrectly assess their adversaries’ motivations. We might call that “jumping to conclusions,” and it can start wars needlessly.

So What Can We Do?

To learn more about how to combat these biases, I emailed Justin Parkhurst, author of The Politics of Evidence, a new book on how governments can better use evidence. Here’s what he said:

“Being aware of our cognitive biases and knowing how to mitigate them is probably one of the most important things we can do… how to institutionalize that or make it common across society is a harder question to answer though for sure.”

Professors Taber and Lodge have also stressed the difficulty of overcoming biases: “Where, when, how, and for whom deliberative processes will successfully override automatic response is the critical, heretofore unanswered question that goes to the heart of all discussion about human rationality and the meaning of a responsible electorate.”

Our political system is based on checks and balances of power. Now that we understand our inherent reliance on gut feelings, we must also think harder about devising a system of checks on ourselves. More than just fact-checking, we must learn how to confront our own biases. This will be crucial in facing the challenges coming our way with people who may not share our worldview.

Social scientists like Lewandowsky and Ecker continue to uncover ways to combat biases. In my next post, I’ll explore some of these ideas and offer concrete suggestions.

In the meantime, please send any comments or questions to me at: [email protected].