5 Ways to Protect Yourself From Misinformation
Alex Edmans shares 5 key insights from May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases―And What We Can Do About It.
Do you fall for misinformation? You do, whether you realize it or not, because the human brain is intrinsically prone to being deceived. To protect yourself, check out these 5 big ideas from May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases―And What We Can Do About It by Alex Edmans. Alex is a Professor of Finance at London Business School and has won 25 teaching awards there and at Wharton, where he taught previously. He also serves as a non-executive director of the Investor Forum, on the World Economic Forum’s Global Future Council on Responsible Investing, and on Royal London Asset Management’s Responsible Investment Advisory Committee. Here he is to share 5 big ideas from his book.
1. Take responsibility.
We often think that combating misinformation is someone else’s problem. Governments should regulate and prosecute misinformation, publishers should vet more thoroughly, and the scientific professions should kick out offending members. But this is unrealistic and ineffective. It’s unrealistic because misinformation can be produced far faster than authorities can correct it. It’s ineffective because some of the most pervasive forms of misinformation are subtle and can’t be prosecuted. Even if the facts people give are 100 percent accurate, they can draw misleading inferences from them, like over-extrapolating from a single example.
Instead, the solution is to take responsibility for ourselves and be on our guard. The main thing to guard against is confirmation bias. That’s the temptation to accept a claim uncritically because we want it to be true and to dismiss something out of hand because we don’t want to believe it. How often do we repost or like an article without actually reading it, just because we like the headline? Or perhaps we do read it, but we believe all the claims without truly scrutinizing the evidence behind them. On the rare occasions that we read something we don’t like the sound of, we do so with the mindset of trying to pick it apart. We need to apply the same skepticism to something we like as to something we don’t.
2. Check the evidence.
What does it mean to apply skepticism? If we see a claim we dislike, we’d ask for the evidence behind it. We need to do exactly the same for a claim that we like. Note that it’s “check the evidence,” not “check the facts”—we often think that combating misinformation is simply about checking the facts. If people spread misinformation that Barack Obama isn’t a natural-born citizen of the U.S., you can check his birth certificate. The excellent book Factfulness showed how we can be much more optimistic about the world if we check simple facts like how many countries have an average daily income below $2. But many claims aren’t simple statements that can be proven or disproven by fact-checking.
Take the famous phrase, “Culture eats strategy for breakfast.” That’s taken as gospel because it’s a quote by Peter Drucker, a highly respected management guru. But even if it’s a fact that Drucker said it, that’s not enough. “Peter Drucker said” is not evidence. Did Drucker actually conduct a study, taking one set of companies with a strong culture and weak strategy and another with a strong strategy and weak culture, and show that the first beat the second? The answer is no, so we should doubt the phrase, no matter who said it or how much we want it to be true.
“Many claims aren’t simple statements that can be proven or disproven by fact-checking.”
Or take the famous “two-minute rule” introduced by David Allen’s time management book Getting Things Done. It says you should do any task immediately if it takes two minutes or less. But there’s no evidence behind the rule; he just made it up. As a business magazine noted, “Fortunately for Allen, he didn’t need empirical evidence: People felt better after taking his seminars.” People started practicing the rule, and the rule makes you feel good—you get a dopamine hit after you complete a task—so confirmation bias means you want it to be true. But focusing on quick wins and low-hanging fruit comes at the expense of deep work and reduces your productivity.
Regularly ask yourself: “What is the evidence behind that claim?”
3. Get the whole truth.
In a trial, the witness swears to tell not only the truth but also the whole truth. But many of our beliefs are based on half-truths because people are selective about what they reveal.
Take Simon Sinek’s claim that starting with why leads to success. He points out how Apple, Wikipedia, and the Wright Brothers all started with why and all ended up successful. Their success is a true fact, but once again, the facts aren’t enough. Apple, Wikipedia, and the Wright Brothers are all cherry-picked examples; Sinek isn’t telling the whole truth. There could be hundreds of other companies that started with why and failed. There could also be hundreds of companies that succeeded even though they didn’t start with why. But those examples will never be in Sinek’s books or talks because they don’t fit his story.
“There could be hundreds of other companies that started with why and failed.”
The best type of evidence is a medical trial. You have one set of patients who are given the drug, and you see how many of them get better and how many don’t. Importantly, you also have a control group that’s given a placebo, and you see how many of them get better and how many don’t. The trial will include people who are given the drug but don’t get better, and people who get better even though they received the placebo. Both are counterexamples. If the evidence still shows that the drug makes people better even though it includes these counterexamples, it’s powerful. If there are no counterexamples, this is a tell-tale sign that the data is cherry-picked, and so the conclusions are meaningless.
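To make this concrete, here is a minimal sketch in Python; the numbers are invented purely for illustration and aren’t taken from the book or any real trial.

```python
# Invented, illustrative numbers only -- not from any real trial.
drug_improved, drug_total = 70, 100        # 30 counterexamples: took the drug, didn't improve
placebo_improved, placebo_total = 40, 100  # 40 counterexamples: improved on the placebo alone

drug_rate = drug_improved / drug_total           # 0.70
placebo_rate = placebo_improved / placebo_total  # 0.40

# The evidence is powerful precisely because counterexamples are included:
# despite them, improvement is clearly more common with the drug.
print(f"Drug: {drug_rate:.0%} improved; placebo: {placebo_rate:.0%} improved")
```

Notice that a dataset showing 100 percent improvement on the drug and 0 percent on the placebo would be less believable, not more: real evidence almost always contains counterexamples.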
Ask: “Has the person making the claim considered any counterexamples?”
4. Examine the alternative suspects.
Sticking with the trial analogy, evidence is only evidence if it points to one culprit but not others. If it suggests that multiple suspects could have committed the crime, it’s meaningless. The same is true outside of the courtroom.
Take the finding that companies that care more about wider society also make more money. That’s there in the data; it’s a true fact. But how do we interpret that fact? We conclude that having a social conscience causes the company to be more profitable. Our confirmation bias wants this to be true—we want to live in a world with good karma, where the good guys always win.
“Once a company is doing well, it can afford to care about wider society.”
But there are alternative suspects. Perhaps causation runs in the other direction. Once a company is doing well, it can afford to care about wider society. Or maybe a third factor causes both—a great CEO makes her company successful, and a great CEO also cares about wider society. In the cold light of day, most people know that correlation is not causation. But when confirmation bias is at play, we don’t think with a clear head. We interpret the evidence how we like, jump to the conclusion we like, and ignore the alternative suspects. This would be problematic in a criminal investigation and is equally problematic for any other type of evidence.
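As a rough illustration of the third-factor suspect, here is a small Python simulation; the setup and numbers are assumptions invented for this sketch, not data from any study.

```python
import random

random.seed(0)

# Simulate 1,000 hypothetical companies. The hidden "third factor" is CEO
# quality: a great CEO independently drives BOTH profits and social conscience.
n = 1000
ceo_quality = [random.gauss(0, 1) for _ in range(n)]
profit = [q + random.gauss(0, 1) for q in ceo_quality]
social = [q + random.gauss(0, 1) for q in ceo_quality]

def corr(xs, ys):
    """Pearson correlation, written out to keep the sketch dependency-free."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Social conscience never causes profit in this simulation, yet the two are
# strongly correlated because the confounder drives both.
print(f"corr(social conscience, profit) = {corr(social, profit):.2f}")
```

The correlation it prints is real, yet concluding causation from it would convict the wrong suspect.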
Ask yourself: “What are the alternative explanations for the same data?”
5. Encourage dissent.
Making better decisions is far more than just interpreting data and evidence correctly because our sources of information are far more than just data and evidence. One big source is our colleagues’ opinions on why our company’s strategy might backfire or why we shouldn’t hire a particular person.
We’ve explained how we should respond to dissenting views once we receive them: to tame our confirmation bias and to take them as seriously as views we agree with. But a bigger problem is that we sometimes never hear dissenting views to begin with. Many organizations have a big problem with groupthink: people will only say what they think the boss wants to hear, and they’re afraid of rocking the boat. A great leader is not one who gets others to follow but one who gets others to speak up and tell the leader when she’s going off course.
When he ran General Motors, Alfred Sloan once closed a meeting by asking, “I take it we are all in complete agreement on the decision here?” Everyone nodded. Sloan continued, “Then, I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what this decision is about.” He believed that none of his ideas were 100 percent perfect, so if no one raised any concerns, it wasn’t because there weren’t any, but because he hadn’t yet given his colleagues time to think of them. If concerns are raised at the next meeting but the consensus is still to go ahead, a great leader will speak to the dissenters afterward and say, “Even though we still went ahead, I appreciate you speaking up, and we’ll bear your concerns in mind as we execute the strategy.” By taking active steps to encourage dissent, a leader can harness the full collective wisdom of her organization.