
Why humans believe most people are telling the truth — even when we're told they're lying

We have a bias toward believing that what we hear is true, and we make judgments on that basis, even when the context suggests the information is likely false.

Why do people credit falsehoods? Why don’t they dismiss them? Here is a large part of the answer: Most of the time, we tend to believe other people. When they tell us things, we assume that they are telling the truth.

To be sure, we consider some people untrustworthy, perhaps because they have so proved themselves; perhaps because they belong to a group that we think we should distrust. But on average, we trust people even when we should not.

"Liars: Falsehoods and Free Speech in an Age of Deception" by Cass Sunstein
"Liars: Falsehoods and Free Speech in an Age of Deception" by Cass SunsteinOxford University Press

We pay too little attention to clear evidence that what is being said is false. We fail to discount for the circumstances.

For instance, what if I said: In recent months, scientists have found that climate change is unlikely to be a serious problem. On balance, most people will be unaffected by it. People in the United States and Europe are unlikely to be affected at all. To be sure, there will be some harmful effects elsewhere, including Rwanda and South Africa, but even there, those effects will be small. Remarkably, most of the world’s population will be better off, because the world will be warmer.

Actually, that is false; I made it up. But if you're like most people, that false statement might well linger in your memory, making you think, at least for a little while and in some part of your mind, that climate change isn't a serious problem. (Sorry.)

That is an example of a broader phenomenon called “truth bias”: People tend to think that what they hear is truthful, even if they have excellent reason not to believe what they hear. If people are provided with information that has clearly been discredited, they might nonetheless rely on that information in forming their judgments.


Similarly, people are more likely to misremember — as true — a statement that they have been explicitly told is false than to misremember — as false — a statement that they have been explicitly told is true.

It follows that if you are told that some public official is a liar and a crook, you might continue to believe that in some part of your mind, even if you know that she’s perfectly honest. (In 2016, the sustained attacks on Hillary Clinton worked for this reason, even when people were aware that they were lies.)

And if you are told that, if you’re under the age of 50, you really don’t need to worry about a pandemic, you might hold onto that belief at least in some part of your mind, even after you are informed that people under 50 can get really sick.

That problem goes by an unlovely name: “meta-cognitive myopia.”

The basic idea is that we are highly attuned to “primary information”: whether the weather report says that it is going to be cold today, whether a candidate for public office claims that he was a war hero, whether the local newspaper reports that a famous television star committed a drug offense. By contrast, we are far less attuned to “meta-information,” meaning information about whether primary information is accurate. If you are given a clear signal that the supposed weather report was a joke, or that a public official is distorting his record in order to attract votes, you won’t exactly ignore the signal. But if you’re like most people, you will give it less weight than you should.


Evolutionary explanations are often speculative, but there is a reasonable one for truth bias. In hunter-gatherer societies, survival often depends on how people react to the evidence of their own senses, or even to signals that they receive from others. If you see a tiger chasing you, you had better run. And if your friends and neighbors are running, it makes sense to run too. There is much less urgency to picking up on signals about whether those signals are reliable.

To be sure, meta-cognition can be valuable, but primary information is the most important. That is enough to produce truth bias.

For a powerful demonstration of truth bias, consider some work from a research team consisting of Oxford's Myrto Pantazi and Olivier Klein and Mikhail Kissine of the Free University of Brussels.

To simplify a complex story, Pantazi and colleagues gave a large number of participants information about two legal cases involving criminal defendants. Participants were explicitly told that some of the information bearing on the appropriate sentence was false. They were then asked to come up with an appropriate prison term and also to say how dangerous the defendant was.

The main question was whether people would adequately discount information that they were told was false, so that it would not influence their judgments.

The answer is that they did not. When people received negative information about the defendant, they were influenced by it, even when they had been explicitly informed that it was false. As the authors put it, "jurors may judge defendants based on evidence that they encounter, even if they clearly know this evidence to be false." Consistent with other research, the authors also found that their participants tended to misremember false evidence as true, and did so more often than they misremembered true evidence as false.

Pantazi's team undertook the same experiment with professional judges. Amazingly, they obtained the same basic results. Even if you are an experienced judge, false information about a criminal defendant might well affect your conclusions, and you might well remember it as true. Negative information in particular puts a kind of stamp on the human mind, and it is not easy to remove.


These experiments involve law, but the lesson is much larger.

Suppose that you are in a new town and you ask strangers for directions. You will probably assume that they are telling you the truth, rather than trying to get you lost. In fact, it might take a great deal to convince you that they were lying.

In most settings, most of us assume that other members of the human species are telling the truth; that is our default assumption. That is one reason, by the way, that advertising works — even if we should know better than to believe it.

As noted, there are times and places in which we do not indulge that assumption. We know that some people are liars. If we think that people care only about themselves and not about us, we are far less likely to believe them. We do not think that the statements of advertisers have the same credibility as the statements of our best friends.

But if we hear something online or in a newspaper, we often ignore or give too little weight to clear signals that it might be false.

Adapted from "Liars: Falsehoods and Free Speech in an Age of Deception" by Cass Sunstein. Copyright © 2021 and published by Oxford University Press. All rights reserved.