We are hearing a lot about how science and scientists got the lab leak theory wrong. This (mostly) partisan narrative, especially powerful among conservatives, is being used to discredit both science and science-informed public health officials. And it's both misinformed and shortsighted.
Contrary to what people like Sean Hannity, Tucker Carlson and Alex Jones may lead you to believe, the lab leak theory isn't an I-told-you-so movement. An admission of uncertainty isn't a condemnation of science or a validation of conspiracy theorists. In fact, it is how science works: ambiguity, the emergence of new evidence and the shifting of individual and collective perspectives are all part of the process.
Let's start with the reality. Despite both the recognition that a lab leak is theoretically possible and pleas from many world leaders for more transparency from China, it is still far more likely that the coronavirus came from an animal. At this point, we simply don't know the actual source. And we may never know. But more knowledge about the origins (and possible lab safety issues — or lack thereof) would be valuable to our efforts to address, or perhaps stop, future pandemics. That's why it's sensible to continue to investigate — not because anything has changed from an investigative perspective.
A past position that turns out to be wrong (and we don't know, yet, whether the animal source position is wrong) can still be the correct position to have adopted, given the evidence available at the time. And when a fringe position once deemed wrong turns out to be possibly true, that doesn't make its proponent an all-knowing soothsayer who should be trusted with future decisions. This type of thinking is what steers people toward conspiracy theories.
Let's say a renowned meteorologist uses all the available evidence — satellite imagery, barometric and temperature trends, computer modeling, years of training and experience — to inform a prediction that there is a 95 percent chance of rain tomorrow. Your neighbor thinks that the weather is controlled by a "Big Weather" satellite and that rainy days are a political plot to make us stay inside to work, and his favorite YouTuber says it will be sunny. It turns out to be sunny. Do we give up on meteorology and go with the anti-Big Weather YouTuber?
Part of the problem is that our thinking can be distorted by a kind of hindsight bias — that is, our tendency to misremember earlier positions (and why they were held) or to believe that we could have foreseen an event or a conclusion. This is also known as the knew-it-all-along phenomenon. But given the evidence available at the relevant time, you really didn't know. It's likely that no one did.
Another problem is that, yes, many public health officials and journalists did a less-than-ideal job of talking about the possible causes of the pandemic. The language was often definitive, when it should have reflected the fact that there were (and are) many unknowns. Indeed, both the scientific community and the popular press need to do a better job of representing uncertainty and science as a process. Covid-19 should be an important teachable moment for the scientific community and for journalists covering these kinds of science-based stories.
In addition, when a scientific position is altered, that is too often portrayed as some kind of a failure. A recent study found that these science-got-it-wrong representations can erode public trust. On the other hand, portraying science accurately — that is, as a self-correcting problem-solving process that often involves false starts and dead ends — can bolster confidence and increase understanding.
Indeed, representing science better may also help to lessen the chance that a reversal in a scientific position (e.g., about the usefulness of masks) can be weaponized by those who seek to polarize public discourse and discredit science-informed experts.
It is worth noting that there is a deep irony to conspiracy theorists' pointing to new scientific positions to support their views. Studies have shown that those who believe in conspiracy theories have reduced tendencies to "revise beliefs in the face of disconfirmatory evidence." In other words, they won't alter their positions based on evidence, but they want you to. The truth may be out there, but they aren't changing their minds.
So yes, we absolutely need to keep open minds and constantly question. We do need to make sure that the science is communicated effectively (dogmatic pronouncements one way or another are almost always a mistake). And we do need to make sure science is done well and in a trustworthy manner, including being transparent about conflicts of interest and political pressures that may twist its representation.
Let me be crystal clear: Early on, the lab leak theory was not handled well. Continued investigation (minus racist overtones, please) is warranted. But public health positions should still be informed by science, not fearmongering or ideologically driven speculation, even if the science-informed decision turns out to be wrong because the science evolves. And even if some outlandish ideologically driven speculation turns out to be true, that doesn't mean giving in to conspiracy theory rants is a rational way to make future decisions.
The truth is out there. And science can nudge us closer.