
Facebook, Twitter and other social media companies need to be treated like Big Tobacco

The surgeon general's new advisory shows their product is in need of serious consumer protection regulations.

Thursday marks a turning point in internet history. For the first time, the U.S. surgeon general has declared the barrage of misinformation spreading on social media a public health hazard. In an advisory, Surgeon General Dr. Vivek Murthy calls on technology companies to “take responsibility for addressing the harms” their social media products impose on consumers by prioritizing the early detection of misinformation, providing researchers with meaningful access to data, and protecting public health professionals from harassment.

Just as his predecessors took on the tobacco companies decades ago, Murthy is taking on the technology industry by defining how misinformation hurts Americans. In our view, this advisory shows that social media is a product in need of serious consumer protection regulations.

Murthy’s proposed actions include advising technology companies to redesign their algorithms so that search and recommendation systems don’t surface reckless misinformation, and to make it easier for people to identify and report misinformation. In addition to corporations taking these steps, he wants to see research conducted to use as a basis for regulating health misinformation.

As researchers of the internet and the dangerous effects of disinformation campaigns, we urge policymakers to embrace the surgeon general’s advisory and treat social media as a consumer product. We ask them to fund research on the true costs of misinformation and to create public interest obligations for social media, similar to the requirement that radio stations routinely make local announcements, so that timelines and news feeds deliver timely, accurate and local information.

The historical precedent for the surgeon general as an advocate for product regulation dates to the 1970s, when grassroots activists and independent scientists linked smoking to poor indoor air quality and physical harm. In 1986, the then-surgeon general and the National Academies of Sciences, Engineering and Medicine issued two critical reports that profoundly shifted how policymakers and the public viewed tobacco consumption.

Instead of treating smoking as a consumer choice, the government reports documented the harms caused by “secondhand smoke” and provided scientific evidence that disease in nonsmokers was caused by proximity to tobacco products. The surgeon general’s report spurred changes to the design of tobacco products and laid the groundwork for federal and state taxes on nicotine products to pay for smoking cessation programs.

For social media companies, misinformation is like secondhand smoke, spreading falsehoods to millions before the truth can be known. It causes harm to the public’s health by contributing to vaccine hesitancy and sometimes prompting life-and-death decisions based on lies.

Murthy’s action plan calls on researchers, journalists, educators and policymakers to take an all-hands-on-deck approach to this crisis. Beyond the individual steps tech companies take, these professionals must detect, document and debunk misinformation at scale.

Misinformation goes beyond an individual publishing or sharing something inaccurate on the internet. It also undermines the quality and safety of our communication infrastructure, which can be overrun by corrosive falsehoods and networked conspiracies at a moment’s notice.

Social media is built for openness and scale, not safety or accuracy. Content circulates, drives public conversation and eventually becomes convincing through repetition, redundancy, reinforcement and responsiveness, not through truth. When users see a claim over and over, across multiple platforms, they start to feel it is true, whether or not it is, especially as groups interact with it.

In fact, a study from the Massachusetts Institute of Technology showed that novel and outrageous claims reach more people, and reach them faster, than the truth does on Twitter. This is mostly due to the design of algorithms that act like amplifiers, echoing similar messages within platforms and then across them. Similarly, because social media allows users to ask questions and get responses, whether from other users or from search engines, algorithms reinforce whatever interests users engage with, regardless of their accuracy.

In 1996, John Perry Barlow, a co-founder of the Electronic Frontier Foundation, published a manifesto, “A Declaration of the Independence of Cyberspace.” Like many web pioneers of the time, he saw the internet as a space of the mind that existed outside the jurisdiction of the courts. This conception of the internet and social media as a place for unbridled free speech and a life of the mind was the foundational principle of today’s social media companies. But since then, movements for digital human rights have worked diligently to show how harms such as stalking, harassment, hate and incitement occur online.

By treating social media as a product, rather than as a place, as Barlow conceived it, the surgeon general’s advisory shifts responsibility for these harms back onto the companies. While we know that the tobacco business fuels smoking-related illnesses that cost $300 billion per year, the true costs of the health misinformation that tech companies help purvey are largely unknown. As Murthy says, “There is an urgent need to quantify the harms of health misinformation.”

We know that speculation, gossip and rumor are normal aspects of conversation, and it would be impossible to root them all out. But like a garden, social media must be tended so that it can flourish in a healthy way. Because the sheer scale of posts outpaces human ability to moderate content and manipulated media can easily fool artificial intelligence, journalists, researchers, advocates and corporations share the responsibility of safeguarding online spaces. We also need more investment specifically in corporate content moderation, an emerging industry with thousands of workers globally. Ultimately, whoever reaps the profits of social media’s products bears some responsibility.

No doubt, tackling a problem this big will require federal oversight for the long term. But, as the surgeon general’s advisory emphasizes, we don’t need to wait for regulation before we make changes.