Misleading, unverified and clearly false information about the coronavirus has spread across social media platforms, with some accounts only a few days old reaching millions of views with a mix of fearmongering and racial stereotyping.
Misinformation has spread on Facebook, Instagram, Twitter, YouTube and the short-form video app TikTok, some of it published by fringe groups that commonly traffic in conspiracy theories and by far-right news organizations. But some misinformation that has achieved a sizable reach has come from accounts set up in recent days or from accounts focused on internet marketing, highlighting how global news stories can quickly become fodder for people looking to build followings on social media.
The first- and second-most-popular coronavirus posts on Instagram since the start of the outbreak, according to Crowdtangle, a social media metric platform, came from the account of Karmagawa, a U.S. nonprofit owned by an internet marketer that donates proceeds from the sale of branded merchandise to various charitable causes.
Both posts feature a carousel of videos, some of which show people eating animals like bats and mice and put the blame for the outbreak on the eating habits of Asian people — claims that have been debunked. Experts have warned about the spread of xenophobic and racially insensitive stereotypes tied to the coronavirus.
Users have flooded the comments on both posts with questions about the accuracy of the videos or their relevance to the charity. Meanwhile, the videos have racked up a collective 42 million views on Instagram, while the most popular coronavirus post from a verified news source, the BBC, has garnered just over 1 million views.
Jennifer Mancini, a public relations representative for the nonprofit, said that while the videos are perhaps technically misinformation, they generally support the current research and are in the service of a greater philanthropic mission: to save animals and build schools.
“A lot of his stuff is for shock value, but look at the results,” Mancini said. “He’s had so many shares. He’s gained so many followers, and donations have started coming in, so it’s clearly working. People are watching, right?”
Mancini said money donated to the nonprofit would go to buy supplies and medicine for people and families affected by the virus.
Similar or identical videos have been used by content makers across other platforms, accruing millions of views, according to Crowdtangle.
Health officials around the world are still working to contain the virus, which has now killed 170 people. There are currently more than 7,700 confirmed cases.
With fears of the coronavirus spreading, and talk on social media ramping up, users hoping to ride the virality of the discussion have hashtagged their posts — sometimes content that has no relation to the virus — with #coronavirus in order to get their content to the top of searches.
Professionals in the social media marketing sector call this practice unethical.
“Is it successful? Sure, in a sense you’re going to show up, but your content isn’t relevant,” said Tyler Farnsworth, founder and chief growth officer of August United, an influencer marketing agency. “You may be discovered but you wouldn’t be discovered in a way that, in our opinion, is ethical.”
Instagram said in a statement that the company has limited the distribution of content that has been rated false by its fact-checking partners, but declined to comment on specific accounts.
The major tech platforms have instituted a variety of measures in an attempt to limit the spread of misinformation. Twitter introduced a prompt that directed users to the Centers for Disease Control and Prevention when searching for information about the coronavirus. Google is also reportedly steering users to “authoritative information” on YouTube.
But accounts on various platforms that have promoted coronavirus misinformation have been rewarded with millions of views and an influx of new followers.
On TikTok, where the “coronavirus” hashtag has spread rapidly, accounts that began posting in the last two weeks have accrued millions of views by creating content about the coronavirus that is either misleading or fake.
One user, dressed in a white doctor’s coat and appearing in a lab-like environment, claimed to be working on the virus, showing a vial of “normal blood” next to a vial of “patient zero” blood. The “patient zero” blood appeared purple in the vial and of a different viscosity from the “normal blood.”
That video was viewed at least 2.4 million times before it was removed, but video duets — side-by-side reactions to the video — still remain on the platform.
In an email to NBC News, a TikTok spokesperson said that the video was taken down for violating community guidelines.
“Our Community Guidelines do not permit misinformation that could cause harm to our community or the larger public. While we encourage our users to have respectful conversations about the subjects that matter to them, we remove deliberate attempts to deceive the public,” the spokesperson said.
Joan Donovan, director of the Technology and Social Change Research Project at Harvard University’s Shorenstein Center, said tech platforms provide the ideal way for people to make money off public crises.
“Just as platforms have provided the capacity to mobilize massive crowds, it also scales scams and fake charities in ways that the public falls prey to,” she said. “We’ve seen similar attempts at keyword squatting by influencers using ecological crises and other significant events to raise money for themselves.”
The coronavirus has also given conspiracy movements that previously struggled to find traction on TikTok an opportunity to gain a foothold.
The most prominent account on TikTok associated with QAnon — the conspiracy theory that baselessly claims a global cabal of cannibals and pedophiles is running the world and President Donald Trump is working in secret to stop it — received more video views last month than ever, according to the social media analytics company Clout Meter, in part because the content has shifted away from vague promises of a government overthrow and toward fearmongering about the coronavirus.
Travis View, a conspiracy theory researcher who hosts a podcast, “QAnon Anonymous,” that debunks and analyzes the QAnon movement, said major news events present opportunities for conspiracy theorists to broaden their viewership.
“Online conspiracy theorists always build their audience by putting their alternative spin on things going on in the actual news,” View said.
View said conspiracy theories about Kobe Bryant’s death from the same account also received inordinately high engagement.
The strategy of latching on to popular news items to go viral at any cost is a common practice among social media users chasing larger audiences, Farnsworth said.
“There’s this attitude that we’ve built up that you can chase virality. It’s virality or nothing,” Farnsworth said. “And, for some people, it’s throw ethics to the wind.”
Nat Gyenes, who leads the Digital Health Lab at the technology nonprofit Meedan and researches technology and health at Harvard's Berkman Klein Center for Internet & Society, said public health officials and organizations will have to compete harder if they’re to outpace misinformation.
"If authorities like the World Health Organization or the Centers for Disease Control and Prevention aren't the Instagram, Twitter or TikTok accounts that people source for information about the coronavirus outbreak, the entities they encounter may do more harm than good," Gyenes said.
Even when reliable sources are on platforms where people increasingly go for health information, often the language is so specific, so nuanced, that good information is eclipsed by more sensational claims and fearmongering elsewhere.
"It is a challenge that health information authorities will have to address, adapting their communication methods to combat the memetic transfer of misinformation,” Gyenes said.