After Facebook banned then-President Donald Trump in January 2021 following the Capitol attack, the company said it would reconsider his status on the platform in two years’ time. On Saturday, that timer will run out, and Facebook is grappling with a different landscape as it weighs whether to allow Trump back on its platform.
For one thing, social media looks different than it did two years ago.
Trump now has his own social media company, Truth Social, and his account has been restored on Twitter (where he has yet to tweet). Within Silicon Valley, the debate over when and why to ban accounts has shifted, most notably at Twitter, where Elon Musk took over and decried the company’s previous efforts at refereeing content. And though there’s no legal right for Trump or anyone else to be on social media, Republicans in Florida and Texas are trying to create laws that would prevent social media companies from removing certain posts.
All that leaves Meta CEO Mark Zuckerberg with a question: Is it safe for the company to allow Trump back on the platform in this different social media environment?
Meta, the parent company of Facebook and Instagram, isn’t tipping its hand before the self-imposed two-year deadline, although it may give itself a few extra weeks to think the question over.
“We will announce a decision in the coming weeks in line with the process we laid out,” Meta spokesperson Andy Stone said in an email.
NBC News asked a handful of experts in social media moderation what they thought about the upcoming decision. The answers offered a sense of the shifts in social media and moderation since both Twitter and Facebook banned Trump.
Daniel Karell, a Yale University sociologist who studies how social media shapes political violence, said it’s not so easy to tell what impact Trump’s words on Facebook would have now. In some ways, Facebook matters less, he said.
“People who would be motivated to threaten public safety because of Trump’s presence on Meta are surely exposed to similar ideas and rhetoric over the last year through other platforms, media sources and networks,” Karell said in an email. Last year, Facebook reported the first decline in users in its history, fueling investor fears of a possible death spiral.
Karell also noted a change in law enforcement’s approach to civil unrest; the FBI has arrested about 900 people in connection with the Jan. 6, 2021, attack.
But he added that “having Trump on widely used platforms normalizes his ideas and rhetoric, and very likely doesn’t help reduce any risks to public safety.”
Trump lost access to his accounts on Twitter, Facebook, Instagram, YouTube and other services in the days following Jan. 6, 2021, after his supporters attacked the Capitol. In one tweet posted during the attack, Trump said that “Mike Pence didn’t have the courage to do what should have been done.” Rioters chanted “Hang Mike Pence” and came within a minute of reaching the then-vice president.
The next day, Zuckerberg wrote on Facebook that Trump’s suspension on that service would go on indefinitely, and at least long enough to ensure the peaceful transition of power to President Joe Biden. The platform had been used “to incite violent insurrection against a democratically elected government,” Zuckerberg announced Jan. 7, 2021.
In 2021, the oversight board that hears appeals of Meta’s content moderation decisions said the company’s penalties should not be “indeterminate,” and Meta began a two-year clock, retroactive to Zuckerberg’s post.
Courtney Radsch, an academic who has most recently studied free expression and technology at UCLA’s Institute for Technology, Law & Policy, said it was clear to her that the threat from Trump has not receded. She cited “the plethora of elected representatives who feel they cannot get elected or reelected without supporting his baseless claims and dangerous rhetoric.”
She also noted Trump’s post on his own social media site, Truth Social, in December calling for the “termination” of the Constitution.
“Just as the people who died during the insurrection and the irreparable harm the former president did to American democracy and global norms cannot be restored, nor should his access to Facebook,” Radsch said in an email. At least seven people died in connection with the attack, according to the New York Times.
If Trump’s posts on Truth Social had appeared on Meta’s services, they would have violated Meta’s community guidelines hundreds of times, and roughly twice per day in the run-up to the 2022 midterm election, according to a report released last month by Accountable Tech, an advocacy organization.
Katie Harbath, a former employee of both Facebook and the Republican National Committee who’s now a consultant on tech and politics, said she thinks the company should allow Trump back on but have a clear process if he breaks the company’s terms of service or community guidelines.
She said tech companies should fight to protect free expression, especially of political candidates, and that the example they set will be watched by foreign governments looking for justification to shut down free speech.
“I’m very worried internationally about the trend of countries passing laws under the guise of regulating the tech companies that those governments are then exploiting to subdue free expression,” she wrote in a paper for a conference on “deplatforming” in October.
It’s not clear how much Trump might use his Facebook and Instagram accounts if given the chance. He hasn’t tweeted since Musk restored his Twitter account. Trump’s political organization still has an active Facebook page, Team Trump, where someone posts most days.
Despite Trump’s silence on social media platforms other than Truth Social, he’s still campaigning for the White House in 2024 and actively appealing to conspiracy theorists and election deniers.
Nick Clegg, Meta’s president of global affairs, said in a June 2021 blog post that Meta would “look to experts to assess whether the risk to public safety has receded” before unlocking Trump’s access.
“We will evaluate external factors, including instances of violence, restrictions on peaceful assembly and other markers of civil unrest,” he wrote. “If we determine that there is still a serious risk to public safety, we will extend the restriction for a set period of time and continue to re-evaluate until that risk has receded.”
The company has an internal group of employees working on the question, the Financial Times reported.
Musk, at Twitter, also turned to outsiders for help with his decision, though to users rather than experts. Musk allowed users to vote on the idea, and his poll received more than 15 million votes.
“The people have spoken,” Musk said then.
Meta has no plans to conduct a poll of its users regarding Trump, said Stone, the company spokesperson.