Britain, France Propose Legal Liability for Websites That Don’t Remove Extremism

Britain and France are working on a plan to make social media companies like Facebook and Twitter legally liable for "extremist" material on their sites, their leaders said Tuesday.

British Prime Minister Theresa May described the plan as a counterterrorism initiative at a news conference in Paris with French President Emmanuel Macron. She said it would "ensure that the internet cannot be used as a safe space for terrorists and criminals and that it cannot be used to host the radicalizing material that leads to so much harm."

[Photo: French President Emmanuel Macron and British Prime Minister Theresa May at the Elysee Palace in Paris on Tuesday. Philippe Wojazer / Reuters]

ITV, NBC News' British news partner, reported that the initiative would press tech companies to establish an industry-led forum to develop shared solutions to the problem, a policy agreed upon last month at the G7 Summit in Italy.


May and Macron said the initiative would seek ways to encourage companies like Facebook and Google, the owner of YouTube, to find and remove extremist material quickly. If that doesn't work, companies could be held legally liable for the material and subject to fines.

"We are already working with social media companies to halt the spread of extremist material and poisonous propaganda that is warping young minds, but we know they need to do more," May said Tuesday.

Macron acknowledged the delicacy of balancing punitive measures with preservation of users' privacy and free speech, suggesting that European countries should consult with U.S. law enforcement agencies "to improve access to digital proofs that are used in our police and judicial investigations, wherever this data is located."


After the terrorist attack that killed seven people in London on June 3, May accused U.S. tech companies of not having done enough to weed out extremists in cyberspace.

"There is, to be frank, too much tolerance of extremism in our country," she said at the time. "We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet and the big companies that provide internet-based services provide."

In reaction, Google told NBC News that company leaders "share the government's commitment to ensuring terrorists do not have a voice online."

Twitter said that in the second half of 2016, it suspended more than 376,000 accounts for violations related to the promotion of terrorism.

And Facebook responded: "We work aggressively to remove terrorist content from our platform as soon as we become aware of it."