
Zuckerberg calls for changes to tech's Section 230 protections

While meant to ensure that companies take action against unlawful content, the changes could theoretically shore up Facebook's power.
Facebook CEO Mark Zuckerberg testifies before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill on April 10, 2018. Andrew Harnik / AP file

Facebook CEO Mark Zuckerberg will propose this week that Congress make revisions to federal internet regulations that would require platforms to have systems in place for identifying and removing unlawful content.

The proposal, which Zuckerberg will present during his testimony before the House Energy and Commerce Committee on Thursday, would raise the bar for social media companies that are currently granted immunity from liability for the content that appears on their platforms. That immunity, granted by Section 230 of the Communications Decency Act of 1996, has come under fire in recent years from Republicans and Democrats.

While meant to ensure that companies take action against unlawful content, the changes could, in theory, shore up the power of Facebook and other internet giants like Google by forcing smaller social media companies and startups to build content moderation systems that can be costly.

"Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it," Zuckerberg will say in his opening remarks, according to written testimony released on the House Committee website Wednesday.

Under Section 230, Facebook, Google and other internet giants bear no legal liability for content that appears on their platforms. Zuckerberg's proposal would keep those protections in place, but would make them conditional on platforms having systems to identify and remove unlawful content.

"Platforms should not be held liable if a particular piece of content evades its detection — that would be impractical for platforms with billions of posts per day — but they should be required to have adequate systems in place to address unlawful content," he will say, according to the written testimony.

In the written testimony, Zuckerberg adds that the definition of an "adequate system" should "be proportionate to platform size and set by a third-party," an acknowledgment that smaller companies will not be able to develop the same moderation infrastructure as Facebook and Google.

Zuckerberg will appear Thursday alongside Google CEO Sundar Pichai and Twitter CEO Jack Dorsey before two House subcommittees — on communications and technology, and on consumer protection and commerce — to testify on social media's role in promoting extremism and misinformation. The hearing will take place virtually.