
Facebook responds to censorship critics with transparency pledge

Facebook is hoping its latest move will put to rest concerns that it unfairly censors content.
Diamond and Silk, sisters who have more than 1.6 million Facebook followers, at a Trump campaign event in Fort Lauderdale, Florida, in August 2016. Johnny Louis / WireImage via Getty Images file

SAN FRANCISCO — Human moderators have just a few seconds to make a decision about whether a piece of reported content belongs on Facebook — and they don’t always get it right.

Now, Facebook wants to make sure its moderators and the public are on the same page.

On Tuesday, Facebook updated its community standards, publishing for the first time the internal rules its moderators use when deciding whether to remove content. The social network, which has 2.13 billion users and counting, also said it would launch an appeals option for people who feel their pages or posts were unfairly removed.

Under the new policy, users can file an appeal if they believe a piece of their content was unfairly removed, or if they flagged a piece of content that Facebook’s content reviewers decided not to remove. The appeal will be sent to a different human moderator, who will issue a decision within 24 hours.

“At our scale, we receive millions of reports every week in dozens of languages around the world,” Monika Bickert, head of global policy management at Facebook, told reporters in a briefing last Thursday. “Even if we are operating at 99 percent accuracy, we are still going to have a lot of mistakes every day. That is the reality of reviewing content at this scale.”

“We want to get it right, which is why we want to make sure we are giving people the option to ask us to reconsider,” she added.
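For scale (Facebook has not disclosed its exact report volume, so the figure here is purely illustrative): even at 99 percent accuracy, 1 percent of decisions are wrong. On a hypothetical 5 million reports a week, that works out to

0.01 × 5,000,000 = 50,000 mistaken calls a week, or roughly 7,000 a day.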

Facebook has long been dogged by criticism that it unfairly censors content, ranging from conservative voices to a breast cancer awareness video and even the iconic Vietnam War photo of a naked young girl running from a napalm attack.

With its latest move, Facebook wants to make its process for handling offensive content more transparent to ensure no one is being unfairly censored.

Facebook CEO Mark Zuckerberg told lawmakers during his recent two days of testimony that the question of bias was a "fair concern" and said he strives to ensure "we don't have any bias in the work that we do."

Two names came up repeatedly throughout Zuckerberg’s hearings: Diamond and Silk. The sister duo have a following of more than 1.6 million people on Facebook, where they sound off on everything from perceived media bias to political happenings and their love for President Donald Trump. They’re scheduled to testify on Thursday at 10 a.m. ET before the House Judiciary Committee regarding filtering practices on social media.

Republican lawmakers have seized on the issue of Facebook censoring conservative voices after the company throttled the sisters’ page this month for being “unsafe to the community.”

Zuckerberg told lawmakers that it was an “enforcement error” and that Facebook had been in touch with the sisters and reversed the decision. The sisters claimed they hadn’t been contacted; however, internal emails published by the conservative blogger Erick Erickson show that Facebook had indeed reached out to them.

While Facebook is releasing a more detailed set of community standards, Bickert said that does not mean existing policies, or the way they are enforced, will change. Instead, she said, the goal is to give Facebook users more clarity about what isn’t allowed in the community.

“The more people know about this change, I think the better for Facebook users, the more feedback we get,” Bickert said.

That process will include continuous updates. Every week or two, she said, Facebook’s internal reviewers receive updates, however small, about how to handle content. Those updates will now also be reflected in Facebook’s public-facing community standards, she said.

Facebook is also launching a series of forums, beginning in Europe and coming to the United States and other countries later this year, to engage with the community about what’s working and what isn’t on the social media network.

“We are hoping this will really spark dialogue around the way people see our standards and that we will get feedback and input that will allow us to evolve these standards in an efficient way,” Bickert said.