Facebook, looking to outsource tough decisions, reveals new details about its 'oversight board'

Facebook has been soliciting feedback over the past six months from more than 650 people at workshops in 88 countries on its draft plan for the board.
[Photo: Facebook CEO Mark Zuckerberg delivers the keynote address at Facebook's F8 Developer Conference at McEnery Convention Center on April 18, 2017, in San Jose, California. Justin Sullivan / Getty Images file]

By Reuters

Facebook on Thursday released the findings from its consultations with outside experts into its content review process, providing a glimpse into how its plans for a proposed “external oversight board” might take shape.

Facebook has been soliciting feedback over the past six months from more than 650 people at workshops in 88 countries on its draft plan for the board, which it says will function as an independent court of appeals on content decisions.

Chief Executive Mark Zuckerberg has said decisions about acceptable speech on Facebook's suite of social networks, used by some 2.4 billion people worldwide, should not rest in the company's hands alone.

The company will finalize the board’s charter in August, it said.

According to the report, attendees at the workshops broadly agreed that Facebook employees should not sit on the board. The company also should not be able to remove members without cause, and should clarify how it would define “cause,” they said.

Other popular proposals were that the board should be able to choose its own cases; that board decisions should establish precedent for future cases; and that the board should have the power to influence Facebook’s content policies.

Attendees expressed concerns over the board’s independence, both from state actors and the company itself.

Facebook has long faced criticism for doing too little to block hate speech, incitements to violence, bullying and other types of content that violate its “community standards.”

It has stepped up enforcement of those standards over the past year, employing more than 30,000 people, many of them low-paid contractors, to monitor content and improve “safety and security” on its platforms.

But the company continues to struggle with high-profile controversies over content posted on its site, such as the live streaming of a shooting that killed 51 people at two mosques in Christchurch, New Zealand, in March.