
Facebook's new Oversight Board strives to be seen as independent

“The Oversight Board has the potential to revolutionize how we think about the relationship between private corporations and our public rights."
Facebook chief executive Mark Zuckerberg delivers the keynote address at F8, the company's developer conference, in San Jose, Calif., on May 1, 2018. Marcio Jose Sanchez / AP file

Social media users who believe their posts have been unfairly removed from Facebook or Instagram can now file an appeal to Facebook’s Independent Oversight Board, the company announced Thursday.

Positioned as a "Supreme Court" for Facebook's content moderation decisions, the external panel of 20 journalists, academics, lawyers and human rights experts will weigh in on, and potentially override, those decisions. The board has up to 90 days to review cases submitted by users through its website after they have exhausted their content appeal options directly with Facebook. If the board sides with the user, Facebook will restore the content and potentially re-evaluate its policies.

“The Oversight Board wasn’t created to be a quick fix or an all-encompassing solution,” said Helle Thorning-Schmidt, co-chair of the board and former prime minister of Denmark. But it aims to “offer a critical independent check on Facebook’s approach to moderating some of the most significant content issues.”

By announcing the board on Thursday, Facebook has launched an unprecedented model of governance that no other social media outlet has created. Facebook has faced intense scrutiny about the way it enforces its own content rules in the run-up to the 2020 U.S. presidential election, including how it handles disinformation, foreign interference and hate speech. While some criticize the platform for not removing enough content, three-quarters of Americans believe social media sites like Facebook intentionally censor political viewpoints they find objectionable, according to a September study from Pew.

These accusations of censorship come as Facebook tries to position itself as a defender of free speech and stave off efforts by lawmakers to regulate content on the platform. While it will not decide any cases before the election, the Oversight Board will prioritize cases that pose a significant risk to human rights or free expression — for example, activists who believe they have been censored, or posts that could be construed as hate speech — members of the board explained in a press call on Thursday.

The board’s launch also follows months of criticism for being too slow to launch, too limited in scope and too cozy with Facebook. Now, the board and its members are going to great lengths to assert their independence and prove that they aren’t shilling for Facebook.

“Everyone is very conscious that our reputations are on the line,” said board member Alan Rusbridger, former editor-in-chief of The Guardian, in a Zoom interview in September. “I think it’s a bit unfair to wade in with huge amounts of criticism when we haven’t said anything yet.”

Still, some followers of Facebook expressed optimism that such a board was even created.

“The Oversight Board has the potential to revolutionize how we think about the relationship between private corporations and our public rights,” said Kate Klonick, an assistant professor at the St. John's University School of Law, who has published research on the Oversight Board. “It’s a step toward recognition that these transnational companies control our public rights in a way that governments don’t and that we need to create a participatory and democratic mechanism to inform those companies that those rights are protected.”

Early Foundations

Facebook CEO Mark Zuckerberg first floated the idea of an oversight board in April 2018. The social networking giant then provided $130 million to establish an independent board trust and helped choose board members. However, the board has its own staff, independent from Facebook, who oversee the allocation of funds and the administration of the appeal process. The board members were announced in May.

From the start, the members knew they had a large task ahead of them.

“Of all the criticisms that are lodged against Facebook, I think one of the biggest is that we can’t trust them,” Jamal Greene, a Columbia Law School professor and co-chair of the Oversight Board, said in an interview in September. “One of the aims of the Oversight Board is to try to establish an institution that can be trusted.”

Since May, board members have been meeting via Zoom to figure out how they should assign, deliberate and judge cases.

They were trained in Facebook's community standards — the rulebook for content that's allowed on the platform — took part in five-person simulation panels to rehearse how they would deliberate each case, and wrote up draft decisions using a case management system built by Facebook.

During some of the early sessions, Facebook employees joined the board’s Zoom calls to observe the discussions. This made some board members uncomfortable, so they excluded the Facebook employees from subsequent sessions.

“In one of the early meetings somebody said, ‘Why are there eight people on this call when we’re only six of us here?’ and the Facebook people then vanished,” Rusbridger said.

Working Details

In September, Facebook showed NBC News an early version of the case management tool it built for the board, comprising a dashboard of appeals submitted by users from which board members can assign and select cases for review. Facebook can also submit cases to the board, including controversial content that was left up.

Each case, submitted by the user through the Oversight Board’s website, includes the item that was removed, an argument written by the user for why it should stay, along with some internal Facebook data such as how many people viewed the content and which policy it violated.

Once the board has selected a case, a panel of five board members reviews the evidence and writes up its deliberations and decision, incorporating material from external subject matter experts where relevant.

During test panels, there were times when board members noted that their decision could affect Facebook’s commercial model. For example, being more permissive about images containing some types of nudity on the platform could deter users in parts of the world with stricter cultural norms.

“The reaction has always been ‘Well, that’s not our problem, that’s Facebook’s problem,’” said Rusbridger. “So I don’t think anyone is coming into this thinking we’re here to help Facebook continue with life as normal.”