
Facebook's new rapid response team has a crucial task: Avoid fueling another genocide

The Strategic Response team’s job is to find and help put in place concrete changes to Facebook’s products that could prevent violence in conflict-torn countries.
Sheryl Sandberg, chief operating officer of Facebook Inc., listens during a Senate Intelligence Committee hearing on Sept. 5, 2018. Andrew Harrer / Bloomberg via Getty Images

MENLO PARK, Calif. — Steps away from the glass-enclosed office suite of Sheryl Sandberg, Facebook’s No. 2 executive, a team of employees has been taking shape with a mission that’s become critical to the tech giant’s future: avoid contributing to another genocide.

The driving force behind the team is the company’s tarnished legacy in Myanmar, the Southeast Asian nation where, according to United Nations researchers, Facebook became the go-to tool for spreading propaganda that helped drive a genocide against the Rohingya, a religious minority. The violence is estimated to have killed more than 10,000 people since the beginning of 2017.

Called Strategic Response, the team is a mix of the kind of people who have typically been found in either governments or multinational corporations with far-reaching interests. The team’s formation, which started in the spring of last year and recently ramped up hiring, represents the latest evolution in Silicon Valley’s culture: less “move fast and break things,” and more thinking through the harm these companies can do half a world away.

It’s also an implicit acknowledgement that Facebook, which until just a few years ago was seen as an innocuous app for sharing photos and catching up with distant friends, has grown beyond its roots and must now embrace its emergence as a global company that plays a part in the daily lives of more than 2 billion people.

Rosa Birch, who leads the team, told NBC News in an interview that in assembling it she prioritized people who have experience in foreign affairs or conflict situations but who also understand what it’s like to work in a big organization.

“There’s a lot of similarities there between government and military and Facebook,” she said.

Tech companies and social media have played a role in geopolitics ever since the Arab Spring, but the Myanmar genocide and Facebook’s role in it raised the stakes of what happens on social media. At their worst, services like Facebook can be used not only to distort the news or elections but also to cause violence on a wide scale. Social media-fueled conflict has also flared in India, Sri Lanka and elsewhere.

Software engineers have been the core of Silicon Valley companies like Facebook, but lately the office parks housing America’s tech mega-corporations are seeing more people in key roles who used to work inside governments, the military or multinational corporations at risk of sparking violence in the world’s hot spots.

The anti-violence team’s members have unusual backgrounds for a tech company. They include or work closely with recent hires who are former diplomats, human rights researchers, a former military intelligence officer and one person who advised oil giant BP on geopolitical risk.

Victoire Rio, of the Myanmar Tech Accountability Network in Yangon, has spoken with members of the new Facebook team, and said that it’s progress to have people at Facebook “connecting the dots” between the company’s policies and products.

“But there are far too few of them,” Rio said, noting there’s about one team member per continent. “They really need more staff to go beyond firefighting and responding to PR crises, and be able to focus on more pre-emptive civil society engagement and developing systemic solutions.”

Strategic, but also rapid

The Facebook team’s name, Strategic Response, suggests long-term thinking, and the team’s job is to find and help put in place concrete changes to Facebook’s products — the news feed, for example — that could prevent violence in conflict-torn countries.

The team isn’t designed to write policy rulebooks or produce other paperwork, according to interviews with Facebook managers. Instead, its mission is to coax software engineers into adding or deleting features from Facebook apps, or to make demands of other teams — all with the explicit backing of Sandberg’s authority as the company’s chief operating officer.

Birch, who for years handled Facebook policy matters out of London but relocated to California last year, said the team worked on a new tool that lets approved non-governmental organizations flag problematic material they see on Facebook so that it reaches the company more quickly than a report from a regular user would.

“It sounds relatively simple, and something that we should have done a couple years ago,” she said.

The team also advises the rest of the company on languages — specifically, which language speakers it needs to hire among its thousands of content reviewers, prioritizing languages used in conflict countries or by minorities. A failure to hire enough Burmese speakers contributed to Facebook’s slow response in Myanmar.

But the team is also supposed to be “rapid response,” informing other parts of the company when the next big social media-linked crisis finds its way into the news cycle. That could be Myanmar one day and Sudan the next.

When tensions begin rising in a country, no matter where, it might fall to Birch’s team to give guidance to Facebook’s thousands of content reviewers on what kinds of posts to watch out for and take down — for example, after the Easter Sunday bombings in Sri Lanka targeting churches and hotels in April.

“We can turn those up and turn those down quickly, within the space of hours,” Birch said. “When there’s something happening on the ground and we have concern about tensions on the ground that could be bubbling up, then we can more aggressively downrank content that we may not otherwise.”

People on the ground

There remains enormous skepticism of Facebook’s approach among international groups, in particular the idea that employees can address the company’s global crises mainly from a suburban office park in California.

Last year, civil society organizations in Myanmar wrote an open letter to Facebook CEO Mark Zuckerberg arguing that Facebook was too reliant on outside advisers like them, because it was difficult for outside advisers to reach company employees senior enough to make decisions.

“What’s ultimately necessary is to have Facebook employees in every country where they operate, and have a country director in each one,” said Paul Barrett, deputy director of the Center for Business and Human Rights at New York University.

Facebook has partnerships with non-governmental organizations in many countries, as well as paid contractors. It also sends Facebook employees abroad for field studies, research and other projects, but it maintains offices in relatively few countries.

Barrett said it was promising that Sandberg, Facebook's No. 2 executive, was paying attention and had set up a new team, but he said it was unrealistic for the team alone to bear the world's problems, even with lots of outside advice.

“A large, international bank is not going to rely on local NGOs to help it run its global operations. It’s going to have people all around the world who are part of the organization,” Barrett said.

Samidh Chakrabarti, a director of product management at Facebook, said the company has benefited from research performed by non-governmental organizations and by Facebook’s own in-house researchers. A recent study in Cameroon, he said, affected the company’s thinking about images of graphic violence: people reported the images were useful because they alerted them to areas too dangerous to visit.

“We do an immense amount of work with people who are on the ground, who understand local context,” Chakrabarti said. Facebook says in the past year it has sent staff members to visit Cameroon, Lebanon, Myanmar, Nigeria and Sri Lanka.

Facebook in February announced the opening of an office in Nairobi, Kenya, with 100 people doing the job of content review, the process of examining posts to see if they violate Facebook’s rules. It is the first such office dedicated to sub-Saharan Africa, though Reuters reported in April that Facebook was still having trouble keeping up with posts in certain African languages such as Somali.

A growing team

The new team began after Facebook employees realized last year that there were 30 internal teams that had a hand in the company’s Myanmar response, though no one was in charge.

“There wasn’t perhaps the best coordination,” Birch said.

Birch has hired five people to focus on global hot spots, and they include a former U.S. State Department diplomat who specialized in the Middle East, an Iran-born computer scientist and a former researcher for a non-governmental organization in Myanmar.

Among human rights activists, the new Facebook team had something of a quiet debut last week when some of the team members traveled to Tunisia for a prominent international conference, RightsCon. Birch said Facebook sought research ideas and held “bilateral meetings” with various groups working on conflict areas.

Other Facebook employees work on adjacent subjects — some 17 Facebook employees list their work as including “strategic response” in some way on LinkedIn, including people in London, Singapore and Washington — and the company is still advertising job openings.

The team reports to Sandberg and meets with her at least weekly in a gathering that includes other senior Facebook executives, including people from the company’s lucrative ad business.

Birch said that cost is never a consideration. “Nobody has ever asked the question, ‘Should we do this because it’s going to cost us money but it would save people?’ It’s not come up, and I don’t feel worried that it would,” she said.

Sandberg asks questions at the meetings, Birch said. “‘How do we ensure this doesn’t happen again?’ And, ‘what’s missing there? Is it because there’s a staffing gap? Is there a process gap? Is there a product gap?’ Those are the kinds of questions that she’s asked,” she said.

When hate speech is from the government

Thorny questions await the team and Facebook. One of them is how to respond when the people using Facebook to stoke violence in a given country are elected politicians, military chiefs or other authorities — not just everyday users.

Facebook regularly takes down material from civilians or militant organizations, but it makes an exception for posts from governments, so-called “state actor” speech.

That had a tragic consequence in Myanmar, as Facebook failed to take down government-sponsored posts there that experts say contributed to violence. Only later did Facebook take down some accounts tied to the Myanmar military, making an exception to the company’s usual policy.

Facebook’s reluctance to fact-check officials from authoritarian governments who use Facebook to push propaganda remains a yawning issue that risks another Myanmar-type genocide playing out on the platform, said a former Facebook employee who worked on related issues at the company and spoke on condition of anonymity.

In March the company removed 200 accounts from Facebook and Instagram linked to a consultant for Philippine President Rodrigo Duterte, though Duterte — who has imposed a violent crackdown on drug users, alarming U.N. human rights advocates — still has a robust Facebook account with more than 4 million followers.

Brian Fishman, a Facebook policy director who works with the Strategic Response team, said the company was re-evaluating its policy with the goal of developing a clear rule to apply worldwide, though he said the company had no change to announce yet.

“You definitely want to set rules that you can apply as consistently as possible,” he said.

But he said the company still saw reasons not to censor governments. If Facebook shut down accounts linked to an authoritarian government, it might interfere with unrelated government services. Or, he said, some countries might retaliate against Facebook by shutting down or restricting internet services, hurting millions of users.

“We have to be very careful and very judicious,” he said, noting that Facebook’s power comes from its size and not from the United Nations or other legal authority.

Some advocates believe Facebook should be more attuned to international human rights law, following those precedents when it comes to defining hate speech, for example, rather than writing its own definitions.

“The issue is getting Facebook and other companies to engage with human rights law in a way that puts human rights law at the center of speech restrictions,” said Amos Toh, a senior researcher at Human Rights Watch.

One result of the new team’s work may be that Facebook — which started out providing nearly the same service to everyone — looks increasingly different in different countries, especially those with civil conflict.

“The rotation that we need to make as a company on the product side is, how not to think of our platforms as one thing that’s the same across the world, but how should they be different in different regions to try to mitigate some of these risks,” Chakrabarti said.