Myanmar rights groups say Facebook put thousands in danger, contradict Mark Zuckerberg

The social media network's response to hate speech amid a suspected genocide has been "inadequate," the groups wrote in an open letter.
A member of the Rohingya ethnic minority scrolls through Facebook on his cell phone at a temporary makeshift camp after crossing over from Myanmar into the Bangladesh side of the border on Sept. 8, 2017.Ahmed Salahuddin / NurPhoto via Getty Images

Facebook potentially endangered hundreds of thousands of people in conflict-ridden Myanmar with its "inadequate" attempts to quash online hate speech, civil rights groups say.

In an open letter on Thursday, six civil society and human rights organizations blamed Facebook for allegedly facilitating propaganda and misinformation that helped to fuel Myanmar's suspected genocide — and contradicted cofounder and CEO Mark Zuckerberg's recent claims that his platform had effectively shut down the threats targeting western Myanmar's persecuted Rohingya community.

On Monday, in a wide-ranging interview with Vox about the various controversies Facebook is currently embroiled in, Zuckerberg had praised his company's response to anti-Rohingya propaganda, which proliferated through Facebook Messenger chain letters last September.

"That’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that's going on. We stop those messages from going through," Zuckerberg said. "But this is certainly something that we're paying a lot of attention to."

But according to the open letter, the messages circulated for days before Facebook took notice of them — and the company only became aware of them after the rights organizations escalated the messages to staffers at the social network via email.

"We were surprised to hear you use this case to praise the effectiveness of your 'systems' in the context of Myanmar. From where we stand, this case exemplifies the very opposite of effective moderation," wrote Phandeeyar, a tech innovation lab, and five other groups in the letter.

The Facebook messages, screenshotted in the letter, warned of an impending attack and cautioned members of Myanmar's Muslim minority to stay alert wherever they went. With no reporting function on the Facebook Messenger platform, there was no way to stop the messages, the authors of the letter said.

"Though we are grateful to hear that the case was brought to your personal attention, Mark, it is hard for us to regard this escalation as successful. It took over four days from when the messages started circulating for the escalation to reach you, with thousands, if not hundreds of thousands, being reached in the meantime," the letter added. "This is not quick enough and highlights inherent flaws in your ability to respond to emergencies."

Facebook's role in spreading misinformation in Myanmar has become one of the most sensitive topics for the social media company, which is already dealing with a variety of controversies — most notably how data for tens of millions of users ended up being used by a data analysis firm with ties to President Donald Trump's campaign.

Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, said that Facebook "substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public," and had played a "determining role" in Myanmar.

Facebook has acknowledged its role in the conflict. Adam Mosseri, head of the company's News Feed, told Slate in relation to the Myanmar situation: "we lose some sleep over this" despite the company's efforts to stop the spread of misinformation.

In a statement to NBC News, Facebook said Zuckerberg had not properly characterized how the social media company became aware of the threats.

"We don't want Facebook to be used to spread hatred and incite violence, and we are very grateful to the civil society groups in Myanmar who have been helping us over the past several years to combat this type of content. We are sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages," Facebook said.

"We took their reports very seriously and immediately investigated ways to help prevent the spread of this content," the statement continued, adding that Facebook will be rolling out the ability to report content in Messenger and has added more Burmese language reviewers to handle any future reports.

Myanmar, formerly known as Burma, is rife with ethnic conflict. Rights watch organizations have consistently criticized Myanmar's government for its violations of humanitarian law; last September, the United Nations described a security operation there that targeted Rohingya Muslims as a "textbook example of ethnic cleansing."

Nearly 700,000 Rohingya have fled to Bangladesh from western Myanmar as a result of the crackdown, according to the Associated Press.

The letter to Zuckerberg acknowledged the prominent role Facebook plays in delivering news and other important communications to people in Myanmar but pointed out that without an in-country presence or Burmese-speaking Facebook staff, dangerous hate speech can spread easily.

The Myanmar accusations are only the latest trouble for the beleaguered social media behemoth, which is still reeling from the Cambridge Analytica data scandal.

Zuckerberg is slated to testify before Congress next week about concerns over the company's data privacy policies.