A new lawsuit accuses Facebook of playing a role in political violence in Africa and seeks to hold it accountable by demanding more than $2 billion in restitution funds and major changes to the service’s content moderation efforts on the continent.
It is the latest case to draw connections between the platform and ethnic violence in the developing world.
The class-action lawsuit, filed in Nairobi, Kenya, where Facebook opened a major content moderation hub for Eastern and Southern Africa in 2019, accuses the company of monetizing the viral potential of hate and violence in conflict-torn Ethiopia, in violation of more than 10 articles of Kenya’s Constitution. It also alleges the company does not devote enough resources to content moderation on the continent compared to the United States.
Among the plaintiffs in the lawsuit is Ethiopian professor Abrham Meareg, who is seeking political asylum in the United States. He alleges his father was killed by militants last year during the ongoing civil conflict in Ethiopia, as a result of incitement that spread on Facebook.
Meareg’s father, Meareg Amare Abrha, was a well-known chemistry professor and member of the Tigrayan ethnic group. He was murdered on Nov. 3, 2021, when a group of men followed him from the university on motorbikes and shot him twice in front of his home, according to an affidavit Meareg filed in the case. The family home was eventually occupied by militants, and Meareg’s mother fled to Addis Ababa, Ethiopia’s capital.
“My father didn’t get any chance to convince people that he was innocent,” Meareg said in an interview, from his home near Minneapolis, where he is now living. “He didn’t get the choice to clarify the hate speech and disinformation. They just shot him and killed him in a brutal way.”
The lawsuit comes on the heels of criticism about the use of Facebook amid conflict in places like Myanmar and India. In Myanmar, where state violence against the country’s Muslim Rohingya minority has raged for years, the site was sharply criticized for letting hateful rhetoric and incitement to violence thrive on its platform.
“We have strict rules that outline what is and isn’t allowed on Facebook and Instagram,” Mike DelMoro, a spokesman for Facebook's parent company, Meta, said in a statement Tuesday. “Feedback from local civil society organizations and international institutions guides our safety and integrity work in Ethiopia. We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya.”
An Amnesty International report from earlier this year found that Meta contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017. In India, the country’s largest market based on the number of users, concerns have grown among some researchers about the use of disinformation on Facebook to foment ethnic and religious tensions.
Meareg said that Facebook allowed multiple posts with threats and misinformation about his father to stay on the site amid the country’s ethnically driven civil war, even after he flagged them for removal. Facebook, Meareg said, is “lethal by design.”
Meareg said the trouble that led to his father’s killing began when Tigrayan staff at Bahir Dar University, where his father taught, were targeted online.
A Facebook page called “BDU Staff,” with 50,000 followers, posted a picture of his father on Oct. 9, 2021, saying that he was “hiding” at the university and had “carried out abuses,” according to Meareg’s affidavit. Commenters weighed in with exhortations to violence, according to the posts cited in the affidavit.
The next day, another post was made to the same group. This one also featured Meareg’s father’s photo as well as the neighborhood where he lived in Bahir Dar. And it included numerous false claims about his father, according to the affidavit, including that he had helped massacre people, that he was a corrupt property owner, that he had helped with military incursions into nearby areas and that he had stolen huge sums of money.
Those targeted posts came amid others made by prominent Facebook users in the country calling for violence against the Tigrayans, according to the affidavit.
“These posts were a death sentence for my father,” Meareg’s affidavit says.
Meareg said he reported both posts immediately after being alerted by a friend, but Facebook did not take any action until after his father’s killing. The first of those posts remained up as of Dec. 8. Facebook took the other post down, according to the documents.
“Facebook is a big gun, social media platform in Ethiopia,” Meareg said. “Facebook knows the platform is used for genocide, ethnic cleansing, extrajudicial killings. And intentionally, due to their deliberate dismissal of the consequences and harm, they just prefer to focus on their profit-making.”
DelMoro, the Facebook spokesman, declined to answer specific questions about the company’s content moderation staffing for Ethiopia, but he pointed to changes the company announced on Nov. 9, 2021, about a week after Meareg’s father’s murder, which it said allow Facebook to proactively address potentially violent material in Ethiopia.
The company said at that time that Ethiopia was a particularly challenging environment for content moderation due in part to the many different languages spoken in the country, but that it had been classified internally as one of the countries at highest risk for conflict and violence for two years.
Changes the company made included reducing the spread of viral content and reducing the spread of material that the company’s automated moderation technology had flagged as likely to be hate speech. The company also made it easier for Ethiopians and international and local human rights groups to flag violations. The company said it had taken action on more than 92,000 pieces of content shared on Facebook and Instagram in Ethiopia between May and October 2021 for hate speech violations, 98% of which were detected before being reported by users.
But the lawsuit alleges that Facebook’s lack of human resources in content moderation is allowing ethnic violence to worsen — with significant consequences for places like Ethiopia.
It demands Facebook demote incitements to violence, similar to the emergency steps the company took in the United States after the attack on the Capitol on Jan. 6, 2021, and that it bulk up its moderation staff to serve the complicated language markets in Africa. The lawsuit also demands the company create a restitution fund of about $2 billion for victims of hate and violence incited on the platform in the region covered by its Nairobi hub, and another $400 million for harm to people in Kenya from sponsored posts.
Cori Crider, a director at the United Kingdom-based nonprofit Foxglove Legal, which is litigating the case, said that the lawsuit seeks to rectify imbalances in content moderation, in which poorer countries and regions, many of which have complicated ethnic and language divisions, are given fewer resources than wealthier countries.
“However bad you and I think content moderation is in the U.S., it is an order of magnitude worse anywhere outside of the U.S. — and particularly bad in places facing crisis or conflict,” Crider said in an interview. “When people make posts calling for genocide or targeting people in certain areas, posts will go viral and it will not come down. What happened to Abrham’s father is horrific and also systemic.”