On Facebook, there are certain truths that are self-evident — such as the fact that the social network’s automated filters aren’t always going to make the right call.
In the days leading up to the Fourth of July, The Vindicator, a small publisher in Texas, posted passages of the Declaration of Independence on its Facebook page to make it “a little easier to digest.”
Earlier this week, Facebook’s algorithm apparently took issue with paragraphs 27-31, which include the phrase “merciless Indian savages,” and flagged the passage as hate speech.
The mix-up highlights the challenges that Facebook is grappling with as it uses a combination of artificial intelligence and human review to crack down on hate speech and other abusive content on its vast social network.
Casey Stinnett, managing editor of The Vindicator, wrote that “to be honest, there is a good deal in that passage that could be thought hateful” when taken out of context.
A Facebook spokesperson told NBC News that millions of reports are processed every week and “sometimes we get things wrong.”
Facebook apologized to The Vindicator and restored the post on July 3, according to the spokesperson.
The episode is just the latest in a long list of incidents in which Facebook has flagged content that would generally be considered historically important or otherwise benign. In 2016, the social network removed an iconic photo from the Vietnam War, in which a young naked girl is running from a napalm attack. Facebook defended its actions at first, but later admitted it had erred.
Facebook faces similar challenges in its more recent attempts to regulate political advertising on its platform. A search of Facebook’s political advertisement archive shows its system wrongly removed a number of ads that it mistook for political advertising under Facebook’s new political ads standards, including one posted by Walmart for Bush’s baked beans and another from a church with a reverend named Clinton.
A Facebook spokesperson told NBC News this week that the review process isn’t perfect and that the company is erring on the side of caution while it works to improve its system.
In April, Facebook updated its community guidelines, publishing for the first time the rules its moderators follow when deciding whether to remove content. Facebook also has controls in place for users who find themselves in a situation similar to The Vindicator’s.
Monika Bickert, Facebook’s head of global policy management, announced in April an appeals option for people who feel their page or posts were unfairly removed.
Under the policy, users can file an appeal if they believe a piece of their content has been unfairly removed or if they’ve flagged a piece of content that Facebook’s team of content reviewers decided not to remove.
The appeal is sent to a different human moderator, who will issue a decision within 24 hours.