Instagram has come under fire after introducing a new content filter that some say could end up censoring users from marginalized communities.
Last week Instagram rolled out its “Sensitive Content Control,” which lets users determine how much “sensitive” content — like nudity, guns and violence — they’d like filtered out of their Explore pages. The options are “Allow,” “Limit” and “Limit Even More.” All accounts default to the “Limit” setting.
“You can think of sensitive content as posts that don’t necessarily break our rules, but could potentially be upsetting to some people — such as posts that may be sexually suggestive or violent,” Instagram officials said in the announcement.
They said more about the filter’s function a few days later on Twitter: “Because we show you posts from people you don’t follow on Explore, we try not to show content that some may find sensitive. Things like posts about smoking, violent posts like people fighting, or about pharmaceutical drugs.”
Ashlee Marie Preston, a Los Angeles-based activist and cultural commentator, said she keeps up with Instagram’s updates and noticed that engagement from her 400,000 followers had declined in the week since the control was implemented. Preston, who is a Black trans woman, said she routinely shares posts about transphobia, racism and trans identity.
That type of content, she said, often falls victim to Instagram’s filters.
“No one really knew it was placed on their account. It was a pre-emptive move,” Preston said of the filter. “I saw Instagram was ushering us into a new era in which they were allowed to be the ones who determined what was sensitive and unsafe and what was not. The problem with that is that our identities, experience and very being is going to be deemed sensitive and unsafe, because our experiences are unsafe.”
Preston said she has spent months looking at social media patterns and quality filters and is launching the "#TakeBackIG" campaign to address what critics say is Instagram's censorship, demand transparency and restore full access to content settings — "so users can apply or remove filters of their own volition, no default settings," she said.
The new filter is intended to keep people safe and allow users to curate their Explore pages by deciding for themselves how much content to filter, Instagram told NBC News. However, users won’t have a say in what content is considered sensitive. Depending on the level chosen, violent videos and posts featuring abuse might be filtered out, as well as photos of women in bikinis or posts in which activists condemn police brutality.
“Unfortunately, there will always be mistakes made and that is the exception. By no means is it the rule,” a spokesperson for Facebook, which owns Instagram, said about concerns that Black user content might be filtered out under the content control.
The spokesperson said Instagram's Equity Team is devoted to making sure Black users aren't censored or unfairly hidden.
“We released Sensitive Content Control to give people more say over what they see on Explore,” the spokesperson said in a statement. “We know that not everyone wants to have the same experience, and this control will let people decide if they want to see more or less sensitive content. This will have no impact on what they see in places like Feed or Stories, where we will continue showing them posts from people they follow.”
Critics of the filter, like art promoter Phillip Miner, accused Instagram of burying the new control, according to The Washington Post. Miner shared a post instructing users how to turn off the filter, writing, "Instagram made it harder for you to see or share work that explores content that Instagram deems ‘inappropriate.’"
It's not the first time Instagram has been called out for bias in its algorithm. Instagram is one of many social media platforms accused of censoring Palestinian voices by deleting pro-Palestinian posts and accounts. It has also been accused of bias against women of color, and last year a group of civil rights organizations, including the NAACP, called on advertisers to drop Facebook, Instagram’s parent, for enabling racism and misinformation during the summer’s police violence protests.
Adam Mosseri, the head of Instagram, acknowledged in June 2020 that Black users were being harassed and “shadowbanned” on the app. In response, he announced that the company would look into the racial bias, specifically regarding harassment, algorithmic bias, account verification and content distribution.
Preston said on Instagram that some users, including her, weren’t able to change the filter from its default setting. She said posts that draw attention to abuse will most likely be suppressed under Instagram’s “sensitive” category.
“How do we mitigate gun violence or address police brutality, with the posts disappearing at mention of ‘guns?’” Preston asked Mosseri in a comment on his post.
Mosseri replied that Instagram is working to figure out why some users are unable to adjust the quality filter and fix the issue. “There is no perfect way to reduce the risks associated with guns or nudity without also risking limiting legitimate uses for sharing similar content,” he said.