By Ben Kesslen

In the wake of a British teenager's suicide in 2017, Instagram has announced changes that will roll out in the United States and globally, including "sensitivity screens" that hide content showing self-harm.

Molly Russell was 14 when she took her own life, and her parents partly blame Instagram. They say their daughter viewed graphic images of self-harm and suicide on the popular photo-sharing app.

In response to Russell’s death, British Health Secretary Matt Hancock wrote in a letter to Facebook, which owns Instagram, and to other technology companies that he would use the power of his office to prosecute companies that fail to remove graphic content and create better policies to shield young people from images of self-harm.

Hancock said social media companies need to “purge this content once and for all.”

“It is appalling how easy it still is to access this content online, and I am in no doubt about the harm this material can cause, especially for young people,” he wrote.

“What we know about social media is that it can feed into feelings of inadequacy and inferiority in adolescents,” Dr. John Cullen, a family physician in Valdez, Alaska, told NBC News. “The stress from that can lead to things like cutting behavior, suicidal ideation, or suicide.”

Instagram told NBC News the company is globally rolling out "sensitivity screens," which blur images of self-harm at first glance. Users will now have to opt in to see the images, rather than having the graphic content simply appear in their feeds.

The app has already made posts related to cutting and other forms of self-harm unsearchable, and has put in measures to make sure self-harm content isn’t suggested to users.

Adam Mosseri, head of Instagram, wrote in the British newspaper The Telegraph that the company is “not yet where we need to be on the issues of suicide or self-harm.”

“We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images,” he wrote.

While Mosseri says the company does "not allow posts that promote or encourage self-harm," Instagram has decided not to ban such content outright, saying people use the platform in a productive way to speak about their struggles with the issue.

Suicide is currently the third leading cause of death for youth between the ages of 10 and 24, according to the U.S. Centers for Disease Control and Prevention, and the suicide rate among teenage girls ages 15 to 19 hit a 40-year high in 2015.

“When there are children and adolescents who are having thoughts of self-harm and suicide, we do know there can be a contagion effect,” Dr. Joanna Stern, a program director at the Child Mind Institute, told NBC News.

Stern said children and adolescents who aren't thinking about self-harm are unlikely to be compelled to hurt themselves if exposed to self-harm content on social media, but such content is a risk for young people who are already thinking about it.

“The real concern is when there are kids who have the thoughts of self-harm, who have heard something about it, but don't have any concrete plans and ideas about how they would carry this out,” Stern said. “What this content does is it helps them make plans. It makes [self-harm] very accessible.”

Cullen told NBC News that while there isn’t one simple solution, a good first step is to make sure children and teenagers get to speak privately with a physician.

“Teenagers are incredibly private about their lives, and they may be engaged in social media that their parents don’t know about,” he said.

“It is important to have a one-on-one visit with a physician to talk about these issues. Parents are oftentimes the last ones to know when something is going wrong.”