
Meta builds tool to stop the spread of ‘revenge porn’

Facebook and Instagram’s parent company partnered with a U.K. nonprofit to build a tool that lets people flag their intimate images through a central website so the images can be removed from multiple platforms.
Photo of a woman lying in bed, looking out a window. Getty Images; NBC News

Facebook’s parent company, Meta, has worked with the U.K.-based nonprofit Revenge Porn Helpline to build a tool that lets people prevent their intimate images from being uploaded to Facebook, Instagram and other participating platforms without their consent.

The tool, which builds on a pilot program Facebook started in Australia in 2017, launched Thursday. It allows people who are worried that their intimate photos or videos have been or could be shared online, for example by disgruntled ex-partners, to submit the images through a central, global website called StopNCII.org, which stands for “Stop Non-Consensual Intimate Images.”

“It’s a massive step forward,” said Sophie Mortimer, the helpline’s manager. “The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it.”

Karuna Nain, Meta’s director of global safety policy, said the company had shifted its approach to use an independent website to make it easier for other companies to use the system and to reduce the burden on the victims of image-based abuse to report content to “each and every platform.”

During the submission process, StopNCII.org obtains consent and asks people to confirm that they are in an image. People can select material on their devices, including manipulated images, that depicts them nude or nearly nude. The photos or the videos will then be converted into unique digital fingerprints known as “hashes,” which will be passed on to participating companies, starting with Facebook and Instagram.
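The article doesn’t specify the hashing scheme, and it later notes that the system catches only exact matches. Purely as a hedged illustration of how an image can be reduced to a compact fingerprint, here is a minimal perceptual “average hash” in Python; the function name and the 8×8 grid size are choices made for this sketch, not StopNCII.org’s actual algorithm.

```python
# Illustrative only: a minimal "average hash" perceptual fingerprint.
# StopNCII.org's real hashing algorithm is not described in the article;
# this sketch just shows how an image becomes a short, shareable hash.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> str:
    """Shrink the image, grayscale it, and encode each pixel as
    above/below the mean brightness: a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):0{size * size // 4}x}"  # hex string

if __name__ == "__main__":
    print(average_hash("photo.jpg"))  # e.g. 'ffd8c0e0f0e0c080'
```

Under a scheme like this, two visually similar images yield similar bit strings, which is what lets a platform recognize a flagged photo without ever holding a copy of it.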

StopNCII.org, which was developed in consultation with 50 global partners specializing in image-based abuse, online safety and women’s rights, will not have access to or store copies of the original images. Instead, the images will be converted to hashes in users’ browsers, and StopNCII.org will receive only the hashes.
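The privacy claim in that design is concrete: the image bytes never leave the user’s device, and only the fingerprint is transmitted. Below is a minimal sketch of that flow, with Python’s standard hashlib standing in for whatever fingerprinting the site’s browser code performs; the submission payload shape is hypothetical.

```python
# Sketch of the client-side flow: hash locally, transmit only the digest.
# SHA-256 here is a stand-in, and the payload shape is hypothetical.
import hashlib
import json

def fingerprint_locally(path: str) -> str:
    """Read the file on the user's own device and return only its hash."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Only the hash -- never the image itself -- would be submitted.
submission = json.dumps({"hash": fingerprint_locally("photo.jpg")})
print(submission)
```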

Other large platforms have expressed an interest in joining the initiative, including social media companies, adult sites and message boards, Mortimer said, although they are not yet ready to announce their participation in the program.

“Having one system open to all parts of industry is critical,” she said. “We know this material doesn’t just get shared on one platform and it needs a much more joined-up approach.”

Each participating company uses hash-matching technology to check whether images matching the submitted hashes have been uploaded to its platform. When a match is detected, content moderators on the platform review the image to confirm that it violates their policies and that the tool has not been misused by someone submitting a non-violating image. If a platform determines that an image or video violates its policies, it deletes all instances of it and blocks attempts to re-upload it.
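A sketch of what that platform-side check could look like, assuming exact-match hashing (consistent with the limitation Meta describes later in this article); the hash bank contents and the moderation queue are hypothetical, not Meta’s systems.

```python
# Sketch of the platform-side check, assuming exact-match hashing.
# The hash bank contents and the moderation queue are hypothetical.
NCII_HASH_BANK: set[str] = {
    "ffd8c0e0f0e0c080",  # hashes received from StopNCII.org
}
moderation_queue: list[str] = []

def screen_upload(upload_id: str, upload_hash: str) -> bool:
    """Return True if the upload matched the hash bank and was routed
    to human review instead of being published directly."""
    if upload_hash in NCII_HASH_BANK:
        # A match is not an automatic takedown: the article notes that
        # human moderators confirm the image actually violates policy.
        moderation_queue.append(upload_id)
        return True
    return False
```

Keeping a confirmed hash in the bank is also what blocks re-uploads: every later copy of the same image matches again and never reaches publication.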

In Facebook’s 2017 pilot, images were reviewed by human moderators at the point of submission before being converted into hashes, an approach that drew criticism in media reports over privacy concerns.

Since the 2017 pilot, Facebook has developed further systems to combat non-consensual intimate images (NCII) on the platform, NBC News reported in November 2019. However, at the time, the company’s head of safety, Antigone Davis, said more collaboration with other technology companies was needed to prevent people intent on sharing intimate images without the subjects’ consent from simply moving to another platform.

“It’s important that Facebook and industry recognize that they can’t be at the front of this,” Mortimer said, noting that the technology industry’s reputation when it comes to “privacy and people’s data is not what they’d like it to be.”

“A hash bank needs to be held independently in a neutral space, and we have a lot of public trust and long-standing track record of helping people affected by the sharing of intimate images without consent,” she said.

Mortimer said she applauded the fact that Meta recognized the problem and was “prepared to put in the funding and expertise” to create the tool and then “step away.”

People who submit material through StopNCII.org can also track their cases in real time and withdraw their participation at any point.

“You can see that your hashes have been uploaded and when there’s been a match with a partner platform,” Mortimer said. “Sometimes clients may want the images taken down but don’t want to know the numbers, but others need to obsessively know about every removal.”

Meta and the Revenge Porn Helpline recognized that people could abuse the system by submitting non-violating images just to get pictures taken down. But Mortimer said the review process by human content moderators should mean that non-violating images would remain on the platform.

Nain, of Meta, also flagged that the system can detect only exact image matches, so if people have “very egregious intentions,” they can make small changes to images or videos to evade detection, creating a frustrating game of whack-a-mole for the subjects of the images.
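That limitation is easy to demonstrate with an exact, byte-level hash: a one-byte edit to a file yields a completely different digest, so a trivially altered copy slips past an exact-match check. A minimal demonstration, using placeholder bytes in place of a real image:

```python
# Demonstrates why exact-match hashing is brittle: flipping one byte
# of the file produces an entirely different SHA-256 digest.
import hashlib

original = b"...image bytes..."    # stand-in for a real photo file
tweaked = original[:-1] + b"\x00"  # one-byte edit, visually negligible

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())  # shares nothing with the first
# Perceptual hashes, which tolerate small edits, narrow this gap, but
# determined re-encoding, cropping, or overlays can still defeat them.
```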

“There’s a lot we will have to watch. How can we make the system even more victim-centric? Our work doesn’t end here,” Nain said. 

Although the tool is supported by 50 global partners, it will be available only in English at launch.