
They appeared in deepfake porn videos without their consent. Few laws protect them.

Deceptively manipulated pornography used the likenesses of Twitch stars without their consent, and now they're calling for more to be done.
Photo illustration: a collage of a woman's hair, hand, lips and eyes, with glitching technology effects and cursors. The number of deepfake pornographic videos available online has increased sharply, nearly doubling every year since 2018, according to research conducted by Genevieve Oh, a livestreaming analyst. (Justine Goode / NBC News / Getty Images)

Artificial intelligence-generated pornography featuring the faces of nonconsenting women is becoming more pervasive online, and the issue is spilling into the world of popular influencers and streamers.

In January, the British livestreamer “Sweet Anita,” who has 1.9 million followers on Twitch, where she posts videos of her gaming and interacting with followers, was notified that a trove of fake sexually explicit videos featuring the faces of Twitch streamers was circulating online.

Her first thought was: “Wait, am I on this?”

She quickly Googled her name alongside the term “deepfake,” a word used to describe a highly realistic but fake, digitally manipulated video or image, and a technique that is increasingly being used — typically without consent — for pornographic purposes. Anita’s initial search brought up several videos that showed her face edited onto another person’s body.

“This has obviously been going on for quite a while without my knowledge, I had no idea — it could have been years for all I know,” said Anita, 32, who did not want to share her full name with NBC News out of concerns for her safety and privacy offline.

Hany Farid, a professor of computer science at the University of California, Berkeley, said deepfakes are a phenomenon that is “absolutely getting worse” as it’s become easier to produce sophisticated and realistic video through automated apps and websites.

The number of deepfake pornographic videos available online has increased sharply, nearly doubling each year since 2018, according to research conducted by Genevieve Oh, a livestreaming analyst. In 2018, just 1,897 videos had been uploaded to a well-known deepfake streaming site; by 2022, that number had grown to more than 13,000, with over 16 million monthly views.


Previously, celebrities were primarily the targets of deepfakes.

“Now suddenly the people who are vulnerable are people who have very small footprints online,” said Farid. “The technology is getting so good that it can generate images from relatively small training sets, not these hours and hours of video that we used to need.”

Anyone interested in creating deepfakes can quickly access a plethora of free and paid face-swapping apps in the Google Play and Apple App stores, making it easy to upload a photo and edit it onto another photo or video within seconds.

Some major platforms like Reddit, Facebook, TikTok and Twitter have attempted to address the spread of deepfake porn with policy changes. While each of the platforms specifically prohibits the material, some have struggled to moderate it. A search of Twitter, for instance, found deepfake pornographic videos claiming to feature Twitch stars, along with hashtags promoting deepfakes.  

In January, the proliferation of deepfake pornography made waves online when a popular Twitch streamer with more than 300,000 followers admitted to paying for explicit material featuring AI-generated versions of his peers.

On Jan. 30, in a tearful apology video that was reshared on Twitter and gained millions of views, Brandon Ewing — who uses the screen name “Atrioc” on Twitch — said he clicked on an ad for deepfake pornography while browsing a popular porn website. He said that after becoming “morbidly curious,” he went on to subscribe and pay for content on a different website that showed other female streamers.

In a longer statement posted on Twitter on Feb. 1, Ewing directly addressed Twitch livestreamers Maya Higa and Pokimane, whose likenesses briefly appeared during one of his livestreams in a tab for a website that hosts deepfake pornography.

“Your names were dragged into it and you were sexualized against your will,” he said. “I’m sorry my actions have led to further exploitation of you and your body, and I’m sorry your experience is not uncommon.”

Ewing did not respond to a request for comment.

Pokimane also did not respond to a request for comment, but in a Jan. 31 tweet she wrote, "stop sexualizing people without their consent. that’s it, that’s the tweet."

Higa said she had no further comments to make beyond her Twitter statement on Jan. 31, in which she wrote, in part, the “situation makes me feel disgusting, vulnerable, nauseous, and violated -- and all of these feelings are far too familiar to me.” 

The incident highlighted the growing prevalence of nonconsensual AI-generated pornography and the ethical problems it creates.

There has been an “uptick” in websites that are “willing, eager and monetizing the hosting of this material,” Farid said. 

QTCinderella, another Twitch streamer who found out she had been featured on the deepfake website, said she found it particularly hurtful because Ewing is a close friend. 

“I think that’s what was most unfortunate: I didn’t find out from Atrioc. I found out from the internet talking about it,” said QTCinderella, 28, who also did not share her full name with NBC News in order to protect her privacy and safety offline.

She said she quickly tracked down the video content to an account on a subscription-based website and issued a takedown notice, but the videos continue to spread like “wildfire.” 

In the United States, while the majority of states have laws that ban revenge porn, only New York, Virginia, Georgia and California have laws that specifically address deepfake media, according to the Cyber Civil Rights Initiative. Meanwhile, the United Kingdom announced in November 2022 that it was planning to criminalize explicit nonconsensual deepfake media.

QTCinderella said the current legal framework is “disheartening.”

“Every single lawyer I’ve talked to has essentially come to the conclusion that we don’t have a case; there’s no way to sue the guy.”

While a lot of deepfake pornography can look amateur and low-quality, Farid said he’s now also seeing accounts offering to create sophisticated custom deepfakes of any woman for a small fee.

After seeing the deepfake videos that were being sold of her online, Anita said she felt numb, tired and disassociated.

“I’m being sold against my will,” she said. “I didn’t consent to being sexualized.”

QTCinderella said she experienced “body dysmorphia.”

“When you see a porn star’s body so perfectly grafted onto where yours should be, it’s the most obvious game of comparisons that you could ever have in your life,” she said. “I cried and was like ‘my body will never be like that.’”

Sophie Compton, who campaigns against intimate image abuse with the organization My Image My Choice, said women who are targeted are “shamed or silenced” and feel their experience is minimized because there are few legal options available to those affected by deepfakes.

“We need to find a way to make these sites and their business model impossible,” Compton said.

Specific platforms that host nonconsensual sexual imagery need to be held accountable, rather than individual accounts and creators, Farid said. “If you really want to tackle this problem, go upstream,” he said. “That’s where all the power is.”

Anita said she wants there to be “very visible consequences.”

What unsettles her the most going forward is that it’s impossible to know who bought the fake videos.

“When I go to a meet-and-greet I could end up hugging and signing something for somebody who’s watched me be deepfaked … and I’d have no way to know that they’re consuming that,” she said. “That they’d buy my body against my will is just all really, really horrible.”