
Teen Marvel star speaks out about sexually explicit deepfakes: ‘Why is this allowed?’

Xochitl Gomez, 17, said her family has struggled to get the deepfakes taken off X.
Xochitl Gomez at the premiere of "Wonka" in Westwood, Calif., on Dec. 10, 2023. Richard Shotwell / Invision/AP file

A 17-year-old Marvel star and “Dancing With the Stars” performer, Xochitl Gomez, spoke out about finding nonconsensual sexually explicit deepfakes with her face on social media and not being able to get the material taken down. 

During a Jan. 10 episode of “The Squeeze,” a podcast hosted by actor Taylor Lautner and his wife, Taylor (Dome) Lautner, Gomez said she saw sexually explicit deepfakes of her on Twitter (now called X). Gomez, who plays America Chavez in “Doctor Strange in the Multiverse of Madness,” said she asked her mother about the material and learned that her team had already tried and failed to get it removed.

“It made me weirded out and I didn’t like it and I wanted it taken down. That was my main thought process was, ‘Down. Take this down. Please,’” Gomez said during the podcast. “It wasn’t because I felt like it was invading my privacy, more just like it wasn’t a good look for me. This has nothing to do with me. And yet it’s on here with my face.” 

In a search Friday, NBC News was able to easily find several deepfakes of Gomez on X. A representative for X did not immediately respond to a request for comment.

NBC News reported in June 2023 that nonconsensual deepfakes of young female social media stars were circulating on X despite the platform’s rules against nonconsensual nudity. After NBC News reached out to X, some but not all of the material was removed.

Gomez joins a chorus of girls and women, some famous and some not, who have spoken up about the growing crisis of nonconsensual sexually explicit deepfakes. Typically, these deepfakes use artificial intelligence to graft the victim’s face onto a pornographic image or video. Search engines like Google and Microsoft’s Bing host such material in top image search results for prominent women’s names plus the word “deepfakes,” while links to websites that monetize the material appear in top web results. Both Google and Microsoft’s Bing offer takedown request forms that let nonconsensual deepfake victims and their representatives ask for the material to be removed from search results.

“It’s just weird to think if someone looked up my name, that’s also what pops up, too,” Gomez said during the podcast. “You can’t take them down.”

There is currently no federal legislation in the U.S. addressing nonconsensual sexually explicit deepfakes and only a patchwork of state laws pertaining to deepfakes, but a federal bill that would criminalize the nonconsensual sharing of the material is awaiting further action.

Gomez’s podcast episode was clipped and posted last week on X, where the segment about deepfakes was viewed more than 7 million times, according to X’s metrics.

“Why is it so hard to take down? That was my whole thought on it, was, ‘Why is this allowed?’” Gomez said. “In my mind, I knew that it wasn’t me, so it didn’t mess with me or anything like that. It was just something that felt really uncomfortable because I couldn’t take it down.”

“Nothing good comes from thinking about it,” she added. “I put my phone down [...] I do some skincare, I go hang out with my friends, something that will help me forget what I just saw.”