Deepfake bill would open door for victims to sue creators

Sens. Dick Durbin, Lindsey Graham and Josh Hawley plan to introduce the Disrupt Explicit Forged Images and Non-Consensual Edits Act on Tuesday.
Senate Judiciary Committee Chair Dick Durbin, D-Ill., speaks during a meeting on Capitol Hill on May 18, 2023. Francis Chung / Politico via AP file

A bipartisan group of three senators is looking to give victims of sexually explicit deepfake images a way to hold the people who create and distribute those images responsible.

Sens. Dick Durbin, D-Ill.; Lindsey Graham, R-S.C.; and Josh Hawley, R-Mo., plan to introduce the Disrupt Explicit Forged Images and Non-Consensual Edits Act on Tuesday, a day ahead of a Senate Judiciary Committee hearing on internet safety with CEOs from Meta, X, Snap and other companies. Durbin chairs the panel, while Graham is the committee’s top Republican.

Victims would be able to sue people involved in creating and distributing such images if those people knew, or recklessly disregarded, that the victim did not consent to the material. The bill would classify such material as a “digital forgery” and create a 10-year statute of limitations.

“The volume of deepfake content available online is increasing exponentially as the technology used to create it has become more accessible to the public,” Durbin’s office said in a news release. “The laws have not kept up with the spread of this abusive content.”

In the release, the senators noted that Taylor Swift had recently become a victim of such deepfakes, which spread across Elon Musk’s X and later Instagram and Facebook.

“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit deepfakes is very real,” Durbin said. “Victims have lost their jobs, and may suffer ongoing depression or anxiety.”

Nonconsensual sexually explicit deepfakes use AI technology to create original fake imagery, “undress” real photos, and “face-swap” people into pornographic videos. Victims are overwhelmingly women and girls. 

“Nobody, neither celebrities nor ordinary Americans, should ever have to find themselves featured in AI pornography,” Hawley said. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court. This bill will make that a reality.”

Washington has taken notice, though no major legislation has been passed. In May 2023, Rep. Joe Morelle, D-N.Y., introduced the Preventing Deepfakes of Intimate Images Act, which would criminalize the nonconsensual production and sharing of AI-generated sexually explicit material.