
Police investigate AI-generated nude photos of students in Beverly Hills

Reports from a middle school allege that students were using AI tools to create and share fake nude photos of their classmates.

Beverly Hills, California, police are investigating reports that students made fake nude photos of their classmates at a middle school, a city official said Wednesday. 

Deputy City Manager Keith Sterling said the department is investigating students at Beverly Vista Middle School who authorities say used artificial intelligence tools to create the images and share them with other students.  

School officials were made aware of the “AI-generated nude photos” last week, Sterling said in a letter to parents.

Students and parents told NBC News they were afraid to go to school or send their children to school after the incident, which follows a string of similar AI-generated nude photo cases at schools around the world. The emergence of sophisticated and accessible apps that “undress” or “nudify” photos, along with “face-swap” tools that superimpose victims’ faces onto pornographic content, has led to an explosion of nonconsensual sexually explicit deepfakes that predominantly target women and girls.

Security guards at Beverly Vista Middle School in Beverly Hills, Calif., on Monday. Jason Armond / Los Angeles Times via Getty Images

Mary Anne Franks, president of the Cyber Civil Rights Initiative and a professor at George Washington University Law School, told NBC News that AI-generated nude photos of students could be illegal depending on the facts of a case and what the images depict.

For example, Franks said, a criminal case could involve sexual harassment, or the material could be considered child sexual abuse material (CSAM, a term experts and advocates favor over “child pornography”). Not all nude photos of children, AI-generated or not, fall under the legal definition of CSAM, but some do, including some AI-generated depictions. To be illegal, a depiction must show sexually explicit conduct, a higher bar than nudity alone.

“We do have federal and other prohibitions against certain depictions of actual children’s faces or other parts of their bodies that are mixed in with other things,” Franks said. “Depending on the factual circumstances, there could be behavior that rises to the level of harassment or stalking.”