The Beverly Hills Police Department has opened an investigation into reports of students making fake nude photos of their classmates at a middle school, a city spokesperson said Wednesday.

Keith Sterling, the deputy city manager of Beverly Hills, said the department is investigating students at Beverly Vista Middle School who authorities say used artificial intelligence tools to create the images and then shared them with other students.

School officials were made aware of the “AI-generated nude photos” of students last week, the district superintendent said in a letter to parents.

Students and parents told NBC News they were afraid to go to school or send their children to school after the incident, which follows a string of similar AI-generated nude photo cases at high schools around the world. The emergence of sophisticated and accessible apps and programs that “undress” or “nudify” photos, as well as “face-swap” tools that superimpose victims’ faces onto pornographic content, has led to an explosion of nonconsensual sexually explicit deepfakes that predominantly target women and girls.


Security guards at Beverly Vista Middle School in Beverly Hills, Calif., on Monday. (Jason Armond / Los Angeles Times via Getty Images)

Mary Anne Franks, the president of the Cyber Civil Rights Initiative and a professor at George Washington University Law School, previously told NBC News that the AI-generated nude photos of students could be illegal depending on the facts of the case and what the images depict. 

For example, Franks said, a criminal case could involve sexual harassment, or the material could be considered child sexual abuse material (CSAM, a term experts and advocates favor over child pornography). Not all nude photos of children, whether AI-generated or not, fall under the legal definition of CSAM — but some do, including some AI-generated depictions. For depictions to be illegal, they must show sexually explicit conduct, which is a higher bar than nudity alone.

“We do have federal and other prohibitions against certain depictions of actual children’s faces or other parts of their bodies that are mixed in with other things,” Franks said. “Depending on the factual circumstances, there could be behavior that rises to the level of harassment or stalking.”
