Florida teens suspended for spreading AI-generated nudes of classmates
Two students were suspended from a Miami high school last week for creating and spreading AI-generated nude images of dozens of classmates, leaving the victims traumatized, according to reports.
Administrators at Pinecrest Cove Preparatory Academy suspended two boys for making images so disturbing that several victims did not want to return to class.
The perpetrators obtained headshots of the students — both male and female — from the school’s social media account and used an AI deepfake app to create the nude images.
Those explicit shots were shared among students on social media.
One parent said her daughter is hesitant to return to school out of humiliation and fear.
“She’s been crying,” parent Vanessa Posso told CBS. “She hasn’t been eating. She’s just been mentally unstable. She does cheer and she didn’t even want to come to school to do it.”
The offending students were suspended for 10 days from the Florida charter school, but some parents want them booted permanently.
“Our daughters do not feel comfortable returning to school with these boys in the same hallways,” said parent Nadia Kahn-Roberts, according to the outlet.
Deepfake nudes of unsuspecting students are plaguing districts across the nation, with administrators struggling to stem the disturbing trend.
More than 30 female students at New Jersey’s Westfield High School fell victim to the practice in October, when they learned that manufactured images of them were in wide circulation.
“I didn’t know how complex and scary AI technology is,” Francesca Mani, 15, told NBC. “I was shocked because me and the other girls were betrayed by our classmates.”
Distinguishing between authentic images and AI-powered fakes has become increasingly difficult — and victims often have limited legal recourse, the network reported.
Some of the pictures feature graphic sexual activity and often fuel school bullying and taunting.
New Jersey lawmakers have introduced a bill that would make deepfake pornography illegal and subject violators to criminal and civil penalties.
Texas, Minnesota and New York have passed legislation criminalizing nonconsensual deepfake pornography.
California and Illinois have laws allowing victims to sue perpetrators for civil damages.
But law enforcement officials concede that curbing the problem will be difficult, and that countless deepfake nudes are likely already circulating across the internet.