US News

AI-generated nude images of girls at NJ high school trigger police probe: ‘I am terrified’

AI-generated pornographic images of female students at a New Jersey high school were circulated by male classmates, sparking parent uproar and a police investigation, according to a report.

Students at Westfield High School — located in Westfield, a town about 25 miles west of Manhattan where the average household income is $259,377, according to Forbes — told the Wall Street Journal that one or more classmates used an online AI-backed tool to create the racy images and then shared them with peers.

A mother whose daughter is a student at Westfield High School, recounting what her child told her to the Journal, said sophomore boys at the school were acting “weird” on Monday, Oct. 16.

Multiple girls started asking questions, and finally, on Oct. 20, one boy revealed what all the whispering was about: At least one student had used girls’ photos found online to create the fake nudes and then shared them with other boys in group chats, per the Journal.

Several female students were also reportedly told by school administrators that boys had identified them in the fake pornographic images, parents said, though a spokesperson for the high school declined to tell the Journal whether staff members had seen the photos.

Fake nude images of female students at Westfield High School in NJ were being circulated among male classmates. Some of the school’s staff members even told girls that they were in the images, according to the Wall Street Journal. Google Street View
The Westfield Police Department is reportedly looking into the deepfake images. At least one student was involved in generating the images, though it wasn’t clear what AI-backed tool they used. AFP via Getty Images

Another parent, Dorota Mani, said her 14-year-old daughter Francesca was told by the school that her photo was used to generate a fake nude image, known as a “deepfake.”

“I am terrified by how this is going to surface and when. My daughter has a bright future and no one can guarantee this won’t impact her professionally, academically or socially,” Mani told the Journal.

The concerned mother said she doesn’t want her daughter in school with anyone who created the images, and confirmed that she filed a police report.

According to visual threat intelligence company Sensity, more than 90% of deepfake images are pornographic.

Many also use celebrities’ likenesses, such as a recent viral video in which an AI-generated deepfake showed supermodel Bella Hadid, whose father is Palestinian, expressing support for Israel. Earlier this year, deepfake images of Pope Francis in a Balenciaga puffer jacket and Donald Trump resisting arrest also took the internet by storm.

Snap, the company behind Snapchat, has taken steps to ban such images of minors and report them to the National Center for Missing and Exploited Children, according to the Journal, though there are close to zero safeguards in place to stop this from happening elsewhere on the internet.

It wasn’t immediately clear which AI website was used to create the pornographic images, though there are many free AI-backed image generators on the internet, including OpenAI’s Dall-E, Adobe’s Firefly and Canva, as well as a slew of lesser-known tools such as Freepik, Wepik, Craiyon and Fotor, just to name a few.

Deepfakes have become increasingly realistic. This one of Donald Trump resisting arrest went viral earlier this year. Twitter / Eliot Higgins

Other girls’ parents who spoke to the Journal — including two of the four who filed reports with local police — said they and their daughters hadn’t seen the images in question.

A person familiar with the police investigation said the police haven’t seen the photos either, the Journal reported.

Also on Oct. 20, Westfield High School Principal Mary Asfendis confirmed the incident to the parents of each of the school’s roughly 1,900 students after girls reported the photos to school administrators.

Girls at Westfield High School learned of the fake nude images after many of their male classmates were acting “weird” and whispering last month. The school confirmed the incident in an email to parents. Getty Images

Asfendis also said in the email obtained by the Journal that she believed the images had been deleted and were no longer being circulated.

“This is a very serious incident,” Asfendis penned. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions can cause to others.”

She also vowed to continue teaching children about responsible technology use.

Deepfake images of Pope Francis in a Balenciaga puffer jacket also went viral earlier this year. TikTok/@vince19visuals

It wasn’t immediately clear how many students were involved in creating the fake nude images, or if any disciplinary action had been taken.

“To be in a situation where you see young girls traumatized at a vulnerable stage of their lives is hard to witness,” Westfield Mayor Shelley Brindle told the Journal of the incident, adding that as the town’s first female mayor, she considers herself an advocate for women and girls.

The Post has sought comment from Westfield High School and the Westfield Police Department.

Just this week, President Biden issued a sweeping executive order that regulates the development of AI, and implements “safeguards against … producing child sexual abuse material and against producing non-consensual intimate imagery of real individuals.”

State officials in Virginia, California, Minnesota and New York have outlawed the distribution of AI-generated porn or given victims the right to sue in civil court.