Photo: Rich Graessle/Icon Sportswire via Getty Images
A high school in Westfield, New Jersey, launched an investigation after a student allegedly created AI-generated nudes of girls at the school. Sophomore boys were sharing the images in group chats, The Wall Street Journal reports. But on October 20, one of them apparently tipped off his classmates, who escalated the incident to school administrators.
“My daughter texted me, ‘Mom, naked pictures of me are being distributed.’ That’s it. Heading to the principal’s office,” one mom, Dorota Mani, told CBS. She went on to say that her daughter, who is 14, “started crying, and then she was walking in the hallways and seeing other girls of Westfield High School crying.”
According to the Journal, which spoke with a number of the girls’ parents, one student (or possibly more) took original photos of the girls and ran them through an artificial-intelligence tool to create fake nudes, also known as deep fakes. The school’s principal, Mary Asfendis, reportedly acknowledged in an email to parents that the school believes “any created images have been deleted and are not being circulated,” though “there was a great deal of concern about who had images created of them.” A person familiar with the police investigation told the Journal that police, who are also looking into the matter, haven’t seen the images and believe they were deleted.
But at a time when inappropriate — if not necessarily illegal — uses of AI are rampant, parents and students aren’t sure that those investigations offer enough protection for victims or consequences for perpetrators. After all, there’s ample evidence to suggest that women and teens are particularly vulnerable to misuse of AI. Sophie Maddocks, a researcher at the University of Pennsylvania, recently told the Washington Post that these days, AI is “very much targeting girls,” even those “who aren’t in the public eye.” Additionally, a 2019 study by Sensity AI, a company that tracks deep fakes, found that 96 percent of deep-fake images are pornographic and nearly all of them target women.
Without a federal law banning deep-fake porn, some states have begun creating their own legislation, which can be hard to enforce. Some states allow only civil suits, while others allow criminal charges, but as the Washington Post points out, it’s often hard to know whom to sue. Mani, whose daughter was affected by the Westfield deep fakes, said that she filed a police report but is still worried about how this could affect her daughter going forward. “No one can guarantee this won’t impact her professionally, academically, or socially,” she told the Journal. “I am terrified by how this is going to surface and when.” And in the meantime, her daughter still has to attend school with whoever created the photos in the first place.