She was sitting in the lunchroom when a friend approached her.
“Your nudes have been leaked,” the friend said.
But the 16-year-old student had never sent nudes. Instead, deepfake porn images of her were circulating at her high school.
In her next class, she asked another friend, “Have you heard anything about my nude images?”
The friend said yes: at a party, someone had shown her a naked photo that turned out to be the 16-year-old.
Her parents told USA TODAY that the deepfake scandal began several months earlier, in December 2024, when their daughter had a falling out with a friend. Rumors spread at a school in Pennsylvania’s New Hope-Solebury School District that their daughter was “sending nudes.” But the parents said they had no idea explicit photos had been made and distributed to students until the exchange in the cafeteria in February 2025.
The impact on the student and her family has been devastating. Once outgoing and cheerful, the now 18-year-old has become reclusive, afraid to go to grocery stores or pharmacies for fear that other customers may have seen her deepfake images, her parents said. The parents and their attorney, Matthew Faranda-Diedrich, a partner at the law firm Royer Cooper Cohen Braunfeld, claim the school failed to properly investigate the deepfake scandal or provide support in the aftermath. They are working with Faranda-Diedrich to file a lawsuit against the school district. The family requested anonymity to protect their daughter’s privacy.
An attorney for the school district and a representative for the high school did not respond to USA TODAY’s requests for comment.
The school district’s attorney sent a letter to Faranda-Diedrich in March 2026, which USA TODAY reviewed. The letter states that a Title IX investigation conducted by the school found no evidence that the perpetrators accused of sharing the photos “actually spread the deepfake nudes or rumors thereof.”
The parents withdrew their daughter from the public school after what they described as persistent bullying. They are proud of the progress she’s made at her new private school — she skipped her former school’s junior prom and will attend senior prom with new friends — but they say the deepfake scandal still weighs heavily on the entire family.
They are not the only ones facing the consequences of deepfake abuse. A growing number of schools across the country are grappling with the rise of deepfakes and “denuding” applications, often leaving parents in the dark, according to AI experts, lawyers and affected families. For victims of deepfake abuse, the psychological toll can be severe and long-term. They often feel alone or misunderstood when seeking support.
Deepfake AI porn scandal shocks small town
A small town in Pennsylvania was rocked by an AI deepfake porn scandal. As these platforms evolve, school policies and legal measures lag far behind.
When the parents of the 18-year-old tell their friends about their experience, most people react with shock.
“People still don’t realize that this can happen to anyone,” her father says. “We live in a very resource-rich school district, we live in a very resource-rich region of the world. If it can happen here, it can happen anywhere.”
Here’s what other parents need to know about deepfake abuse and how experts are helping schools overcome this growing crisis.
“It’s a fake photo, but it’s real.”
The first time parents hear about porn deepfakes should not be when their children encounter or are victimized by this type of media.
“The school should be up to date on these types of events that are happening,” the mother says. “The community needs support, because schools are also dealing with other things, like bullying.”
In 2024, the Center for Democracy and Technology surveyed 3,170 K-12 students, teachers and parents to gauge the prevalence of AI deepfakes in schools and how prepared schools are to respond to incidents.
The survey found that 6 in 10 teachers were not aware of their school’s policies and procedures for dealing with real and deepfake sexual images, and only 16% said their school’s teacher training covered how to protect the privacy of students depicted in deepfakes. Only 13% of students reported that their school told them that sharing this type of AI-generated media was harmful to the person depicted.
Evan Harris, a national expert on emerging AI risks in schools at Pathos Consulting Group, said schools already have a lot on their plate and are struggling to keep up with the technology. Harris has conducted thousands of webinars to help schools address this growing crisis, worked in the field to workshop policies with a variety of schools, and contributed to a free educational program co-led by deepfake abuse victim Elliston Berry and security awareness training platform Adaptive Security.
“A big part of my job is to make all of this feel manageable, because when people are in that fight-or-flight mode, it’s hard to get started on anything concrete or actionable,” Harris explains. He begins by establishing three buckets: policies, crisis plans, and preventive education for faculty, parents, and students.
The father of a Pennsylvania student says it’s a “shared responsibility” between schools, parents and students themselves. “It takes a village.”
Her mother also wants students to complete educational training that emphasizes “the human impact of doing this to individuals.”
“(My daughter) loved school and loved the town,” her father said. “And eventually she had to leave school and was afraid to go out on the street.”
“It’s a fake photo, but it’s real,” the mother says.
Don’t wait to tell your kids about deepfakes
Experts say traditional parental advice and health warnings (“don’t send nude photos”) are no longer enough. Explicit images can now be created and shared of anyone, even someone who has never taken one.
The Pennsylvania student’s parents say it is essential to maintain an open dialogue with your children.
“Kids who are going through situations like this need to be heard. They need to be believed,” her father says.
“Once we knew what was going on, we were able to help her,” he continued. It required some tough conversations, but he said it helped keep their daughter from falling further into a “dark place.”
Harris said schools and parents need to establish a “culture of anti-judgment” by modeling positive responses while educating stakeholders about deepfake abuse before a crisis occurs.
“(Students) need to hear the message that they are attending a school where if they come forward, they will be supported, believed and not judged,” he says. “The number one enemy here is shame. That’s what we’re trying to counter.”
Parents and experts call for stronger reporting laws
The Pennsylvania student’s parents said the school took too long to address multiple complaints about what was happening to their daughter, and did not submit a ChildLine report — the state’s system for flagging possible child abuse — until May 2025.
They want schools to put in place stronger protections and reporting practices to prevent deepfake abuse from occurring in the first place and to better support students when it occurs.
“It’s a lack of understanding,” Faranda-Diedrich says. “If someone were to come in and say that child pornography is rampant in the school, there’s no way the school would respond that way.”
State Sen. Tracy Pennycuick is co-sponsoring a bill to reform Pennsylvania’s AI child pornography law and hopes the state’s bipartisan effort will spread across the country. Pennycuick also supports another Pennsylvania bill that would strengthen reporting requirements for mandatory reporters, including teachers and school administrators.
Pennycuick said she wanted to leave “no ambiguity” about what action mandatory reporters should take, adding: “If we suspect there is any child sexual abuse material, we will report it.”
The Pennsylvania student’s parents want to protect other families from this nightmare.
“Schools need to trust the students who come forward and recognize that this is a new world we live in,” the mother says. “If you hear something, say something. I’m around kids. I hear their friends talking. Start a conversation with your kids about it. It’s uncomfortable, but it’s real these days.”
This article was supported by a grant from the Tarbell Center for AI Journalism. Funders do not provide editorial input.