1 in 8 Teens Know Someone Targeted by Deepfake Nudes, New Report Finds
March 3, 2025
4 Minute Read
Despite rising awareness—31% of teens are familiar with deepfake nudes—many remain unclear on the legal and emotional consequences.
LOS ANGELES, March 3, 2025 — New research from Thorn, a nonprofit that builds technology to defend children from sexual abuse and exploitation, reveals deepfake nudes are already violating young people’s privacy and safety. In fact, nearly 1 in 3 teens surveyed have heard of the term “deepfake nudes,” and 1 in 8 reported personally knowing someone who has been victimized by them.
Thorn’s new report, Deepfake Nudes & Young People: Navigating a New Frontier in Technology-Facilitated Nonconsensual Sexual Abuse and Exploitation, surveyed 1,200 young people (ages 13-20) to uncover how deepfake nudes are rapidly transforming from a fringe concern into a mainstream threat. Unlike earlier forms of manipulated imagery, today’s deepfake technology allows anyone to create hyper-realistic, explicit content in seconds—no technical skill required.
“No child should wake up to find their face attached to an explicit image circulating online—but for too many young people, this is now a reality,” said Melissa Stroebel, Vice President of Research and Insights at Thorn. “With lower barriers to creation and easily accessible apps, the dangers of deepfake nudes continue to escalate, with effects we’re only beginning to measure. This research confirms the important role tech companies play in designing and deploying technology conscious of the risks of abuse, while also underscoring the need to educate young people and their communities on how to address this kind of digital abuse and exploitation, and for technology-driven solutions that can stem the creation and spread of deepfake nudes.”
Other key findings of the report include:
- Teens overwhelmingly recognize deepfake nudes as harmful. A substantial 41% of teens said deepfake nudes cause harm to the person in them. They cite emotional distress (30%), reputational damage (29%), and deception (26%) as top concerns.
- Yet, misconceptions persist—some still believe these images are “not real.” 16% of teens either dismissed the threat outright or believed it was context-dependent. Among those who saw no harm, the most common justification (28%) was based on a perception that the imagery was “not real,” revealing a dangerous misconception that overlooks the emotional and psychological toll on victims.
- Some young people are making deepfake nudes, and the tools they use to make them are easily accessible. Among the small percentage of young people who admitted to creating deepfake nudes (2%), most indicated they learned about the tools through app stores, search engines, and/or social media platforms.
- Despite recognizing harm, victims often suffer in silence. Nearly two-thirds (62%) of non-victims said they would tell a parent if it happened to them—but in reality, only 34% of victims did.
- Among young people, uncertainty about the legality of deepfake nudes persists. While most respondents recognized that creating deepfake nudes of anyone is illegal, 1 in 5 teens (20%) believed it was legal to create deepfake nudes of someone else, including another minor (someone under the age of 18).
The report is the first in Thorn’s upcoming research series examining emerging online risks to youth – including sextortion, commodified sexual experiences, and more – to better understand how current technologies create or exacerbate child safety vulnerabilities and to identify areas where solutions are needed.
Among those solutions are resources that Thorn has developed for parents and young people to help them better understand and navigate online harms. Thorn for Parents helps caregivers facilitate judgment-free conversations about digital safety. Thorn’s Navigating Deepfake Nudes resource guide provides actionable tips for parents on the topic.
Read the full report: Deepfake Nudes & Young People: Navigating a New Frontier in Technology-Facilitated Nonconsensual Sexual Abuse and Exploitation.
About Thorn
Thorn is a nonprofit that builds technology to defend children from sexual abuse. Founded in 2012, the organization creates products and programs to empower the platforms and people who have the ability to defend children. Thorn’s tools have helped the tech industry detect and report millions of child sexual abuse files on the open web, connected investigators and NGOs with critical information to help them solve cases faster and remove children from harm, and provided parents and youth with digital safety resources to prevent abuse. To learn more about Thorn’s mission to defend children from sexual abuse, visit thorn.org.