

Deepfake Nudes & Young People: Navigating a New Frontier in Technology-facilitated Nonconsensual Sexual Abuse and Exploitation

Drawing on a survey of 1,200 young people aged 13-20, this research explores their awareness of deepfake nudes, lived experiences with them, and their involvement in creating such content.

March 3, 2025

30 Minute Read

Key Findings
  1. Young people overwhelmingly recognize deepfake nudes as a form of technology-facilitated abuse that harms the person depicted.
  2. Deepfake nudes already represent real experiences that young people have to navigate.
  3. The limited sample of young people who admit to creating deepfake nudes of others describe easy access to deepfake technologies.

Introduction

In 2023, Thorn’s annual youth monitoring research surveyed minors (aged 9-17) about their awareness of peers using AI tools to generate nudes of other children. Among the 1,040 respondents, 11% reported believing their friends or classmates had engaged in this behavior, while an additional 10% selected “prefer not to say.”

This research sheds light on young people’s experiences with deepfake nudes, exploring their awareness, perceptions, and experiences with this growing threat. To read the complete research report and explore its findings in greater detail, please download the full report.

An evolution in image-based abuse

Deepfake nudes represent a stark evolution in image-based sexual abuse, driven by the rapid rise and accessibility of generative artificial intelligence (AI).

Deepfake nudes: synthetic media creations that depict real people in sexually suggestive or explicit situations or activities

These synthetic media creations (e.g., images, videos, audio) depict real people in sexually suggestive or explicit situations or activities. When the victim is a minor, these images may be more formally described as “AI-Generated CSAM” (AIG-CSAM).

Rapid & scalable deepfake nudes

While photo manipulation tactics have long been used to create or alter child sexual abuse imagery, earlier methods were typically crude — requiring significant time, skill, and technical expertise to produce photorealistic outcomes.

Generative AI has fundamentally changed this dynamic, enabling the rapid and scalable creation of highly realistic content with minimal skill or effort.

Generative AI has enabled the rapid, scalable creation of highly realistic explicit imagery with minimal effort

Unlike traditional forms of photo manipulation software, deepfake technologies uniquely empower the creation of explicit content targeting anyone — regardless of consent, prior digital behaviors (e.g., sharing intimate imagery), or even knowledge of the content’s existence.

Creators

Adults with sexual interests in children may use AI to fabricate abusive images of children both known and unknown to them, while others exploit deepfakes for sextortion, threatening to release fabricated images unless the victim complies with their demands.

Some young people themselves may also create deepfake nudes of their peers out of sexual curiosity, peer pressure, or as a form of bullying.

Risks & challenges

For those working in child safety, deepfake technologies introduce significant new risks and challenges in the fight against technology-facilitated child sexual abuse, many of which undermine the ecosystem’s ability to respond to and prevent abuse.

Some of these new challenges represent technical hurdles and are inextricably tied to the technologies’ ease of access, lack of unified implementation of available safeguards (e.g., safety-by-design principles), and capacity to produce highly photorealistic content at scale.

Deepfake technologies introduce significant new risks in the fight against technology-facilitated child sexual abuse

Other new challenges reflect systemic issues. While progress has been made to advance trauma-informed responses to victim disclosures of sexual abuse and exploitation, significant barriers persist, including shame, humiliation, fear of not being believed, and self-blame.

The growing prevalence of deepfake nudes, especially in youth spaces like schools and social media, underscores the urgency for immediate and systemic interventions alongside technical ones.

Intimate imagery & young people

For young people, technology can intersect with sexual exploration and romance, with many reporting experiences of sharing nudes with a romantic partner or visiting websites containing adult material (such as dating or pornography sites).

These behaviors can quickly intersect with significant risks, such as nonconsensual image abuse, blackmail, harassment, and bullying — often leading to serious personal, emotional, and social consequences that can be difficult, if not impossible, to resolve once the images circulate.

Deepfake nude technologies add new risks to an already complex landscape by enabling the rapid creation of realistic explicit imagery of anyone. This ease of misuse not only amplifies harm but also risks normalizing the production and distribution of nonconsensual intimate content.

Exploring adult apps

Despite age restrictions, a substantial proportion of teens (aged 13-17) reported accessing platforms exclusively designed for adult users (aged 18 or older).

Chart: Dating app and pornography site usage rates among minors (participants aged 13-17).

QApps. Do you use any of the following dating or adult apps?

“Any dating app” includes participant selections for Bumble, Grindr, Hinge, Tagged, Tinder, or Other. Percentages reflect the net share of participants who selected “Yes, I currently use this” or “I do not currently use this, but have in the past.”

Approximately 1 in 5 (22%) teens reported using a dating app, 1 in 10 (10%) had accessed OnlyFans, and nearly 1 in 4 (23%) had visited pornography sites.

Sharing sexual imagery

Young people were asked about their experiences sharing sexual imagery of themselves while under 18, with 1 in 6 (18%) reporting they had. Among teens, 1 in 8 (13%) reported having shared sexual imagery of themselves with others.

AMONG ALL PARTICIPANTS

1 in 6 young people have shared sexual imagery of themselves with others while under the age of 18

Past research has established that while many report sharing the imagery with someone they know offline, they also report sharing the content with other users who they only know online. That research also established that minors who share their nude imagery report doing so with other people across a spectrum of ages, including other minors, adults, and people whose ages they don’t know.

Online solicitations

Roughly 1 in 3 young people (36%) reported being asked by someone they had met online to share sexual imagery of themselves while they were under the age of 18.

LGBTQ+ respondents (53%) and women and girls (41%) were the most likely to report having this experience. Rates of the experience appeared to increase with age.

Perceptions

Awareness

Overall, 41% of young people indicated they had heard of the term deepfake nudes, including nearly 1 in 3 teens (31%). Men and boys were more likely to have heard the term than women and girls (+9%). LGBTQ+ young people were also more likely to have heard the term than their non-LGBTQ+ peers (+20%).

AMONG PARTICIPANTS AGED 13-17

Nearly 1 in 3 teens has heard of ‘deepfake nudes’

Charts: Awareness of deepfake nudes, shown among all respondents and among participants aged 13-17.

QD1. Have you ever heard of the term “deepfake nudes”?

*Base size <100

AMONG ALL PARTICIPANTS

1 in 8 young people personally know someone who has been the target of deepfake nudes while under the age of 18

Young people reported that their peers were encountering a mixture of experiences with deepfake nudes as minors, both being targeted with abusive images and being involved in creating or redistributing those images.

Around 1 in 8 young people indicated that they personally knew someone who, while under 18, had been the target of deepfake nudes (13%) and/or someone who had used the technologies to create or redistribute deepfake nudes of minors (12%).

In total, 1 in 6 (17%) respondents indicated they personally knew someone who had encountered at least one of the following experiences: 

  1. Had a deepfake nude made of them by someone else
  2. Discovered a deepfake nude of them was being (re)shared
  3. Created a deepfake nude of themselves
  4. Created a deepfake nude of a minor
  5. Reshared a deepfake nude of a minor
AMONG ALL PARTICIPANTS

1 in 8 young people know someone who has used the technology to create or distribute deepfake nudes of others while under the age of 18

While the comparative base size was small, LGBTQ+ teens were much more likely than their peers to indicate that they knew someone impacted; 18% reported knowing someone targeted by the technologies, and 23% reported knowing someone who used the technologies to create or distribute the content of others.

It is almost like the new generation of nudes being leaked. They affect someone the same way as normal ones.

19, FEMALE, MULTI-RACIAL, SOUTH

Everyone will see it, they will be embarrassed and it will never go away.

13, MALE, WHITE, MIDWEST

Perception of harm

Overwhelmingly, young people (84%) indicated they believed the content harmed the person depicted in deepfake nudes.

Chart: Perceived harm for victims of deepfake nudes.

QD14. Do you think that deepfake nudes of real people cause harm to the person shown in the photo or video?

*Base size <100

Among those who believed deepfake nudes were harmful, the top three reasons they identified were:

  1. The emotional and psychological impacts experienced by the victim(s) (31%)
  2. The potential for reputational damage of the victim(s) (30%)
  3. The inability of viewers to recognize that the deepfake content is not authentic (25%)

By comparison, around 1 in 6 (16%) young people believed deepfake nudes either were not harmful to the person depicted or that it depended on the situation.

Among young people who did not think deepfake nudes caused harm to the victims, the two leading reasons were that they saw the imagery as “fake” and/or “not real” (28%) or that no physical harm was involved (7%).

You control what offends you. Of course it’s wrong to make deepfake nudes but ultimately it’s fake.

13, MALE, MULTI-RACIAL, SOUTH

…as soon as everyone knows it’s a deepfake, all feelings of panic and fear are gone. It’s not actually you, so there’s no pressure. It’s a little stressful but it’s not actually their body.

16, FEMALE, WHITE, MIDWEST

When the young people who thought deepfake nudes were either harmless or situationally dependent were asked what would make them more likely to recognize harm, the most influential factor was whether viewers believed the deepfakes were authentic (40%).

One-third said they would be more likely to think deepfake nudes were harmful if the images couldn’t be removed from the internet. Similarly, 29% said sharing or distributing the content would make it more harmful — whether shared with people who knew the person in the image or posted on digital platforms.

Perception of legality

Young people were presented with three different scenarios involving a minor creating deepfake nudes: of themselves, of someone else under the age of 18, and of someone else aged 18 or older.

In total, most young people recognized the behavior as illegal across all three scenarios.

Young people were slightly more likely to believe that a minor creating an image of themselves was legal (23%) than a minor creating an image of someone else (19%), regardless of the other person’s age. Some thought legality depended on the situation involved.

Lived experiences

AMONG PARTICIPANTS AGED 13-17

1 in 17 teens reported having deepfake nudes created of them

1 in 17 (6%) respondents reported they had been the target of someone using technology to create deepfake nudes of them, including 1 in 17 teens.

This experience was most prevalent among younger teen boys and young adult women, with 1 in 10 (10%) in each group reporting this form of victimization.

Responding to victimization

Among young people who had experienced deepfake nude victimization (whether as a minor or a young adult at the time), the majority (84%) sought some form of support, either by using online safety tools or by seeking offline support.

Chart: Responses to deepfake nude victimization, among respondents who’ve had deepfake nudes created of them by someone else (n=82*).

QD13. You indicated that to the best of your knowledge, someone has created deepfake nudes of you. Did you do any of the following in response to learning someone had created deepfake nudes of you?

The question was multiple select. “Did not seek support” represents the net percentage of respondents who only selected “ignored it” or “unsure”; “sought support” represents the net percentage of respondents who selected any other response. *Base size <100

Around 1 in 6 (16%) deepfake nude victims indicated they did not seek support at all — either because they ignored it or were unsure in their response. This rate is similar to rates observed in previous Thorn research looking at minors’ responses to potentially harmful online sexual interactions.

Among young people who’ve been victimized by deepfake nudes, over half (60%) used at least one online tool, such as platform-based blocking or reporting features, as part of their response, and more than half (57%) sought at least one form of offline support, by telling someone about what had happened to them, whether it be a parent, school authority, friends, or police. While the sample size was small, nearly half (48%) of teens who had deepfake nudes created of them indicated they told their parents/guardians or a trusted family member.

Blocking technologies were the most commonly utilized online tool, with nearly half (48%) of victims choosing to block the offending user, compared with 35% who reported the user. 

In fact, blocking the other user emerged as the most frequently reported response type, followed by reporting the user and telling a parent or other trusted family member (34%). Among teens, blocking the other user (49%) and telling parents (48%) were the most reported response types.

Response versus anticipated response

As with other online harms, there was a notable gap between the anticipated responses of nonvictims of deepfake nudes and the actual responses of those who had been victimized.

Those who had not personally experienced deepfake nude abuse were most likely to think they would tell a parent or trusted adult (62%), while only 1 in 3 (34%) victims of deepfake nude abuse did. This gap was also seen among teens: 72% anticipated they would tell a parent or trusted adult, while only 48% of teen victims did.

Other discrepancies between anticipated response and actual response were also noted, including a decreased likelihood for victims to report the offender to the police (-32%) or to the platform where the content is being shared (-22%). 

Alternatively, victims were more inclined to ignore the experience (+12%) or confide in online friends (+8%) when compared to nonvictims’ anticipated responses.

Deepfake nude creators

Among young people surveyed, 2% indicated they had used technology to create deepfake nudes of another person.

AMONG ALL PARTICIPANTS

2% of young people reported they have used technology to create deepfake nudes of someone else

While the sample size was small (n = 24), it offers an initial snapshot into the practices and motivations of young people involved in creating deepfakes.

The data on young deepfake creators explored in this section should be viewed as directional, underscoring the urgent need for more nimble and comprehensive research on young people’s role in perpetrating this abuse.

Victims targeted

Among the subsample, deepfake creators most often reported having created deepfake nude imagery of an adult (62%), with roughly 1 in 3 (36%) indicating they had made deepfake nude imagery of a minor.

Creators of deepfake nudes also overwhelmingly said they created content depicting females (74%).

Motivations

When asked about their motivations, the portion of young people who have created deepfake nudes of others described varied and individualized reasons, including sexual curiosity, pleasure-seeking, revenge, or creating the content as a result of pressure or influence from friends.

To get revenge on them for bullying me.

14, MALE, WHITE, NORTHEAST

Just to see what it would look like. Curiosity basically.

15, MALE, HISPANIC OR LATINX, MIDWEST

Technologies used

When asked how they learned about the websites, platforms, or apps they used to create the content, deepfake nude creators described multiple, highly accessible pathways: social media platforms (71%), such as Facebook, Instagram, Snapchat, TikTok, and YouTube; search engines (53%), such as Google and Bing; and direct links (25%) shared with them by someone else.

Deepfake nude creators were also asked how they accessed the technologies they used to create deepfake nudes.

Most (70%) creators said they downloaded the app they used from their device’s app store (e.g., Apple’s App Store or Google’s Play Store).

AMONG ALL PARTICIPANTS

70% of young people who created deepfake nudes of others downloaded the apps through their device’s app store

Roughly one in three creators (30%) indicated they didn’t have to download the technology they used to create the content.

When asked to identify the websites, platforms, or apps they used, deepfake nude creators described several technologies involved in some way in creating the content.

Sometimes, they identified specific apps publicly advertised as creating generative AI content (e.g., DeepFaceLab, AnimeGenius).

In other cases, respondents identified general-purpose technologies for image editing (e.g., Photoshop, Canva) or image sharing more broadly (e.g., Snapchat, OnlyFans, PornHub).

The responses from deepfake nude creators about the technologies they used highlight how creating deepfake nudes often involves more than just the actual generative AI technology or model — it involves sourcing benign images to seed the deepfake and leveraging platforms to distribute the resulting content.

Sharing the deepfakes they created

Deepfake nude creators were also asked if they had shared the deepfake nude content they made of someone else. 

65% of self-reported deepfake nude creators said they had shared their content in some capacity

1 in 3 admitted to sharing the content with peers at their school (30%).

1 in 4 creators (27%) reported they never shared the content with anyone else.

That a portion of creators never shared the content with others, potentially implying it was created solely for their personal consumption, reinforces that reported victimization rates likely underestimate the prevalence of this experience. Not only do victims face challenges to disclosure, but many victims may not even know the content exists.

Looking ahead

While more research is necessary to fully understand the impacts of deepfake nudes on young people, the findings in this report highlight immediate opportunities to strengthen intervention strategies that can prevent these harms and better mitigate their effects when they occur.

Young people overwhelmingly recognize deepfake nudes as a form of technology-facilitated abuse that harms the person depicted.

RECOMMENDATION: Align societal messaging around the harm of deepfake abuse.

The message is loud and clear: deepfake nudes are harmful. Young people recognize the tangible emotional, psychological, and social harms inflicted by deepfake nudes, and they emphasize that the violation lies not in how the content is created but in its existence and consumption. Moreover, young people highlight how the inability of viewers to discern whether the content is “real” further contributes to victims’ feelings of shame, fear, and loss of autonomy.

By aligning societal messaging with the reality young people already recognize, prevention- and intervention-focused education, campaigns, and support systems can begin to address deepfake abuse with the urgency and empathy it requires. This clarity is essential for ensuring victims receive meaningful support, discouraging the continued evolution of nonconsensual behaviors, and laying the groundwork for systemic action to mitigate this abuse.

Deepfake nudes already represent real experiences that young people have to navigate.

RECOMMENDATION: Establish and socialize community responses for deepfake nude experiences.

The high level of familiarity young people have with deepfake nudes, coupled with the significant number reporting personal connections to this harm, signals how quickly this form of technology-facilitated abuse has infiltrated youth culture.

Without immediate intervention, deepfake nudes risk becoming an entrenched digital threat young people must endure. While societal messaging provides a critical foundation for acknowledging harm, the responsibility for raising awareness, establishing policies, and supporting victims must be shared across caregivers, youth-serving organizations, and schools.

To meet this challenge, youth-serving organizations and schools must be equipped with clear, actionable protocols for recognizing harm, documenting incidents, and connecting victims to critical resources, such as psychological support, content removal tools, legal aid, and helplines. 

By working together, schools and youth-serving organizations can not only provide immediate support for deepfake nude disclosure and recovery but also normalize seeking help for other nonconsensual harms.

The limited sample of young people who admit to creating deepfake nudes of others describe easy access to deepfake technologies.

RECOMMENDATION: Implement technical safeguards to minimize and prevent misuse.

Young people who admit to creating deepfake nudes describe simple and straightforward access to the technologies through app stores, search engines, and social media platforms. Addressing this interconnected network of accessibility is crucial to curbing misuse and preventing harm.

By voluntarily adopting responsible AI principles to mitigate the misuse of generative AI models and engaging with external experts to assess human rights and child safety implications, technology companies can immediately reduce the misuse of generative AI technologies, set a precedent for responsible innovation, and build momentum for long-term regulatory frameworks. Without these interventions, the spread of deepfake nudes will continue to outpace efforts to control the misuse of generative AI for these purposes, further normalizing the creation and distribution of nonconsensual content at scale. Limiting the accessibility of tools designed or misused for abuse is not about restricting innovation but about ensuring these technologies are developed, deployed, and used responsibly.

Methodology

Research design

This research focused on young people aged 13-20 in the United States.

Research methods were designed to identify respondents’ perceptions and experiences related to deepfake nude imagery.

PHASE 1: EXPLORATORY INTERVIEWS WITH SUBJECT MATTER EXPERTS

The first phase of this research was dedicated to exploratory information gathering to help orient and frame the subsequent focus of the more in-depth survey instrument.

In total, 16 subject matter experts from across the child safety ecosystem were identified and consulted during this phase.

PHASE 2: QUANTITATIVE ONLINE SURVEY

In total, 1,200 young people from across the United States participated in an 18-minute online survey from September 27, 2024, to October 7, 2024.

Download the full report for more details on methodology, mitigations, and results.

Suggested citation

Thorn. (2025). Deepfake Nudes & Young People: Navigating a new frontier in technology-facilitated nonconsensual sexual abuse and exploitation. https://info.thorn.org/hubfs/Research/Thorn_DeepfakeNudes&YoungPeople_Mar2025.pdf


Download Full Report

Resources

If you or someone you know has been a victim of deepfake nudes, resources are available for immediate support. It is never your fault and you are never alone. Some important resources are provided below:

If the person targeted is under 18

If the person targeted is under 18, report the material to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline. You can also submit the content to NCMEC’s Take It Down service, which helps remove the content from the internet.

If the person targeted is 18 or over

If the person targeted is 18 or over, you can submit the content to StopNCII.org, a project operated by the Revenge Porn Helpline and dedicated to supporting take-down efforts.

Additional resources and information on deepfake nudes

The Cyber Civil Rights Initiative (CCRI) also offers an image-abuse helpline (844.878.2274) in addition to a step-by-step guide for what to do if you find yourself the victim of deepfake nudes.

If you are a parent or guardian and are interested in talking with the young people in your life about deepfake nudes, check out the following resource to support that dialogue: Navigating Deepfake Nudes: A Guide to Talking to Your Child About Digital Safety (Thorn).
