
REPORT: 1 in 10 Minors Say Peers Have Used AI to Generate Nudes of Other Kids

August 14, 2024

3 Minute Read

Monitoring research from Thorn, now in its fifth year, offers a snapshot of the evolving forms of sexual abuse facing kids growing up in the digital age.  

LOS ANGELES — August 14, 2024 — New research from Thorn, a nonprofit that builds technology to defend children from sexual abuse, reveals that roughly 1 in 10 minors report knowing of cases where friends or classmates have created synthetic non-consensual intimate images (or “deepfake nudes”) of other kids using generative AI tools. This marks an evolution in how child sexual abuse material (CSAM) is produced and shared, and signals the advent of a new weapon in bullying.

The research, Youth Perspectives on Online Safety, 2023, now in its fifth year, surveyed 1,040 minors aged 9-17 from across the United States. The survey is one of the few globally that works directly with young people to learn about the experiences, challenges, and risks they face in online spaces today.

“The fact that 1 in 10 minors report their peers are using AI to generate nudes of other kids is alarming and highlights how quickly online risks are evolving,” said Julie Cordua, CEO of Thorn. “This emerging form of abuse creates a potent threat to children’s safety and well-being. We must act swiftly to develop safeguards, educate young people about the dangers of deepfakes, and empower parents to have open conversations with their children about these risks. We can better protect kids in an ever-changing digital landscape by staying ahead of these technological shifts.”

Other key findings of the report include: 

  • More than half (59%) of minors report having had a potentially harmful online experience. That includes more than 1 in 3 minors who say they’ve had an online sexual interaction, and 1 in 5 preteens (9-12-year-olds) who report having an online sexual interaction with someone they believed to be an adult.

  • Roughly 1 in 17 minors report having personally experienced sextortion. In these situations, the extorter threatens to leak explicit imagery depicting the victim if they do not comply with demands, such as returning to (or entering into) a relationship; engaging in sexual acts; sharing explicit imagery of themselves, peers, or siblings; and paying the extorter money. This finding adds further context to Thorn’s Trends in Financial Sextortion report.

  • In 2023, the top platforms where the most minors say they’ve had an online sexual experience were Snapchat (16%), Instagram (14%), Messenger (13%), Facebook (12%), and TikTok (11%). Minor platform users were most likely to indicate they had experienced an online sexual interaction on Omegle (36%), Kik (23%), Snapchat (23%), Telegram (22%), and Instagram (20%).

  • Young people view platforms as an important part of helping them avoid and defend against threats. In 2023, 1 in 6 minors who experienced an online sexual interaction did not disclose their experience to anyone. For those who did choose to take action, platforms served as the first line of defense: minors who had an online sexual interaction were nearly twice as likely to use platform safety tools, such as blocking and reporting, as to turn to offline support networks such as family or friends.

Thorn offers several resources for parents and caregivers to address online risks. Thorn for Parents helps caregivers understand the dangers their children face online and facilitates judgment-free conversations about digital safety. NoFiltr offers resources, advice, and knowledge to help young people navigate complicated topics and risky online experiences. 

Read the full report here.


About Thorn

Thorn is a nonprofit that builds technology to defend children from sexual abuse. Founded in 2012, the organization creates products and programs to empower the platforms and people who have the ability to defend children. Thorn’s tools have helped the tech industry detect and report millions of child sexual abuse files on the open web, connected investigators and NGOs with critical information to help them solve cases faster and remove children from harm, and provided parents and youth with digital safety resources to prevent abuse. To learn more about Thorn’s mission to defend children from sexual abuse, visit thorn.org.
