
The state of the issue

 

Since 2019, Thorn has conducted annual research initiatives to better understand the experiences, challenges, and risks young people face in online spaces today. This work serves to: 

Offer a snapshot of the current state of the issue through young people’s eyes

Showcase emerging trends rooted in an evolving digital landscape

Elevate areas where further exploration and discussion are needed

Metrics at a glance

Online sexual interactions

Young people are navigating sexual interactions, such as requests for nudes and intimate text messages, in their online experiences. While some of these experiences are consensual, the risk of harm rises when the interactions involve someone outside their age group, are unsolicited, or are non-consensual.

1 in 3 minors aged 9-17 reported they have had a sexual interaction online, including 1 in 5 9-12-year-olds.


Sharing of “nudes”

The decision to share nudes can be the result of sexual exploration and flirtation with a partner. However, some youth are being coerced into sharing through manipulation and deceit.

1 in 7 9-17-year-olds report they have shared nudes, including 1 in 6 teens.


Perceptions of normalcy

While the majority of 9-17-year-olds surveyed have not shared a nude image of themselves, many believe it is normal behavior among kids their age.

1 in 4 minors agreed that it’s normal for people their age to share nudes with each other.


Nonconsensual resharing

Even when a young person believes they are sharing intimate images with someone they trust, too often that trust is violated and the images are leaked. Unfortunately, the blame often falls on the victims for having shared their images in the first place.

1 in 5 9-17-year-olds report having been shown or sent someone else’s nudes without their consent, including 1 in 8 9-12-year-olds.


Responding to online risks

Minors who experienced an online sexual interaction engaged with both online tools and offline support, but many still choose to handle things alone.

1 in 6 minors faced with an online sexual interaction did not disclose their experience to anyone.


What we’re monitoring

Young people anticipate confronting risky situations online, and they view platforms as an important part of helping them avoid threats and defend against those that do arise.

Online spaces provide valuable opportunities for young people to explore and connect, but they also pose very real risks. For many young people, potentially harmful online experiences have become an inevitable component of their digital lives. The majority of minors (59%) report they have had a potentially harmful online experience — a sustained increase from 2020 — and 1 in 3 report they have had an online sexual interaction.

Still, 1 in 6 minors who experienced an online sexual interaction did not disclose their experience to anyone. Several factors might contribute to a minor’s decision not to take action when faced with a harmful online experience, and the increasing perceived normalcy of these encounters is one of them. Each year, when asked why they did not disclose their harmful online experience to anyone, the leading reason was that they “felt [it] was not a big deal”.

In the face of these risks, young people continue to look to online platforms to help them avoid and respond to potentially harmful online experiences, particularly those of a sexual nature. This opportunity and responsibility should not be neglected.

For kids who do choose to take action when navigating a sexual interaction online, platform safety tools such as blocking and reporting are the first line of defense. Minors consistently prefer these online safety tools over offline support networks such as family or friends. Among these tools, young people strongly prefer blocking over reporting. While blocking can disrupt continued harassment for one minor, it does not trigger the content moderation protocols associated with reporting, which could result in removing a bad actor from the platform and stopping wider abuse that may be underway. Worryingly, reporting was at its lowest rate since data collection started in 2020.

Despite a decrease in the use of reporting tools, young people remain interested in learning how to better leverage online safety tools to defend against such threats. In fact, roughly half report they want platforms to provide more information about online safety as well as how to report and block people.

Reporting and blocking systems need to be accessible, effective, and trusted if they are to fully serve platform users, including young people. Kids show us that these are the preferred tools in their online safety kit and that they want more guidance from platforms on how to use them. There is a clear opportunity to better support young people through these mechanisms.


Minors are navigating financial threats and offers in some online sexual interactions.

There is increasing worry about emerging commercial components in the production, spread, and weaponization of apparent self-generated child sexual abuse material (SG-CSAM). This is taking different forms across young people’s experiences.

Reports of minors attempting to advertise and sell their own explicit imagery or live streams highlight two risks: minors being approached and offered money or gifts in exchange for their own material, and cases in which a minor has attempted to sell imagery of another minor.

In the 2023 survey, 1 in 8 minors reported they believed a friend had either been given money or gifts by someone they only knew online in exchange for SG-CSAM.

A recent report from Thorn has also highlighted an increase in cases of financial sextortion against minors, in which a perpetrator threatens to leak explicit images of the victim unless they are paid. Sextortion is, sadly, not a new threat for young people; however, the increasing frequency of financially motivated sextortion signals an evolution in tactics, motivations, and offender and victim profiles.

 

In the 2023 survey, 6% of minors reported that someone had threatened to leak an explicit image depicting them if they did not comply with a demand, with teen girls the most likely to have experienced this. Among those who had experienced sextortion, roughly one-quarter reported that the perpetrator demanded money.

These findings point to areas of risk with limited research and limited public discussion. In both cases, additional attention is needed to better understand the pathways leading to these risks, the opportunities for interventions to combat abuse of technology for these purposes, and the best practices for data-driven safeguarding messaging to reduce the harms of these risks on young people.


Generative AI technologies are being used by young people to create nonconsensual explicit imagery (“deepfakes”) of peers.

As has been widely reported both in popular media and by organizations working in the child-safety ecosystem, generative AI tools are being used to create CSAM. This gives perpetrators another tool to create new and custom material, re-victimizing survivors whose original abuse may already be known to investigators and offering a means to create highly realistic abuse imagery from benign sources such as school photos and social media posts.

The use of generative AI in the production of CSAM, whether from inception or in post-abuse modification, does not change the fact that the resulting images are still child sexual abuse material and pose significant harm to the minors depicted in and threatened with them.

In 2023, Thorn’s annual monitoring survey asked minors about their experiences with generative AI being used to create CSAM. One in ten minors reported that their friends or classmates had used AI tools to generate nudes of other kids.

While the motivation behind these events is more likely adolescents acting out than an intent to sexually abuse, the resulting harms to victims are real and should not be minimized in an attempt to wave away responsibility.

Significant work is needed within the tech industry to address the risks posed by generative AI technologies to children. However, that work is not a prerequisite for addressing the behaviors surrounding these harms. While the technology may be new, peer-based bullying, harassment, and abuse are not. It is critical that we speak proactively and directly about the harms caused by “deepfake nudes” and reinforce a clear understanding of what conduct is unacceptable in our communities and schools, regardless of the technology being used.


Today’s digital world is rapidly changing, presenting new and sometimes unexpected challenges for young people as they explore and grow. It’s important for us to continue developing strategies that prioritize creating safe digital environments and implementing effective safeguarding measures. These strategies should be informed by the experiences and perspectives of young people themselves, ensuring they are relevant and impactful in addressing their needs and concerns.

About the survey

This data was collected through an online survey of 1,040 minors from across the United States, conducted from November 3 to December 1, 2023. The sample included 338 9-12-year-olds and 702 13-17-year-olds.

To ensure a representative sample nationwide, data was weighted to age, gender, race, and geography, based on U.S. Census data.
