New Report Finds Sustained Increase in Minors Sharing “Nude Selfies” Since 2019
October 25, 2022
LOS ANGELES, Oct. 25, 2022 /PRNewswire/ — New research from Thorn, a technology nonprofit that builds tools and programs to defend children from online sexual abuse, found that roughly 1 in 5 teenagers report having shared a nude image of themselves. The research, Self-Generated Child Sexual Abuse Material: Youth Attitudes and Experiences in 2021, is the only annual tracking survey of its kind monitoring changes in minors’ behaviors and attitudes related to self-generated child sexual abuse material (SG-CSAM), more commonly known as “sharing nudes” or “sexting.”
SG-CSAM is a complex challenge and represents a variety of experiences, risks, and harms for kids. Some images represent a child who is being groomed and coerced, while others represent a teen innocently “sharing nudes” with a partner. But regardless of how these images originate, they pose a real and serious risk to the child involved when their imagery is coercively solicited by adults or non-consensually re-shared among peers.
Since 2019, Thorn has conducted this annual study, which is based on a survey of over 1,000 minors aged 9-17, to inform its work tackling the challenges of SG-CSAM. The research grounds the critical work of child safety advocates and technologists in facts about minors’ behavior. Importantly, the year-over-year data helps illuminate areas for interventions that equip youth to be safe online.
The research’s key findings include:
- Since 2019, there has been a sustained increase in minors reporting that they have shared their own nude photos with others. In 2021, roughly 1 in 6 minors reported sharing their own SG-CSAM – a more than 60% increase from the 2019 survey. This includes 1 in 7 pre-teenagers and 1 in 5 teenagers.
- When compared with findings from 2019, more preteens and teenagers believe their friends non-consensually re-share nudes. In 2021, the perceived normalcy of re-sharing SG-CSAM increased, with 1 in 6 minors responding that their friends sometimes non-consensually re-shared someone else’s nudes. This figure has steadily increased every year since the start of the study.
- Boys in particular continue to demonstrate heightened risk for sharing nude selfies. Preteen and teenage boys perceive nude-sharing behavior as normal and report a higher likelihood than girls of believing re-sharing others’ content is legal. Since 2019, the number of preteen boys who reported sharing their own nudes doubled, while the number of teenage boys sharing their own nudes nearly tripled.
- Latino and Hispanic youth emerged as a group demonstrating heightened risk related to SG-CSAM compared to other groups. In 2021, Hispanic and Latino minors were more likely than their non-Hispanic and non-Latino peers to share their own nudes, to believe their friends are sharing nudes, and to non-consensually re-share someone else’s content.
The report also sheds light on minors’ activity on different platforms, finding:
- Minors who have shared or re-shared SG-CSAM report notably higher daily usage of social platforms than their peers. While the popularity of certain platforms like YouTube, TikTok, Instagram, and Snapchat was consistent across the entire sample in the research, youth who indicated they had shared, re-shared, or been sent SG-CSAM generally spent more time on these platforms each day. Continued research in this area is essential to clarify the connection between general platform usage and the likelihood of an SG-CSAM-related experience.
- In 2021, gaming platforms saw the greatest increase in popularity among youth. Out of all digital platforms, Minecraft, Fortnite, and Roblox showed the largest gain in the number of minors who have ever used them since 2020.
“These insights help us better understand what kids are experiencing online and, in turn, help parents understand the risks present in digital spaces and how to mitigate them,” said Julie Cordua, CEO of Thorn. “All young people go through phases of exploration and curiosity as a normal and healthy part of development. These findings underscore the need to equip parents with the tools to meet kids where they are while having productive conversations that help keep their kids safe online.”
To get these tools into parents’ hands, Thorn launched Thorn for Parents, a digital resource hub designed to help parents and caregivers have earlier, more frequent, and judgment-free conversations with kids about digital safety. Thorn for Parents brings caregivers face-to-face with the reality that digital safety conversations need to start earlier than they may think, and underscores the importance of having them more often to help guide kids through these difficult topics with understanding, empathy, and support.
Thorn’s full report Self-Generated Child Sexual Abuse Material: Youth Attitudes and Experiences in 2021 can be viewed online here.
Methodology: This research was conducted by Thorn in partnership with Benenson Strategy Group. The survey collected self-reported data from minors aged 9-17. In total, a nationally representative sample of 1,141 minors participated in an 18-minute online survey from October 25 to November 28, 2021. Data was weighted by age, gender, race, and geography, based on US Census data. The 2021 survey also incorporated increased recruitment of minor participants who identified as persons of color. This research represents a continuation of surveys originally conducted in 2019 and 2020.
About Thorn: Thorn is a nonprofit founded in 2012 to build technology to defend children from sexual abuse and eliminate child sexual abuse material from the internet. Thorn creates products that identify child victims faster, provides services that help the tech industry play a proactive role in removing abuse content from their platforms, and works directly with youth and communities to build resilient kids. Learn more about Thorn’s mission to build technology to defend children from sexual abuse at Thorn.org.