As we look ahead to 2024, I’m taking time to reflect on the significant strides Thorn made over the past year in our vital mission to defend children from sexual abuse.

The urgency of our mission is at an all-time high. Our research shows that a majority of young people view the likelihood of adults attempting to befriend and manipulate a minor online (aka grooming) as common. In NCMEC’s 2022 CyberTipline report, we saw a dramatic rise both in online enticement of children for sexual acts (from 37,872 in 2020 to 80,524 in 2022) and in the volume of suspected child sexual abuse material (CSAM) reported to NCMEC (from 20 million in 2017 to 88 million in 2022).

We all have a role to play in creating a world where every child can simply be a kid. And that’s why Thorn continues our work, day in and day out – and why we won’t stop until we achieve that world together.

Here are some of the key themes I’m thinking about as we enter 2024:

1. We Must Continue to Build Safety by Design into New Technologies

As a nonprofit that builds technology, we stay one step ahead of emerging technologies—both to understand the risks they pose and to determine how they can be leveraged for good. 

Artificial intelligence-generated child sexual abuse material (AIG-CSAM) remains top of mind. Our stance is that now is the time for safety by design: AI companies must lead the way to ensure that children are protected as generative AI is not only built but also becomes more sophisticated. Our Head of Data Science, Dr. Rebecca Portnoff, shared more in The New York Times this year.

Collaboration will be crucial to getting in front of this threat. Our robust report, Generative ML and CSAM: Implications and Mitigations, co-authored with our partners at the Stanford Internet Observatory, explores the challenges that AI poses in child sexual exploitation. We will continue to update this report as we learn more about emerging AI threats.

Additionally, Thorn’s consulting arm has been hard at work leading red teaming sessions with AI companies to help implement a core component of safety by design. These sessions stress test generative AI products to identify child safety gaps, edge-case considerations, and unintended consequences related to child sexual abuse and exploitation. Companies that have worked with Thorn have improved the safety and efficacy of their models, managing the risk of child sexual exploitation and CSAM and reducing unsafe responses from the AI.

2. We Must Equip More Platforms to Detect CSAM

All content-hosting platforms must proactively detect CSAM. Behind each CSAM file is a real child; those who have not yet been found are in active abuse, while survivors are revictimized through the circulation of their content. 

Safer, our all-in-one solution for CSAM detection, uses advanced AI technology to detect, review, and report CSAM at scale. To date, Safer has found over 2.9 million files of potential CSAM. Now, Safer Essential can reach an even wider audience with a quicker setup that requires fewer engineering resources.

This year, Thorn will continue to build innovative technology to help platforms advance child safety. We can’t wait to share that work with you.

3. We Must Address Threats to Child Safety with Research and Resources

Sextortion, grooming, and self-generated child sexual abuse material (SG-CSAM) continue to pose considerable risks to child safety. 

Our original research helps our team and partners across the child safety ecosystem gain meaningful insights into youth perspectives, straight from kids themselves. In 2024, we have new research projects planned to delve even deeper into the evolving issues facing youth.

Our prevention programs equip both youth and their caregivers with digital safety resources. NoFiltr, our youth-focused program, reduces stigma and sparks open dialogue among young people while providing applicable safety knowledge and support messaging. And through the Youth Innovation Council, we’re partnering with youth who advise platforms and speak publicly to build the internet they deserve. With Thorn For Parents, we’re equipping caregivers with age-appropriate knowledge, tools, and tips to have judgment-free conversations with their kids. 

4. We Must Shape Policy and Legislation

Thorn regularly participates in legislative discussions among lawmakers, experts, and survivors. To create real change, we have to advocate for effective policy, not only in the U.S. but globally, because child sexual abuse knows no borders.

Recent victories include the UK Online Safety Act, now signed into law, and the Every Image Counts campaign to detect, report, and remove CSAM in the EU.

Advocating for effective policy is key to accomplishing our goal of eradicating CSAM.

5. We Must Build a Philanthropic Community of Support

Our generous community of donors makes our work possible. I hope you’ll consider supporting us to help us make strides toward our mission for years to come.

Together, we are changing the way the world responds to child sexual abuse. Thank you for committing to building a world where every child can be safe, curious, and happy. 

—Julie
