2024 IMPACT AT A GLANCE
700+
Law enforcement agencies using Thorn tools
4,162,256
Potential child sexual abuse material (CSAM) files detected
112.3 billion
Total files processed to detect child sexual abuse material
3,089,478
Lines of text processed to identify child exploitation
60+
Companies using Thorn detection tools
3
Research reports published
Letter from the CEO
Have you ever had an experience that shifted your understanding of something? A moment that pushes you forward while making you reflect back? That’s what 2024 was for Thorn. We made important progress in combating the spread of abuse material and helping investigators find child victims. But we also saw new threats emerge. The rise of financial sextortion and the misuse of generative AI are already having devastating impacts on young people. These threats have made me—and everyone at Thorn—more determined than ever to build a world where every child is free to simply be a kid.
Your support made it possible for us to meet these threats head-on. With your help, we had the resources to research these dangers and lead the response.
- We partnered with leading AI companies to create Safety by Design principles for rapidly growing generative AI technologies.
- We surveyed more than 2,250 young people to understand the risks they face and get a pulse on their online lives.
- We uncovered a troubling rise in financial sextortion, with teen boys increasingly being targeted.
Technology has created new, urgent risks for children, but we also see its potential to be a force for good. In a world where tech can harm kids, we believe it can also protect them. Thanks to your support, we’re using AI to help keep more children safe.
- More investigators are using our victim identification tools, helping them find child victims faster and reducing their exposure to traumatic abuse material.
- Safer Predict helps platforms detect previously unreported child sexual abuse material and spot conversations that may indicate a child being exploited.
Even as we lean into technology, we never lose sight of what matters most: the children. Behind every image or case is a child in need of safety and hope. Our work protects families, communities, and children around the world.
So, how are we responding to this crisis? We’re meeting it with urgency and a comprehensive strategy. We’ve evolved our mission: to transform how children are protected from sexual abuse and exploitation in the digital age.
We’re deepening our focus on youth-centered research, innovative technology, victim identification, and platform safety solutions. Together, these efforts strengthen our ability to identify, disrupt, and ultimately reduce child sexual abuse and exploitation—moving us closer to a world where every child is free to simply be a kid.
Thank you for standing with us and helping us protect children.
Julie Cordua, CEO
KEY INITIATIVES
Finding children faster
36
Countries where Thorn victim identification tools are used
700+
Law enforcement agencies using Thorn tools
Detective Michael Fontenot shares why saving investigative time means everything to a child being sexually abused.
When a child is in an active abuse situation, every moment matters: child victims deserve to be found faster, yet the amount of digital evidence investigators must review to find and help them keeps growing. Locating critical evidence files is like searching for a needle in a haystack; it’s both time-consuming and emotionally taxing for investigators. Now more than ever, child victims need tools that help investigators work smarter, not harder.
Thorn’s CSAM Classifiers are machine learning models that detect suspected child sexual abuse images and videos at speeds far beyond manual review. This helps investigators prioritize the most critical files amid overwhelming case volumes and, ultimately, identify children faster and remove them from harm. These classifiers are now deployed as part of Thorn Detect, our new digital forensics solution for investigators.
The impact of Thorn’s technology isn’t just about efficiency for investigators. It’s about children’s lives. Thorn Detect gives us back time: time to work these cases, time to work more of these cases, time to find more victims, time to stop the abuse sooner. What took us weeks now takes hours.
Det. Michael Fontenot, North Texas Internet Crimes Against Children Task Force (ICAC)
Connecting and moving the entire child safety ecosystem with research & insights
Thorn is a leader and pioneer in research initiatives that deepen our understanding of the experiences and risks young people face online. This work provides powerful insights to online platforms, policy makers, law enforcement, and other organizations that need to respond to emerging threats quickly. In addition to fielding research, we map digital threats to develop safety frameworks for critical technologies that proactively address their potential impacts on child safety.
3
Original research reports published
2,104,725,139
Report media impressions
11
Companies committed to Safety by Design principles
A look back on the research & insights Thorn donors helped make possible:
We deepened our collective understanding of financial sextortion.
Our groundbreaking study, conducted in collaboration with the National Center for Missing & Exploited Children (NCMEC), shows that financial sextortion is sharply on the rise and primarily targets unsuspecting teenage boys. Kids experiencing sextortion often do not see themselves as victims and instead believe the abuse is their fault. Kids facing this type of online abuse can feel shame, fear, hopelessness, and isolation, feelings that perpetrators rely on to carry out their threats.
Read the blog for more insights.
Dive into the research report.
Watch the webinar covering the topic.
Our annual youth monitoring revealed that deepfake nudes were already a real concern.
Since 2019, Thorn’s research initiatives have helped us understand and track youth behaviors, experiences, and risks online.
Last year’s report showed that with generative AI tools at their fingertips, young people are creating deepfake nudes of their peers. While the majority of minors do not believe their classmates are participating in this behavior, roughly 1 in 10 minors reported they knew of cases where their peers had done so. This alerted us to a concerning trend and propelled our latest research on deepfake nudes released in 2025.
We took a proactive look at emerging technologies that could pose a threat to children.
We know that nearly all technological innovation in our digital lives brings opportunities and risks, and it often has safety implications for children. To proactively identify key technologies that stand to significantly impact the fight against child sexual abuse and exploitation in the digital age, Thorn partnered with the WeProtect Global Alliance on the Evolving Technologies Horizon Scan project.
This report examined specific technologies we believe will have an impact on child safety:
- Predictive AI
- Generative AI
- End-to-end Encryption (E2EE)
For a deeper understanding of each, we recommend reading the full report.
In 2024, we spearheaded the adoption of Generative AI Safety by Design principles.
To address growing concerns about generative AI safety, we partnered with leaders in the space to publicly commit to Safety by Design principles. We also published a white paper outlining the child safety risks and threats posed by generative AI and how to mitigate them through the principles of Safety by Design.
Along with this Safety by Design framework, Thorn has also collaborated with the National Institute of Standards and Technology (NIST) and the Institute of Electrical and Electronics Engineers (IEEE) to integrate the principles and recommended mitigations into new and existing industry standards.
Stay up to date with progress reports from participating firms.
Watch a recording of our conversation with Google, OpenAI, and StabilityAI.
See the Safety by Design panel discussion at 2024 TrustCon.
Regardless of whether you’re a parent or not, this issue is incredibly terrifying. We should all support Thorn’s mission to defend children from harm.
Maria, MA
Making the internet safer and reducing revictimization
As images and videos of a child’s abuse continue to circulate online, they perpetuate the victim’s trauma with each share and view. At Thorn, we equip tech platforms with purpose-built solutions and expert consulting to halt the spread of abusive content and reduce revictimization. In 2024, we continued to expand our technological capabilities to include detecting child sexual exploitation within text conversations, helping more partners address the full spectrum of harms affecting children online.
4,162,256
Total suspected CSAM files detected
2,237,225
Files classified as suspected CSAM
1,979,406
Known CSAM files matched
3,089,478
Lines of text processed to identify child exploitation
3,184
Lines of text classified as potential child exploitation
112.3 billion
Total files processed
60+
Companies using Thorn’s CSAM detection products
Key moment
In 2024, we expanded harm detection to text conversations with Safer Predict.
This comprehensive, AI-enabled solution allows content-hosting platforms to proactively detect and report previously unidentified CSAM across images, videos, and text conversations. This expanded scope of detection is critical to identifying children in active abuse situations and to law enforcement’s ability to remove them from harm.
Now, platforms can use Thorn’s child sexual abuse text classification model to add another crucial layer of protection by analyzing conversation context to identify sextortion, access to children, and other exploitation risks before abuse escalates.
This innovative tool leverages Thorn’s advanced machine learning models, trained on data from the NCMEC CyberTipline.
case study
GIPHY gets proactive about CSAM detection.
GIPHY, a platform serving 10+ billion pieces of content to over one billion users daily, previously relied on user reports to identify child sexual abuse material, removing only about 100 CSAM files annually.
After implementing Thorn’s Safer tools in 2021, GIPHY now proactively screens 1-2 million GIFs monthly through both hash matching and AI classification.
The impact has been dramatic: GIPHY has increased CSAM detection and removal by 400%, while user reports of such content have virtually disappeared — with only a single confirmed user report since implementation.
That means less revictimization and fewer people online exposed to this horrific content.
I want my 11 grandchildren and every child to be safe. The only way to end child sexual abuse is if we all work together, and my contribution is donating to Thorn monthly.
Cindy, TN
Empowering parents and youth to prevent abuse
By providing targeted resources that engage both parents and youth, we have empowered more individuals with knowledge and support to navigate digital safety.
NoFiltr
A judgment-free resource where young people can find support, not shame, when navigating uncomfortable online experiences—whether they’re seeking help for themselves or looking for ways to support a friend.
12.3 million
Engagements across NoFiltr social channels
Thorn for Parents
These free resources help parents be their child’s safety net in the digital era. We focus on practical guidance for navigating challenging conversations about digital safety with the young people in their lives.
159,811
Visitors to our parents resource hub
I’m a survivor and I will do everything I can to make sure no child suffers this kind of abuse. Together as Thorn Builders, we are making a difference.
Lisa, NC
Informing policy
65
Policy discussions
12
Technical briefings
8
Speaking engagements
We help policymakers understand the complex intersection of technology, child protection, and emerging threats. Our expertise informs critical legislation in key regions — particularly in the U.S. and across the European Union — ensuring that regulatory frameworks address current challenges and anticipate future risks.
Thorn in Action
Together in 2024, we brought greater visibility to the harms children face in the digital era. Thorn was active in the media, spreading awareness and understanding of urgent threats like sextortion and grooming. Many of our supporters joined us for important conversations that bring sexual abuse and exploitation out of the shadows, standing with survivors as they tell their stories.
Key Moments
Breaking the silence
Pauline Stewart and Lennon Torres joined us to bravely share their stories about how online sexual exploitation changed their lives.
Pauline Stewart lost her son to financial sextortion. She has become a passionate advocate for online safety, using his story to educate parents and children about the crime of sextortion.
Lennon Torres grew up in the spotlight as a professional dancer. It didn’t take long for her online presence to make her a target for bad actors looking to exploit her for sexual abuse material.
Creating conversations through media engagements
Thorn thought leaders engaged with 155 journalists in our issue space to share data and insights and to offer an expert point of view on the harms and threats facing children online.
2024 FINANCIALS
Financial transparency
Your support enables us to make a difference for children every day.
Financial data is unaudited.
Join Us
Protecting children from sexual abuse and exploitation is only possible with your help.
Our generous donors believe in this mission and support our work financially, accelerating the speed of hope for child victims. The Thorn community moves our mission forward; every dollar helps build a world where every child is free to simply be a kid.
We can’t thank our donors enough for empowering a momentous 2024.
Take a stand to protect kids.
📣 Help raise awareness and celebrate your impact.
Access your social kit to share Thorn’s impact with your network.