12 Years of Thorn

November 4, 2024

6 Minute Read

12 years ago this month, Thorn set out with a bold purpose: to build technology to fight the sexual abuse and exploitation of children. It was an ambitious goal, but one we knew was essential.

At the beginning of this journey, we never could have imagined how rapidly the digital landscape would evolve in ways that drastically shape the experience of being a kid. We couldn’t have foreseen the myriad new and complex threats to children that would emerge over the next 12 years.

Who could have predicted, for example, a world where harmful AI-generated child sexual abuse material would be created and begin to spread? Or one in which organized crime rings exploit kids online at a massive scale?

It sounds daunting, and oftentimes, it is – but with your support, we’re fighting back. Here’s a glimpse at how we did just that in our 12th year:

Safety by Design

Starting a movement to address the misuse of generative AI to harm children

This year, we and our partners at All Tech Is Human launched our Safety by Design initiative – an effort that brings together some of the world’s most influential AI leaders in a groundbreaking commitment to protect children from the misuse of generative AI technologies.

As part of the project, Amazon, Anthropic, Civitai, Google, Meta, Metaphysic, Microsoft, Mistral AI, OpenAI, and Stability AI have pledged to adopt Safety by Design principles to guard against the creation and spread of AI-generated child sexual abuse material (AIG-CSAM) and other sexual harms against children.

The companies agreed to transparently publish and share documentation of their progress in implementing these principles, and we’ve begun sharing that transparency reporting on a regular cadence.

By integrating Safety by Design principles into their generative AI technologies and products, these companies are not only protecting children but also leading the charge in ethical AI innovation. And, with a wave of new AI-facilitated threats to children, the commitments come not a moment too soon.

Deepening our knowledge of urgent threats to children 

With so many kids growing up online – forming friendships, playing games, and connecting with one another – we must recognize both the benefits and the very real risks of the digital era. By understanding the threats children face online, we can develop systems to protect them against the harms introduced by rapidly advancing technologies. That’s why we continue to conduct and share original research that drives child safety solutions, informs the technology we build, and equips everyone who has a stake in protecting children with the powerful knowledge they need to make informed, tangible change.

This year, we released two key studies:

  • Financial sextortion report: In collaboration with the National Center for Missing and Exploited Children (NCMEC), we explored the rise in financial sextortion targeting teenage boys, revealing that 812 sextortion reports are filed with NCMEC each week, most involving financial demands.
  • Youth monitoring report: Our annual report now spans five years, tracking youth behaviors and highlighting emerging risks, such as the increasing use of deepfake technology by minors.

These studies were widely covered in the media and utilized by our partners, helping to raise awareness and inform strategies designed to defend children across the ecosystem.

Getting our tech into more investigators’ hands

In the fight against child sexual abuse, law enforcement officers face daunting challenges, not least of which is the overwhelming task of sifting through digital evidence.

Getting technology like our CSAM Classifier into the hands of as many law enforcement agencies as possible is critical. To help, this year Thorn announced our partnership with Griffeye, the Sweden-based world leader in digital media forensics for child sexual abuse investigations. Now, Thorn’s CSAM Classifier is available directly in Griffeye Analyze, a platform used as a home base by law enforcement worldwide. Through this partnership, we’re expanding our impact by providing law enforcement with better tools that create a stronger, more unified, and more resilient front against child sexual abuse. 

Building technology to detect child sexual exploitation in text conversations online

This year, Thorn launched a groundbreaking advancement in our mission to protect children online: Safer Predict.

By leveraging state-of-the-art machine learning models, Safer Predict empowers platforms to cast a wider net for CSAM and child sexual exploitation detection; identify text-based harms, including discussions of sextortion, self-generated CSAM, and potential offline exploitation; and scale detection capabilities efficiently. By putting AI to work for good, this new technology enhances our ability to defend children from sexual abuse by detecting harmful conversations and potential sexual exploitation.
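
To make the idea of text-based detection a little more concrete, here is a deliberately simplified sketch of the general pattern such systems follow – train a classifier on labeled conversation snippets, score new messages for risk, and route high-scoring ones to human reviewers. This toy example is purely illustrative and is not Thorn’s Safer Predict models or API; every message, label, and threshold in it is hypothetical.

```python
# Illustrative toy only – NOT Thorn's Safer Predict, whose models and API are not shown here.
# Pattern: train a text classifier on labeled examples, score new messages,
# and route anything above a risk threshold to human moderators.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical hand-labeled snippets (1 = potentially harmful, 0 = benign chat).
train_texts = [
    "send me money or I share your photos",    # extortion-style demand
    "pay now or everyone sees the pictures",   # extortion-style demand
    "want to play the new game tonight?",      # ordinary conversation
    "did you finish the homework yet?",        # ordinary conversation
]
train_labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: the simplest possible stand-in
# for the far larger machine learning models a real platform would use.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

def risk_score(message: str) -> float:
    """Return the model's estimated probability that a message is harmful (0.0 to 1.0)."""
    return model.predict_proba([message])[0][1]

# Messages scoring above a platform-chosen threshold would be escalated for human review.
print(f"risk score: {risk_score('pay me or I post your pictures'):.2f}")
```

Real deployments, of course, depend on far larger models, carefully curated training data, and expert review rather than a handful of toy examples – but the core loop of scoring conversations and surfacing the riskiest ones is the same.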

Expanding our protection efforts with tech companies

This year, we’ve expanded the reach of our detection technology by partnering with more technology companies committed to fighting child sexual exploitation alongside us. Adoption of our technology across even more platforms enables faster and more accurate detection of harmful content. These collaborations not only amplify our impact but also create a stronger, collective defense against the evolving threats children face every day in their online lives.

Looking ahead

As we celebrate 12 years of driving technological innovation for child safety, we’re excited for what lies ahead. Next year, we aim to harness this collective power even further, advancing our technology and empowering more partners to protect children. But to do so, we need the generosity of those who believe in our mission. With you by our side, we’re confident that together we can protect even more children and build a safer digital world for all.

Want to support Thorn? Make a donation now to help us do even more to protect children in the coming year.


