Today, the internet is Safer.

April 28, 2025

4 Minute Read

At Thorn, we’re dedicated to building cutting-edge technology to defend children from sexual abuse. Key to this mission is our child sexual abuse material (CSAM) and child sexual exploitation (CSE) detection solution, Safer, which allows tech platforms to find and report CSAM and text-based harms on their platforms. In 2024, more companies than ever deployed Safer on their platforms. This widespread commitment to child safety is key to building a safer internet and using technology as a force for good.

Safer’s 2024 Impact

Though Safer’s community of customers spans a wide range of industries, they all host content uploaded by their users or handle text inputs in generative engines and messaging features.

Safer empowers their teams to detect, review, and report CSAM and text-based child sexual exploitation at scale. The scope of this detection is critical. It means their content moderators and trust and safety teams can find CSAM amid the millions of content files uploaded and flag potential exploitation amid the millions of messages shared. This efficiency saves teams time and lets them act faster. Just as importantly, Safer allows teams to report CSAM or instances of online enticement to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is critical for child victim identification.

Safer’s customers rely on our predictive artificial intelligence and a comprehensive hash database to help them find CSAM and potential exploitation. With their help, we’re making strides toward reducing online sexual harms against children and creating a safer internet.

Total files processed

In 2024, Safer processed 112.3 billion files input by our customers. Today, the Safer community comprises more than 60 platforms, with millions of users sharing an incredible amount of content daily. This represents a substantial foundation for the important work of preventing repeated and viral sharing of CSAM online.

Total potential CSAM files detected

Safer detected just under 2,000,000 images and videos of known CSAM in 2024. This means Safer matched the files’ hashes to verified hash values from trusted sources, identifying them as CSAM. A hash is like a digital fingerprint, and matching hashes allows Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
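
For readers curious about the mechanics, here is a minimal sketch of the hash-matching concept in Python. It is an illustration only, not Safer’s implementation: the hash list, file paths, and function names are hypothetical, and it uses a standard cryptographic hash for simplicity.

```python
import hashlib

# Hypothetical set of hash values already verified as CSAM by trusted
# sources such as NCMEC. Real hash lists are far larger and are obtained
# through dedicated hash-sharing programs, not hard-coded like this.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(path: str) -> bool:
    """Return True if the file's digital fingerprint matches a verified hash."""
    return file_sha256(path) in KNOWN_CSAM_HASHES
```

Production systems typically also rely on perceptual hashes, which can match re-encoded or slightly altered copies of a file, whereas a cryptographic hash like the SHA-256 shown here only catches exact duplicates.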

In addition to detecting known CSAM, our predictive AI detected more than 2,200,000 files of potential novel CSAM. Safer’s image and video classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review. Identifying and verifying novel CSAM allows it to be added to the hash library, accelerating future detection.
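
To show how classifier-based flagging works in general terms, here is a minimal sketch of scoring new uploads and routing high-scoring ones to human review. The model, threshold, and data structures are hypothetical and are not Safer’s classifiers; any real classifier would be plugged in where classify appears.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Upload:
    file_id: str        # platform's identifier for the uploaded file
    score: float = 0.0  # classifier's predicted probability, filled in below


# Hypothetical review threshold. Real systems tune this value to balance
# moderator workload against the risk of missing harmful content.
REVIEW_THRESHOLD = 0.80


def flag_for_review(uploads: List[Upload],
                    classify: Callable[[str], float]) -> List[Upload]:
    """Score each upload and return the ones that need human review."""
    flagged = []
    for upload in uploads:
        upload.score = classify(upload.file_id)  # predicted likelihood of CSAM
        if upload.score >= REVIEW_THRESHOLD:
            flagged.append(upload)
    return flagged
```

Files that moderators confirm can then be hashed and added to the shared hash library, which is how verified novel content speeds up future detection.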

Altogether, Safer detected more than 4,100,000 files of known or potential CSAM.

Total lines of text processed

Safer launched a text classifier feature in 2024 and processed more than 3,000,000 lines of text in just the first year. This capability offers a whole new dimension of detection, helping platforms identify sextortion and other abuse behaviors happening via text or messaging features. In all, almost 3,200 lines of potential child exploitation were identified, helping content moderators respond to potentially threatening behavior. 

Safer’s all-time impact

Last year marked a watershed moment for Safer, with the community almost doubling the all-time total of files processed. Since 2019, Safer has processed 228.8 billion files and 3 million lines of text, resulting in the detection of almost 6.5 million potential CSAM files and nearly 3,200 instances of potential child exploitation. Every file processed, and every potential match made, helps create a safer internet for children and content platform users.

Build a Safer internet

Curtailing platform misuse and addressing online sexual harms against children requires an “all-hands” approach. Too many platforms still suffer from siloed teams, inconsistent practices, and policy gaps that jeopardize effective content moderation. Thorn is here to change that, and Safer is the answer.

