
Safer Impact Report

August 25, 2022

3 Minute Read

Our Progress Towards Eliminating CSAM with Safer

At Thorn, we’re on a mission to create a safer internet for children, beginning with the elimination of child sexual abuse material (CSAM). In 2021, with the help of our industry tool, Safer, and its community of customers, we made progress toward this goal.

Check out Safer’s 2021 Impact Report below

Safer’s customers range in industry and product focus, but they all have one thing in common: an upload button.

Safer empowers content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms at scale. Using Safer’s CSAM elimination strategies that leverage advanced AI/ML models, proprietary research, and an unmatched hash database, our customers made incredible strides in 2021. Together, we’re building a better internet.

Hash Matching

In 2021, Safer hashed 11.1 billion images and videos


In 2021, we hashed more than 11 billion images and videos and tripled the number of hashes in our matching service. With 32+ million hashes, Safer offers the world's largest hash database for detecting CSAM. And this database is constantly growing thanks to SaferList, a tool we designed to accept hashes contributed by our members. Sharing this data among our members helps break down data silos and increase cross-platform intelligence, preventing the viral spread of CSAM.
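At its core, hash matching works by computing a fingerprint of each uploaded file and checking it against a database of fingerprints of known CSAM. The sketch below is a deliberately minimal illustration using exact cryptographic hashes; the hash set, function names, and sample values are hypothetical, and production systems like Safer also rely on perceptual hashes that can match visually similar, not just byte-identical, files.

```python
import hashlib

# Hypothetical known-hash set. In practice this would be a large,
# service-backed database of hashes contributed across platforms.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 digest of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """Exact-match the file's digest against the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES

# The placeholder entry above is the digest of b"test", so it matches;
# any other content does not.
assert matches_known_hash(b"test")
assert not matches_known_hash(b"some other upload")
```

Exact hashing only catches byte-for-byte copies; the value of a shared, growing hash list is that a file identified once on any participating platform can then be detected everywhere.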

In 2021, Safer customers detected 159,688 CSAM images and videos


Our customers detected more than 150,000 images and videos of known CSAM in 2021. In addition to detecting known CSAM, our classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review by content moderators.

The use of classifiers enables our customers to find previously unknown CSAM. In 2021, our classifier detected nearly 100,000 images of potential CSAM that were previously unknown. Finding previously unknown CSAM is a critical step toward eliminating CSAM from the open web.
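The flagging step described above can be sketched as a simple routing decision: the classifier scores new content, and anything above a threshold is queued for human review rather than published. The threshold value and function names below are hypothetical; Safer's actual models and operating points are not public.

```python
# Hypothetical operating threshold for illustration only.
FLAG_THRESHOLD = 0.8

def route_upload(classifier_score: float) -> str:
    """Route new content based on a classifier's CSAM probability.

    Content scoring at or above the threshold is sent to a human
    content moderator for review instead of being published directly;
    the classifier flags, a person decides.
    """
    if classifier_score >= FLAG_THRESHOLD:
        return "moderator_review"
    return "published"

# Example routing decisions for hypothetical scores.
assert route_upload(0.95) == "moderator_review"
assert route_upload(0.10) == "published"
```

Keeping a human moderator in the loop matters here: the classifier surfaces likely matches among previously unseen content, but confirmation and reporting remain review decisions.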


Safer is the only all-in-one solution to detect, review, and report CSAM. Our customers can validate, compile, and send reports to the National Center for Missing and Exploited Children (NCMEC) directly from our tool. This simplified reporting process contributed to a 1,696% increase in reports facilitated by Safer in 2021 compared to 2020. (Read more about why more reports is actually a good thing.)

In 2021, NCMEC’s CyberTipline received more than 29.1 million reports from electronic service providers (ESPs) alone. These reports constitute the majority of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing this issue.

By detecting CSAM at scale, our customers contribute to keeping children safe online and provide necessary intelligence to NCMEC that can have a life-saving impact and lead to the rescue of child victims.

Join our mission

Combating the spread of CSAM requires a focused and coordinated approach. Unfortunately, platform protection remains inconsistent across the industry and relies on incomplete and siloed data. Safer is here to change that.

We have compiled the largest hashset available, which means Safer casts the widest net to detect CSAM. As we continue to collect hashes from our members via SaferList, our AI gets smarter, faster, and more accurate. With Safer, CSAM is less likely to fall through the cracks or to show up on content-hosting platforms in the first place.

Originally published July 2022.
