With a Coordinated Approach, We’re Eliminating CSAM from the Open Web

March 22, 2023

3 Minute Read

At Thorn, we’re on a mission to create a safer internet for children, beginning with the elimination of child sexual abuse material (CSAM). In 2019, we launched Safer, a CSAM detection platform that empowers tech companies to detect and report CSAM at scale. We know that content-hosting platforms play a critical role in tackling this issue. While our customers range in industry and product focus, they all have one thing in common: an upload button.

With millions of files being uploaded every day to their platforms, Safer’s customers rely on our comprehensive hash database and advanced artificial intelligence and machine learning models to help them find CSAM. In 2022, our customers made incredible strides. Together, we’re building a better internet.

New Features Launched in 2022

Safer had two major milestones in 2022 with the release of our CSAM Video Classifier, a machine learning classification model, and the expansion of our reporting capabilities to the Royal Canadian Mounted Police (RCMP).

In order to understand the content contained within a video, we use a perceptual Scene-Sensitive Video Hashing (SSVH) technique to hash each of the scenes and frames within a video. Our CSAM Video Classifier then analyzes those hashes and returns a score that indicates the likelihood of that scene containing CSAM. The content is then reviewed by a moderator, who verifies whether or not the flagged content is CSAM. Machine learning classifiers like this are a powerful tool for detecting previously unknown content. In 2022, our customers detected 15,238 videos classified as potential CSAM.
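The scene-level flow described above can be sketched in a few lines. This is an illustrative mock, not Safer's actual implementation: the classifier here is a placeholder stand-in, and the names (`Scene`, `score_scene`, `flag_for_review`) and the 0.8 threshold are assumptions made for the example. It shows only the control flow: hash each scene, score it, and route high-scoring scenes to a human moderator.

```python
# Illustrative sketch only -- NOT Safer's implementation.
# Scene-based video hashing followed by per-scene classification.
from dataclasses import dataclass


@dataclass
class Scene:
    start_frame: int
    phash: int  # perceptual hash of a representative frame (hypothetical)


def score_scene(phash: int) -> float:
    """Placeholder for an ML classifier: returns a likelihood score in [0, 1].

    A real classifier runs on visual features; this stub exists only to
    illustrate the score-then-review control flow.
    """
    return (phash % 100) / 100


def flag_for_review(scenes: list, threshold: float = 0.8) -> list:
    """Return scenes whose classifier score meets the review threshold."""
    return [s for s in scenes if score_scene(s.phash) >= threshold]


scenes = [Scene(0, 181), Scene(240, 42), Scene(480, 97)]
flagged = flag_for_review(scenes)
# Flagged scenes go to a human moderator, who verifies whether the
# content is actually CSAM before any report is filed.
```

The key design point is that the classifier never makes the final call; it narrows millions of uploads down to a reviewable queue for human moderators.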

The other major milestone in 2022 was Safer’s new reporting capability that enables Canadian tech companies to send reports to RCMP. This was the first expansion of Safer reporting beyond the National Center for Missing and Exploited Children (NCMEC) and is a critical step toward ensuring all platforms have the capability to detect and respond to CSAM, no matter where they’re located. The addition of RCMP reporting provides intelligence that can have a life-saving impact and lead to the rescue of child victims.

Hash Matching

Hashing and matching are Safer’s core services. With the largest database of verified hashes (32+ million hashes) to match against, Safer can cast the widest net to detect known CSAM. In 2022, we hashed more than 42.1 billion images and videos for our customers. That empowered our customers to find 520,000 files of known CSAM on their platforms.
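To make the matching step concrete, here is a minimal sketch, assuming a perceptual-hash comparison rather than Safer's proprietary matching service. The function names and the distance threshold are hypothetical. Exact cryptographic hashes (e.g. MD5) match by strict equality, while perceptual hashes match within a small Hamming distance so that re-encoded or resized copies of a known file are still caught.

```python
# Illustrative sketch only -- NOT Safer's actual matching service.
# Matching an uploaded file's perceptual hash against a known-hash list.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")


def is_known(upload_hash: int, known_hashes: set, max_distance: int = 2) -> bool:
    """True if the upload is within max_distance bits of any known hash."""
    return any(hamming(upload_hash, h) <= max_distance for h in known_hashes)


known = {0b1011_0110, 0b1110_0001}       # toy stand-in for a hash list
assert is_known(0b1011_0111, known)      # one flipped bit: still a match
assert not is_known(0b0100_1000, known)  # far from every known hash
```

In production, a set this small would be a hash database of tens of millions of entries with an index built for fast nearest-neighbor lookup, but the matching principle is the same.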

We’re also working to break down data silos with SaferList, a self-managed set of hash lists to which our customers can contribute verified hashes. Our customers can choose to share this data with the Safer community to increase cross-platform intelligence and diminish the viral spread of CSAM.

Image and Video Classifiers

In addition to detecting known CSAM, our classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review by content moderators.

The use of classifiers enables our customers to find previously unknown CSAM. In 2022, our classifier detected 304,466 images of potential CSAM—a 205% increase compared to 2021—that were previously unknown. This increase in images classified and the addition of the CSAM Video Classifier this year represent major steps toward our goal of eliminating CSAM from the open web.

Reporting

As an all-in-one solution to detect, review, and report CSAM, Safer enables customers to validate, compile, and send reports to NCMEC (and now RCMP) directly from our tool. In 2022, Safer customers sent 60,829 reports containing 74,265 files to NCMEC.

In 2021, NCMEC’s CyberTipline received more than 29.1 million reports from electronic service providers (ESPs) alone. These reports constitute the majority of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing this issue. With Safer, Thorn is equipping the tech industry with an all-in-one solution to address CSAM on their platforms at scale.

Originally published March 2023 on safer.io.
