

Accelerating child victim identification: How technology transforms investigations

July 15, 2025

4 Minute Read

Investigators must review images and videos of children being sexually abused that come from various sources: CyberTips from social media platforms, dark web activity, or devices seized under a search warrant. Regardless of the source, they face the same critical challenge: analyzing vast amounts of digital content to find clues that could help identify child victims.

These investigations can take weeks or months. Meanwhile, children may be enduring active abuse. The faster investigators find these clues, the faster they can remove children from harm.

The volume challenge

When reviewing child sexual abuse cases, investigators face enormous volumes of digital evidence: potentially hundreds of thousands of files, amounting to terabytes of data. Each file must be processed because perpetrators hide child sexual abuse material (CSAM) by mislabeling files or embedding them among legitimate content.

Each file potentially holds a missing puzzle piece: a school logo, a regional poster, or other clues about a child’s identity or whereabouts. CSAM is often located in folders containing additional identifying information that may provide context about dozens or hundreds of victims.

Our AI-powered solution

Helping investigators find children being sexually abused more quickly is one of Thorn’s four child safety pillars. Agencies in 40 countries use Thorn’s victim identification intelligence tools to address this needle-in-a-haystack problem.

At the heart of our solutions is Thorn Detect, featuring advanced CSAM Classifiers that provide three key advantages:

  1. Identify suspected new CSAM – Our classifier detects suspected new abuse material that would be missed by hashing and matching alone; this material often depicts children in active abuse.
  2. Trained directly on verified CSAM – Our models are trained in part using data provided by the National Center for Missing & Exploited Children through their CyberTipline program, helping predict the likelihood that content contains CSAM. 
  3. Continuously improved – Since 2020, real-world deployment and customer feedback have allowed our team to iterate on and improve the model.

Using state-of-the-art machine learning, these tools process files far faster than humans could manually, quickly surfacing suspected abuse material and transforming what used to be a painstaking manual process.
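To make that workflow concrete, here is a minimal, hypothetical sketch of the general triage pattern described above: files matching a list of known hashes are flagged immediately, and everything else is scored by a classifier so the most likely files surface first for review. All names, thresholds, and the placeholder classifier are assumptions for illustration only; this is not Thorn Detect’s implementation.

```python
# Hypothetical sketch of a hash-plus-classifier triage loop.
# All names, thresholds, and the dummy classifier are illustrative
# assumptions, not Thorn Detect's actual implementation.
import hashlib
from pathlib import Path

KNOWN_HASHES: set[str] = set()   # hashes of previously verified material (placeholder)
REVIEW_THRESHOLD = 0.8           # assumed cutoff for flagging a file for review

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def classifier_score(path: Path) -> float:
    """Stand-in for a trained model that returns a 0-1 likelihood score."""
    return 0.0  # replace with a real image/video classifier

def triage(evidence_dir: Path) -> list[tuple[Path, float]]:
    """Flag known matches and high-scoring files, highest priority first."""
    flagged: list[tuple[Path, float]] = []
    for path in evidence_dir.rglob("*"):
        if not path.is_file():
            continue
        if sha256_of(path) in KNOWN_HASHES:
            flagged.append((path, 1.0))       # known match via hashing and matching
        else:
            score = classifier_score(path)    # suspected new material
            if score >= REVIEW_THRESHOLD:
                flagged.append((path, score))
    # Sort so the most likely files surface first and investigators
    # can decide when to review them.
    return sorted(flagged, key=lambda item: item[1], reverse=True)
```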

This same innovative approach has been applied to other technologies and tools, providing solutions that help victim identification specialists discover the abuse material perpetrators are sharing, understand how they cooperate to abuse children, and ultimately find the children who are being abused.

Taking perpetrators off the street

Investigators often have a limited window of time to hold a suspect. Finding CSAM files quickly can mean the difference between maintaining custody of a suspected perpetrator and releasing them to potentially harm again. Additionally, the volume of CSAM in a suspect’s possession affects sentencing: quickly identifying the full scale of the material helps keep dangerous abusers behind bars for substantial time.

Supporting investigator wellbeing

Thorn’s solutions also help reduce the emotional burden on investigators.

Imagine you’re swiping through photos on another person’s phone. Suddenly, you see a horrible image. That shocking experience sticks with you for some time. Now imagine experiencing that repeatedly over the course of days and weeks. This kind of exposure is an occupational challenge for many types of first responders and is known as vicarious trauma.

For investigators involved in child sexual abuse cases, this repeated exposure is their reality. However, Thorn’s victim identification tools help relieve that trauma by reducing the burden of manual review. The solutions detect how likely each file is to contain CSAM and categorize it accordingly. Investigators can then choose to review the flagged files when they’re ready. That degree of control over their own exposure means a great deal to investigators who handle this material daily.

A race against time

Investigators on the front lines of protecting children from sexual abuse serve an important role in our communities, and are often in a race against time. The faster they can detect CSAM and find clues that help identify a child victim, the faster they can remove that child from harm and put a perpetrator behind bars. We’re proud to build technology, like Thorn Detect, that accelerates these critical efforts, helping to create a new chapter and a brighter future for the children involved.

Help us find child victims faster

Thorn’s child victim identification pillar is primarily funded by donor support, which helps get tools like Thorn Detect into more investigators’ hands worldwide. Your philanthropic support allows us to provide transformative solutions to more investigators, building a digital safety net for children and transforming how they’re protected in the digital age.

Become a force for good and donate today.

