How Thorn Makes the Internet Safer & Helps Stop the Cycle of Abuse

July 7, 2025

6 Minute Read

For young people, the internet is a place for self-discovery, socializing, and building meaningful connections. But these same spaces can also be exploited by perpetrators who target children for grooming and sextortion and who use technology to share child sexual abuse material (CSAM).

Because of this, technology companies play a key role in protecting children from abuse and exploitation in the digital age.

At Thorn, we empower tech companies in that pursuit. Our purpose-built solutions equip tech platforms to combat the spread of child sexual abuse material and help reduce the cycle of trauma caused by its circulation. As experts in child safety technology, we also help companies understand their specific role and capabilities within the child safety ecosystem.

Combating CSAM is a critical step toward creating safer online environments and supporting survivors of abuse. Our multifaceted approach empowers Thorn and our platform partners to combat sexual abuse and exploitation on the open web, and protect children on a global scale.

 


Stopping the spread of child sexual abuse material

It may be confusing to learn that the very platforms we use to connect with friends and family are also used by perpetrators to create and share child sexual abuse material. Online, they form tight-knit communities that facilitate the creation and trade of this content.

What is CSAM?

But what exactly is child sexual abuse material, or CSAM? Legally known as child pornography in the U.S., CSAM refers to any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, live streaming, and digital or computer-generated images indistinguishable from an actual minor, including AI-generated content. The emergence of generative AI broadens the scope further to include AI adaptations of original abuse content, the sexualization of benign images of children, and fully AI-generated CSAM.

How big a crisis is child sexual abuse material online? In 2004, 450,000 files of suspected CSAM were reported in the U.S. By 2024, that number had skyrocketed to more than 61 million files. That's more than 100 files reported every minute. The internet simply makes it too easy to produce and disseminate this horrific content.
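That per-minute figure follows directly from the annual total, assuming for illustration that reports were spread evenly across the year:

61,000,000 files ÷ (365 × 24 × 60) minutes ≈ 116 files per minute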

How does revictimization occur?

Even after a child victim has been removed from active, hands-on abuse, photos and videos of their abuse can circulate online and continue the cycle of trauma.

Survivors of CSAM may have their abuse shared tens of thousands of times a year. Each time the content is shared, the victim is abused again.

How does Thorn’s technology stop the cycle of abuse?

Though enormous volumes of CSAM spread online every day, these files are mixed in with even greater amounts of harmless images and videos. This influx of content makes identifying CSAM incredibly challenging, resource-intensive, and nearly impossible through human review alone. And that's to say nothing of the debilitating emotional toll that reviewing this material takes on the people working to keep online communities safe.

At Thorn, we developed Safer, our purpose-built solution, to empower tech platforms to detect, review, and report CSAM at scale.

Safer identifies known and previously reported CSAM through its hashing and matching capabilities. It also detects unknown suspected CSAM through its predictive AI image and video classifiers. Finding this unreported abuse material is critical as it helps alert investigators to active abuse situations so victims can be removed from harm. Thorn has also released technology that identifies potentially harmful conversations related to child sexual abuse to stop harm before it starts. Safer arms Trust and Safety teams with a proactive solution for finding CSAM and reporting it to authorities.
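Hash matching works by comparing a fingerprint of each uploaded file against lists of fingerprints from previously verified and reported material. The sketch below is a hypothetical illustration of that general idea using a simple cryptographic hash; the names (`KNOWN_REPORTED_HASHES`, `file_hash`, `is_known_match`) are invented for this example, and it is not Thorn's API. Production systems like Safer rely on perceptual hashing and vetted hash lists maintained with industry partners.

```python
# Minimal, hypothetical sketch of hash-and-match detection.
# Real systems use perceptual hashes and vetted hash lists; the empty set
# below is only a stand-in for illustration.
import hashlib
from pathlib import Path

# Assumed placeholder: fingerprints of previously reported files.
KNOWN_REPORTED_HASHES: set[str] = set()


def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of a file's bytes, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(path: Path) -> bool:
    """Return True if the file's fingerprint appears in the known-hash set."""
    return file_hash(path) in KNOWN_REPORTED_HASHES
```

Matching like this only catches content that has already been verified and hashed; that is why classifiers matter, since they flag suspected material whose fingerprint has never been seen before.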

By fighting CSAM on their platforms, technology companies can protect children and also break that cycle of revictimization.

 

The effort is working

To date, Thorn has helped the tech industry detect and flag for removal almost 6.5 million child sexual abuse files from the internet. In 2024 alone, Safer detected more than 4 million files of suspected CSAM across tech platforms — making a tangible impact on the lives of children and survivors.

The companies we partner with range from small platforms to some of the biggest household names in tech.

 


In 2019, global photo and video hosting site Flickr became a Safer customer; it relies on our comprehensive detection solutions to find CSAM on its platform. In 2021, Flickr deployed Safer's CSAM Image Classifier, which enabled its Trust and Safety team to detect previously unknown CSAM images they likely wouldn't have discovered otherwise.

One classifier hit led to the discovery of 2,000 previously unverified images of CSAM and a law enforcement investigation in which a child was rescued from harm.

In 2022, Flickr reported 34,176 files of suspected CSAM to the National Center for Missing & Exploited Children. That data can be acted on to identify child victims and remove them from harm.

 


VSCO, an app for photo and video creation communities, deployed Safer in 2020. In the face of the accelerating spread of CSAM online, VSCO's core dedication to safety drove it to prioritize detection on its platform.

VSCO uses Safer to proactively target CSAM at scale. The tool speeds its efforts and increases the amount of content its team can review, allowing them to cast a wider net. In three years, VSCO has reported 35,000 files of suspected CSAM to authorities.

 

A multifaceted approach to online child safety

Tackling child sexual abuse online requires a comprehensive approach, involving technology, industry education, policy, and community engagement. Thorn works at each level to create systemic change and strengthen the child safety ecosystem.

Safety by Design

In the tech industry, everyone from AI developers and data hosting platforms to social media apps and search engines intersects with child safety in some way. Thorn helps these companies understand and identify the threats that occur on their platforms and how to mitigate them.

The emergence of generative AI has only accelerated the spread of technology-facilitated child sexual abuse. Thorn urges companies to take a Safety-by-Design approach, which builds safety measures into the core design of technologies.

At Thorn, we build technology to defend children from sexual abuse. But we’re just one piece of the puzzle, along with the tech industry, policymakers, and the public. 

When we work together, we can combat the spread of CSAM. In doing so, we’ll stop revictimization, and start to build a world where every child is free to simply be a kid.

 

Join us

Become a force for good. Learn more about Thorn’s solutions and how you can contribute to transforming the way we protect children from sexual abuse and exploitation in the digital age.

See Our Solutions

