
How Thorn Makes the Internet Safer & Helps Stop the Cycle of Abuse

April 5, 2024

6 Minute Read

For kids today, the internet is a place for self-discovery, socializing, and building meaningful connections. But these same spaces can also be used by bad actors, who frequent them to target children for grooming and sextortion, and to share child sexual abuse material (CSAM).

Because of this, technology companies play a key role in defending children from sexual abuse.

At Thorn, we empower tech companies in that pursuit. Our innovative solutions equip tech platforms to combat the spread of abusive content and end the cycle of trauma that its circulation causes.

As experts in child safety technology, we also help companies understand their specific role and capabilities within the child safety ecosystem.

Combating CSAM is a critical step toward creating safer environments online and supporting survivors of abuse. Our multifaceted approach empowers Thorn and our platform partners to make the internet safer and protect children on a global scale.

 


Stopping the spread of child sexual abuse material

It may be surprising to learn that the very platforms we use to connect with our friends and family are also used by bad actors to create and share child sexual abuse material. These bad actors form tight-knit communities of their own, where they facilitate the creation and trade of this abusive content.

What is CSAM?

But first, what exactly is CSAM? Child sexual abuse material (CSAM), legally referred to as child pornography in the U.S., is any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, live streaming, and digital or computer-generated images indistinguishable from an actual minor, including AI-generated content. The emergence of generative AI broadens the scope to include AI adaptations of original content, the sexualization of benign images of children, and fully AI-generated CSAM.

In 2004, 450,000 files of suspected CSAM were reported in the U.S. By 2023, that number had skyrocketed to more than 104 million files. The internet simply makes it too easy to produce and disseminate this horrific content.

How does revictimization occur?

Even after a child victim has been rescued from active, hands-on abuse, photos and videos of their abuse often circulate online and continue the cycle of trauma.

Survivors of CSAM may have their abuse shared thousands, even tens of thousands, of times a year. Each time the content is shared, the victim is abused again.

How does Thorn’s technology stop the cycle of abuse?

Though millions of files of CSAM spread daily, they’re mixed in with even greater amounts of harmless images and videos. This influx of content makes identifying CSAM files incredibly challenging, resource-intensive, and nearly impossible for human review alone. Not to mention the incredible emotional toll that reviewing this material takes on the people working to keep online communities safe.

At Thorn, we developed our industry solution, Safer, to halt the spread of CSAM. This advanced technology empowers tech platforms to detect, review, and report CSAM at scale.

Safer identifies known, previously reported CSAM through its hashing and matching capabilities. It also detects previously unknown CSAM through its predictive AI/ML image and video classifiers. Finding this novel abuse material is critical: it helps alert investigators to active abuse situations so victims can be removed from harm. To support these efforts further, Thorn is also developing new technology that aims to identify potentially harmful conversations related to child sexual abuse and stop harm before it starts. Safer arms teams with a proactive solution for finding CSAM and reporting it to authorities.
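For readers curious how hashing and matching works in principle, here is a minimal, purely illustrative sketch in Python. It is not Safer's API or Thorn's implementation; the function names and hash values are hypothetical, and real systems typically rely on perceptual hashing (which tolerates resizing and re-encoding) and hash lists maintained by vetted organizations rather than a hard-coded set.

```python
# Conceptual sketch only — not Safer's actual API.
# Idea: compute a fingerprint of an uploaded file and compare it against
# fingerprints of previously verified abuse material.

import hashlib

# Hypothetical set of fingerprints of known, verified abuse material.
KNOWN_ABUSE_HASHES = {
    "9f2c51d3placeholder",  # placeholder value for illustration
}

def fingerprint(file_bytes: bytes) -> str:
    """Return a simple cryptographic fingerprint of the file contents."""
    return hashlib.sha256(file_bytes).hexdigest()

def matches_known_csam(file_bytes: bytes) -> bool:
    """Flag the file for human review if its fingerprint matches a known hash."""
    return fingerprint(file_bytes) in KNOWN_ABUSE_HASHES

# Files that match nothing in the known-hash list would then be passed to a
# classifier, which predicts whether the content is previously unseen CSAM.
```

In practice, the matching step catches content that has already been reported, while the classifier step surfaces new material that no hash list could contain yet.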

By fighting CSAM on their platforms, companies not only protect their users and children, but also break that cycle of revictimization.

And, the collective effort is working.

 

Stories of success

To date, Thorn has helped the tech industry detect and flag for removal more than 5 million child sexual abuse files from the internet. 

The companies we partner with range from small platforms to some of the biggest household names in tech.

 


In 2019, global photo and video hosting site Flickr became a Safer customer and now relies on our comprehensive detection solutions to find CSAM on its platform. In 2021, Flickr deployed Safer’s CSAM Image Classifier. Using the classifier, its Trust and Safety team could detect previously unknown CSAM images they likely wouldn’t have discovered otherwise.

One classifier hit led to the discovery of 2,000 previously unverified images of CSAM and an investigation by law enforcement – in which a child was rescued from harm.

In 2022, Flickr reported 34,176 files of suspected CSAM to the National Center for Missing and Exploited Children. This is data that can be acted on to identify and remove child victims from harm.

 


VSCO, an app for photo and video creation communities, deployed Safer in 2020. In the face of accelerating CSAM online, VSCO’s core dedication to safety drove them to prioritize detection on their platform.

VSCO uses Safer to proactively target CSAM at scale. The tool speeds their efforts and increases the amount of content they can review, allowing them to cast a wider net. In three years, they’ve reported 35,000 files of suspected CSAM to authorities.

In 2023 alone, Safer detected more than 3 million files of CSAM across tech platforms — making a tangible impact on the lives of children and survivors.

 

A multifaceted approach to online child safety

Tackling child sexual abuse online requires a comprehensive approach, involving technology, industry education, policy, and community engagement. Thorn works at each level to create systemic change and strengthen the child safety ecosystem.

Safety by Design

In the tech industry, everyone from AI developers and data hosting platforms to social media apps and search engines intersects with child safety in some way. Thorn helps them understand the threats that can occur on their platforms and how to mitigate them.

The emergence of generative AI has only accelerated the spread of CSAM. Thorn urges companies to take a Safety-by-Design approach, which requires that safety measures be built into the core design of technologies.

As AI technologies continue to advance, Thorn works with platforms to ensure the safety of children remains front and center.

Consulting Services

One way Thorn helps platforms navigate these issues is through our Child Safety Advisory consulting services. Thorn guides platforms through developing child safety policies and on-platform intervention and prevention strategies. We even help teams identify product vulnerabilities to misuse and malicious activity.

Prevention Campaigns

In addition to providing expertise to the tech industry, Thorn works with online platforms to develop CSAM-specific prevention campaigns and co-branded educational resources for youth in partnership with our NoFiltr program and Youth Innovation Council. Platforms can also gain unique youth perspectives on their platform’s safety measures through NoFiltr Youth Innovation Council custom workshops.

 

Creating a safer internet, together

At Thorn, we build technology to defend children from sexual abuse. But we’re just one piece of the puzzle, along with the tech industry, policymakers, and the public. 

When we work together, we can combat the spread of CSAM. In doing so, we’ll stop revictimization, and start to build a world where every child is free to simply be a kid.

 

Join us

Become a force for good. Learn more about Thorn’s solutions and how you can contribute to making the internet a safer place for children.

See Our Solutions


