Prioritize child safety, mitigate risk.

Every platform with an upload button or messaging capabilities is at risk of hosting child sexual abuse material (CSAM) or interactions that could lead to child exploitation. Thorn is committed to equipping content-hosting platforms with tools and expert guidance to mitigate these risks.

We’re making an impact for our customers

Safer has processed more than 200.8 billion total files since 2019

NoFiltr has facilitated 1.4 million conversations about the issue via social media channels

Safer has identified more than 8.8 million files of child sexual abuse material on customer platforms since 2019



Proactive solutions from child safety technology experts

Powered by innovative tech, trusted data, and issue expertise, our CSAM detection and child exploitation prevention solutions can help protect your platform and your users. Work with us to take meaningful action to redesign a safer tomorrow.

Trusted by leading content-hosting platforms

  • Vimeo
  • Patreon
  • Slack
  • Ancestry
  • VSCO
  • Quora

Safer CSAM Detection

Protect your platform with industry-leading solutions for proactive CSAM detection. Safer detects both known and unknown CSAM and recognizes text-based online conversations that could lead to child exploitation.

Learn more

Marketplace Risk 2024 Excellence Program for General Trust & Safety badge

GoDaddy is proud to be a part of Thorn’s Safer community. Using their services, we can detect and remove [CSAM] content faster and safely share knowledge in the community in order to keep the internet a safe and enjoyable place, especially for children.

Chris Hauser, Director – Infosec at GoDaddy

Consulting Services

Get expert guidance on safeguarding children and your platform. Whether you’re just starting to develop child safety policies or looking to build new on-platform interventions, we can help. Our consulting services range from guidance on reactive policy enforcement to proactive red teaming sessions that stress test AI products.

Contact Us to Learn More

Thorn is unique in its depth of expertise in both child safety and AI technology. The combination makes them an exceptionally powerful partner in our work to assess and ensure the safety of our models.

Chelsea Carlson, Child Safety TPM – OpenAI

Prevention Campaigns & Partnerships

Partner with the NoFiltr team to develop CSAM-specific digital safety prevention campaigns; engage with the NoFiltr Youth Innovation Council to gain youth perspectives on your platform’s safety measures; and develop co-branded safety resources for youth on your platform.

Contact Us to Learn More

Ready to protect your platform?

Learn more about how Thorn can help you protect your platform and scale child safety.

Case study

VSCO + Safer

See how VSCO is using Safer’s CSAM Image Classifier to mitigate unexpected risks and make its platform safer for kids and creatives alike.

Read More

Is your platform at risk?

Explore cutting-edge research on threats and opportunities for online safety. Get clear on potential risks for content-hosting platforms.

Get the Report