SOLUTIONS FOR CONTENT-HOSTING PLATFORMS

Mitigate risk & scale child safety

Every platform with an upload button or messaging capabilities is at risk of hosting child sexual abuse material (CSAM) or interactions that could lead to child exploitation. Thorn is committed to equipping content-hosting platforms with tools and expert guidance to mitigate these risks.

 



We’re making an impact for our customers

Safer has identified more than 5 million files of child sexual abuse material on customer platforms since 2019

NoFiltr has facilitated 120,000 conversations about the issue via social media channels

Safer has processed more than 129.4 billion total files since 2019

Proactive solutions from child safety technology experts

Powered by innovative tech, trusted data, and issue expertise, our CSAM detection and child exploitation prevention solutions can help protect your platform and your users. Work with us to take meaningful action to redesign a safer tomorrow.

Trusted by innovative content-hosting platforms

  • Oracle
  • VSCO
  • Quora
  • Bublup
  • Slack
  • Vimeo
  • Ancestry

Safer CSAM Detection

Protect your platform from the risks of hosting CSAM with industry-leading detection tools. Safer was built by our child safety technology experts and offers different feature sets to fit your needs and engineering resources.

Visit Safer.io


GoDaddy is proud to be a part of Thorn’s Safer community. Using their services, we can detect and remove [CSAM] content faster and safely share knowledge in the community in order to keep the internet a safe and enjoyable place, especially for children.

Chris Hauser, Director – Infosec at GoDaddy

Consulting Services

Get expert guidance on safeguarding children and your platform. Whether you’re just starting to develop child safety policies or looking to build new on-platform interventions, we can help. Our consulting services range from guidance on reactive policy enforcement to proactive red teaming sessions that stress test AI products.

Contact Us to Learn More


Thorn is unique in its depth of expertise in both child safety and AI technology. The combination makes them an exceptionally powerful partner in our work to assess and ensure the safety of our models.

Chelsea Carlson, Child Safety TPM – OpenAI

Prevention Campaigns & Partnerships

Partner with the NoFiltr team to develop CSAM-specific digital safety prevention campaigns; engage with the NoFiltr Youth Innovation Council to gain youth perspectives on your platform’s safety measures; and develop co-branded safety resources for youth on your platform.

Contact Us to Learn More

Ready to protect your platform?

Learn more about how Thorn can help you protect your platform and scale child safety.

Case study

VSCO + Safer

See how VSCO is using Safer’s CSAM Image Classifier to mitigate unexpected risks and make its platform safer for kids and creatives.

Read More
Report

Is your platform at risk?

Explore cutting-edge research on threats and opportunities in online safety, and get a clear picture of the potential risks facing content-hosting platforms.

Get the Report