
How VSCO Protects Its Platform and Community of Creators from CSAM with Safer by Thorn

October 25, 2023

2 Minute Read

Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of 200 million users, whom it calls creators, who upload photos and videos to its platform every day. To protect its platform and creators, VSCO uses Safer – Thorn’s all-in-one solution to detect, review, and report child sexual abuse material (CSAM) at scale.

A Safety by Design Approach

VSCO’s strong focus on creators’ work and experience on the platform is an extension of its safety by design ethos. As the company has developed the platform and grown, it has invested in infrastructure that safeguards against harmful content so its creator community never has to see it.

A desire for comprehensive protection against CSAM led VSCO’s trust and safety team straight to Thorn. The VSCO team knew about our mission to build technology to defend children from sexual abuse and was interested in utilizing Safer.

2.2 million files of potential CSAM identified since Safer launched in 2019.

Collaboration is Key

With the rise of user-generated content, the spread of CSAM has accelerated. Often, the public is surprised to find CSAM and child exploitation spreading on platforms they use every day.

Thorn is dedicated to providing tools and resources to content-hosting platforms, as they are key partners in combating the viral spread of CSAM. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its creator community.

"Thorn gives us confidence and credibility in what we're doing and really reinforces our commitment to Safety by Design." – Anna Coffman, Sr. Manager of Trust and Safety, VSCO

In 2022, Safer flagged 35,378 images and videos as potential CSAM and detected 408 instances of known CSAM for VSCO. By proactively fighting the spread of CSAM, VSCO ensures creators aren’t exposed to this harmful content.


Together, VSCO and Safer are making an impact.


