How VSCO Protects Its Platform and Community of Creators from CSAM with Safer by Thorn
October 25, 2023
2 Minute Read
Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of 250+ million registered users (or creators), who upload photos and videos to its platform every day. To protect its platform and creators, VSCO uses Safer – Thorn’s all-in-one solution to detect, review and report child sexual abuse material (CSAM) at scale.
A Safety by Design Approach
VSCO’s strong focus on creators’ work and experience on the platform is an extension of its safety by design ethos. As it has developed the platform and grown, the company has invested in infrastructure that safeguards against harmful content so its creator community never has to see it.
A desire for comprehensive protection against CSAM led VSCO’s trust and safety team straight to Thorn. The VSCO team knew about our mission to build technology to defend children from sexual abuse, and was interested in utilizing Safer.
Collaboration is Key
With the rise of user-generated content, the spread of CSAM has accelerated. Often, the public is surprised to find CSAM and child exploitation spreading on platforms they use every day.
Thorn is dedicated to providing tools and resources to content-hosting platforms as they are key partners in combating the viral spread of CSAM. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its creator community.
In 2022, Safer flagged 35,378 images and videos as potential CSAM and detected 408 instances of known CSAM for VSCO. By proactively fighting the spread of CSAM, VSCO ensures creators aren’t exposed to this harmful content.
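Detection of “known” CSAM of this kind generally works by hashing each uploaded file and matching it against lists of hashes of previously verified material. As a rough illustration only (not VSCO’s or Safer’s actual implementation; the hash list and helper names here are hypothetical), exact-match detection with a cryptographic hash might be sketched as:

```python
import hashlib

# Hypothetical hash list; real systems match against industry-maintained
# lists of verified CSAM hashes, never values like this placeholder.
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",  # MD5 of b"hello", for illustration
}

def md5_hex(data: bytes) -> str:
    """Return the MD5 hex digest of a file's raw bytes."""
    return hashlib.md5(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """True if the upload's hash appears in the known-content list."""
    return md5_hex(data) in KNOWN_HASHES
```

Exact hashing only catches byte-identical copies; matching edited or re-encoded copies requires perceptual hashing, which is why production tools typically combine both, alongside classifiers that flag previously unseen, potential CSAM for human review.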
Together, VSCO and Safer are making an impact.