
Two years after TED, we stand at a critical moment in the fight to eliminate CSAM

April 15, 2021


Two years ago, I stepped onto the TED stage to deliver a message about what my team at Thorn was working on: 

“We’re building the technology to connect these dots, to arm everyone on the front lines — law enforcement, NGOs, and companies — with the tools they need to ultimately eliminate child sexual abuse material from the internet.”

For the first time, I outlined our plan to eliminate child sexual abuse material (CSAM) from the open web. 

 

 

Two years later, we’re executing on that plan, most visibly in the form of Safer, the first holistic solution for content-hosting platforms to identify, remove, and report child sexual abuse material at scale. Alongside our work to equip law enforcement and other NGOs worldwide with the tools needed to accelerate victim identification, we have made real progress on our mission.

Since that TED Talk in April of 2019:

 

  • A single report by Flickr, a Safer customer, led to the recovery of 21 children from abuse. While we can’t always talk about it publicly to protect victims, we know there are many more stories like this.
  • Thorn’s tools are connecting data across the ecosystem: platforms contribute to shared hash sets that improve CSAM detection, and our Victim Identification team connects data across the law enforcement and child protection landscape. Together, these efforts reduce the time it takes to identify victims and remove their content from the internet. We’re seeing the power of connecting the dots every single day.
  • Thorn’s Data Science team built a CSAM classifier that uses machine learning to detect CSAM faster, particularly unknown CSAM. Internal assessments show that 99% of files classified as CSAM by the model are indeed CSAM, with 93% of all CSAM present detected. The classifier was deployed to Safer in early 2020 and is also being shared with the broader child safety ecosystem as a technology standard for continuous, collaborative improvement.
  • Safer has processed nearly 5 billion files, and the Safer community has so far helped identify more than 150,000 files of CSAM for removal and reporting.
  • Tools built by Thorn have now helped identify nearly 20,000 child victims of sexual abuse.

We’ve also seen more companies reporting CSAM than ever before, according to the National Center for Missing and Exploited Children (NCMEC). This is a sign that more platforms are beginning to heed the call to proactively detect child sexual abuse material, a necessary and critical step in this fight.

In my TED Talk, I described how, years ago, we had to say “no” when asked if we could help identify a child victim of sexual abuse, because we didn’t have the tools or data to do so. Today, thanks to the support of our partners and community, I can say with confidence that the answer is a resounding “yes, we can help.”

But we’re not done, and our mission is facing new and unprecedented challenges. 

 

A pivotal moment for child victims of sexual abuse

While adult users rightly demand online privacy, the tradeoffs this creates for child protection are not fully acknowledged. My colleague Sarah Gardner, Thorn’s VP of External Affairs, recently addressed the most pressing obstacles facing this work in her talk at TEDxWarwick.


 

The discussion around privacy and encryption has been dominated by the notion that it is a zero-sum game, and that the spread of child sexual abuse material is simply an unfortunate byproduct of platforms moving to increasingly private environments. 

But as Sarah points out: “I am here today to share with you that privacy is a continuum, it’s not an all or nothing proposition.”

And she goes on: “We must fiercely guard our right to privacy…but we must also fiercely guard one of our greatest resources — our children.”

At Thorn, we believe in privacy for all, including the children whose abuse has been documented and shared without consent across the internet. Content that can spread for years and follow survivors well into adulthood. 

These aren’t just numbers, these are human beings — the most vulnerable human beings — whose lives and wellbeing hang in the balance.

The truth is that we can have it all. We can have privacy, we can protect children, and we can continue to innovate and use technology to solve big problems. But we stand at a moment where, as a society, we have a choice to make: look the other way as children are victimized on the platforms we use every day, or have difficult and honest conversations and commit to finding a solution that benefits everyone, no matter how hard that may be. 

Together we can demand that collective action be taken to build a better internet. 

As Sarah says in her TEDx Talk: “It’s hard and it’s never been done before. But we’re going to have to do it.”

This is not just a minor blip that is easily overcome. It is a moment when, as a society and as a global community of people who stand with victims and survivors, we must make a choice. 

We choose to use technology for good, even when it’s the harder choice to make, even when it means having some of the most difficult conversations we’ll ever have as a society. 

But I think we, all of us, are up to the task. Together we can build a world where every child can simply be a kid.

–Julie Cordua, Thorn CEO


