Here’s what you need to know about The New York Times podcasts on CSAM

March 5, 2020

The Daily, a podcast from The New York Times, recently produced two episodes on the state of child sexual abuse material (CSAM) on the internet, the culmination of months of reporting.

It’s a hard story to tell, but we are grateful that more outlets and people are willing to talk about and address this issue.

When Thorn CEO Julie Cordua is asked how to build an organization that really makes an impact, her answers are variations on a theme: “find the place no one is willing to look, go deep, and start building.”

Listening to this podcast series and reading the New York Times coverage reminds us of what we felt eight years ago when we started this mission: knowing nothing, shocked every day, looking for the path forward.

And the Thorn of those early days would be ecstatic to hear how far we’ve come. Whether you are just learning about this issue or have been following it for years, we want to share the hope that we’ve found reveals itself in this work every day.

This issue is deeply challenging, both technically and emotionally, and we aren’t looking away. We’re going deeper, and we’ve built meaningful solutions along the way.

Let’s jump into some of the key moments from the most recent episode of The Daily on this topic, dig into what we’ve learned, and talk about how these challenges are being addressed.

“WE’VE GOT THIS TECHNOLOGY. WE CAN GO TO THE MOON…. WE CAN GET THIS [CONTENT] OFF THE WEB.”

–Parent of CSAM victim, The Daily Podcast, 2/20/2020

This lament from the parent of a victim of CSAM is deeply felt, and correct: We can go to the moon, yet we cannot keep the websites we use every day clear of this horrific content. Why?

When Ernie Allen started the National Center for Missing and Exploited Children (NCMEC), he learned that if a car is stolen, there is a robust national system in place to help locate and return it, to find the person who stole it, to right the wrong. But if a child was kidnapped or went missing, there was nothing: no guidelines, no system that would kick in to address the urgent problem. So he started an organization focused solely on missing and exploited children, and it has changed the lives of hundreds of thousands of people.

That will to effect change proportionate to the scale of the problem is exactly what we need today. The danger is no longer strangers in white vans. We now understand that many child victims are abused by individuals in a position of trust, and that the documentation of that abuse often moves freely across the internet.

We struggle to take down this content for two main reasons:

- Removing abuse content requires deep collaboration. Every known image needs to be hashed (translated from a visual into an alphanumeric digital identifier), and those hashes shared with every company that hosts user-generated content; a sketch of that hash-and-match workflow follows this list. This level of collaboration is difficult and time-consuming. It requires buy-in from the entire industry, from today’s largest social media platforms to the latest messaging app that launches tomorrow. All of them need to be inoculated.

- We only know what we know. Facebook Messenger, for example, accounts for roughly two out of every three reports to NCMEC, largely because Facebook proactively scans for this content to remove and report it. But every platform with a content-sharing function likely has child sexual abuse material on it. Until all companies commit to looking, we are only seeing a fraction of the problem.
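To make the hashing-and-sharing idea in the first bullet concrete, here is a minimal sketch, assuming a simple exact-match check at upload time. It uses Python’s standard hashlib (SHA-256) purely for illustration; in practice, companies rely on perceptual hashing technologies such as PhotoDNA so that resized or re-encoded copies of a known image still match. The names below (fingerprint, shared_hash_list, should_block) are hypothetical, not any platform’s real API.

```python
# A minimal sketch of the hash-and-match workflow described above.
# SHA-256 (exact matching) is used only for illustration; production
# systems rely on perceptual hashes such as PhotoDNA so that altered
# copies of a known image still match.
# All names here are hypothetical, not any platform's real API.
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Translate an image into an alphanumeric identifier (a hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


# In practice this set would be populated from hashes contributed by
# NCMEC and participating companies; it starts empty in this sketch.
shared_hash_list: set[str] = set()


def should_block(uploaded_image: bytes) -> bool:
    """Check an upload against the shared hash list before publishing it."""
    return fingerprint(uploaded_image) in shared_hash_list


if __name__ == "__main__":
    sample_upload = b"placeholder image bytes"  # hypothetical upload
    print(should_block(sample_upload))          # False: not in the shared list
```

The hard part is less the hashing itself than the sharing: a hash list only protects the platforms that receive it and check uploads against it, which is why industry-wide buy-in matters so much.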

FAMILIES LIKE THIS ARE GETTING REPORTS EVERY YEAR WHEN THEIR KIDS’ IMAGES HAVE BEEN FOUND.

If you listened to The Daily podcast on February 20, 2020, you heard the story of the parents who learned their young daughter had been victimized. Her abuser was arrested six years ago. But because law enforcement has a legal mandate to notify victims (or, in the case of minors, their parents) every time it intercepts another offender viewing that content online, the family continues to receive around 100 reports every year that the imagery of their daughter’s abuse is still being viewed.

The proliferation of this content is deeply alarming, and victims are re-traumatized with every viewing. We’ve learned through our partners who work directly with survivors of this kind of abuse that the lasting effects of this trauma are immense.

Knowing that abuse content continues to circulate and continues to be requested every day does not give survivors what they deserve: the ability to make their trauma something that happened in the past.

“YOU’RE MAKING IT HARDER FOR [THE LIVES OF THESE CHILDREN] TO EVER BE PRIVATE.”

–Michael Barbaro, host of The Daily Podcast, 2/20/2020

The conversation around privacy is nuanced. What we loved hearing in these podcasts is an echo of what we say at Thorn on a regular basis: We must find solutions that protect the privacy of children who are being abused.

When we talk about encryption, or any other privacy measure, bringing the voices of the voiceless into the conversation about the technology we build (and how we build it) is critical to moving this work forward.

The only way to do it is through collaboration. Advocates, policymakers, tech companies, and daily users of these products all have to agree that child victims deserve privacy as much as adults, and commit to finding solutions that protect everyone.

There are three things we think about when building our technology and partnerships. These are guiding lights in our process:

First, how do we make sure fewer children become victims? Our programmatic work addresses prevention through behavior change and deterrence initiatives that take the entire scope and complexity of the issue into account.

Second, if there is a victim, how do we find that victim faster? The tools we build for law enforcement, such as Spotlight, are reducing investigative time by more than 60%.

And third, once a victim is found, how do we do all we can to support healing and recovery? This point is one that can be heartbreaking.

In The Daily podcast, the parents of the child whose abuse is still being viewed online made the choice not to tell her about the ongoing reports they receive each year. They know, however, that when she turns 18 the federal government will legally be required to send those reports directly to her.

This family’s hope is that in the next four years we can solve this problem. Doing so would mean victims would no longer be recognized in public due to the images of their abuse, and wouldn’t have to live with the repeated trauma this cycle creates.

This is the same hope that sparked the creation of Thorn, and it drives us to this day.

And it’s more than hope—it’s a plan. Every day we are building the tools, relationships, and public support for a future where CSAM is something that’s read about not in the news, but in history books.


