New Report Shows an Increased Effort by Tech Companies to Detect CSAM on the Internet

March 18, 2022

4 Minute Read

If you’re following Thorn’s work, you know the critical role that tech companies play in identifying and reporting child sexual abuse material (CSAM) on their platforms.

Today, the National Center for Missing and Exploited Children (NCMEC) released its annual overview of the number of CSAM reports it received in 2021. The findings are encouraging, especially when looking closely at the role platforms played in reporting CSAM last year.

The report shows that 230 companies across the globe are now deploying tools to detect child sexual abuse material. That’s a remarkable 21% increase since 2020.

The significant uptick in the number of platforms detecting child sexual abuse material has led to more reports being filed and more CSAM hashes being created, making the fight against the viral spread of abuse material ever more effective.
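To illustrate the basic idea behind hash-based detection: a platform computes a fingerprint (hash) of each uploaded file and checks it against a shared list of hashes of already-verified CSAM, so known material can be flagged without a human re-reviewing the file itself. Below is a minimal Python sketch of that matching step; the hash list and function names are hypothetical, and it uses an exact cryptographic hash (SHA-256) for simplicity, whereas production tools such as Microsoft’s PhotoDNA or Thorn’s Safer rely on perceptual hashes that also match resized or re-encoded copies.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests of verified CSAM files, of the kind
# shared through clearinghouses such as NCMEC. Real hash lists use
# perceptual hashes (e.g., PhotoDNA), not plain SHA-256.
KNOWN_HASHES: set[str] = set()


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_csam(path: Path) -> bool:
    """Flag a file if its digest appears in the shared hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Because an exact digest changes if even a single byte of the file changes, this sketch only catches byte-identical copies; perceptual hashing is what lets platforms also catch the near-duplicates that drive the viral spread of abuse material.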

In fact, more than 29 million reports were filed in 2021, containing a total of around 85 million unique files – a 38% increase over 2020.

What makes these numbers even more remarkable is that Meta’s Facebook Messenger platform – which in the past has contributed a substantial number of reports to NCMEC – stopped detecting and reporting child sexual abuse material in the EU for most of 2021.

This interruption resulted from legal uncertainty surrounding the EU’s ePrivacy Directive, and it led to a devastating 58% decrease in reports of EU-related CSAM, according to NCMEC.

Let’s recap: Why are more child sexual abuse reports a good thing?

Last year, we shared with you why the rise in child sexual abuse reports is actually a good thing: every piece of CSAM documents a digital crime scene. Behind every CSAM file is a child victim who needs support or a survivor whose trauma continues to spread online.

When platforms don’t report CSAM, it’s not because the material isn’t there – it means it simply isn’t being detected. Additionally, a low number of reports relative to a platform’s user base may signal that CSAM is being discovered inadvertently and only reported reactively.

Through Thorn’s work across platforms, we’ve learned that when a platform logs a yearly increase in CSAM reports to NCMEC, it is proactively detecting, removing, and reporting abuse content – and getting better at it.

When we see an overall increase in reports year over year – as we have in the last two years – it may signal that more platforms are detecting CSAM than in the previous year. This continuing pattern is a big win for Thorn and everyone working to make the internet safer for kids.

While we are encouraged that more companies have committed to making their services safer for children, we have not yet reached the peak of this epidemic.

Reports of CSAM will begin to come down only when proactive detection is universally adopted. But we’re not there yet. The epidemic is still on an upward trajectory, and we must continue to innovate and come together with shared knowledge and tools. We must continue to have difficult conversations and shed light on this issue.

What else does the 2021 NCMEC report tell us?

Platforms are detecting and responding to child sexual abuse material faster.
In 2021, it took platforms an average of 1.22 days to take down CSAM after detecting it. NCMEC surveyed this figure for the first time this year; for comparison, numbers from INHOPE, the international network of hotlines, show that in 2020, 74% of reported CSAM was removed within three days.

To arrest perpetrators, we must provide sufficient resources to law enforcement agencies.
Alongside rescuing children in danger, prosecuting perpetrators is another critical pillar in fighting child sexual abuse. In the past year, law enforcement agencies worldwide arrested close to 2,700 perpetrators. But to maintain this momentum and avoid backlogs, we must continue to provide law enforcement with sufficient resources.

In short: Collaboration is key. And our work is far from done.

Eliminating child sexual abuse material from the internet isn’t up to just one person, organization, or platform. Collaboration, coordination, and systemic change are essential.

The internet was designed without child safety in mind, and it is now being co-opted to sexually exploit children. Our experience and humility teach us that we cannot achieve our goals alone.

Thorn is committed to moving this ecosystem (tech companies, law enforcement, parents and caregivers) from a reactive, siloed network that leaves children isolated to a child-centered, child-supported, globally connected, proactive, and accountable community.

Together, we can change the future of the internet and make it safer for children, now and for generations to come.


