
Why an increase in reports of CSAM is actually a good thing

February 24, 2021

4 Minute Read

Today the National Center for Missing and Exploited Children (NCMEC) released its annual overview of the number of child sexual abuse material (CSAM) reports the organization received in 2020. 

The headline figure in NCMEC's 2020 overview is a 28% increase in CSAM reports compared to 2019, which might leave you asking: Why are the numbers going up when so many have invested so much in combating the spread of CSAM online?

While it might feel counterintuitive, the rise in CSAM reports is actually a good thing.

Here’s why:

More platforms are proactively detecting CSAM, and getting better at detection.

In 2020, 21.4 million reports were filed with NCMEC. We must remember that every piece of CSAM is the documentation of a digital crime scene. Behind every CSAM file is a child victim in need of support, or a survivor whose trauma continues to spread online.

The human toll behind these numbers can be overwhelming – but we can’t afford to look away. We’re still in the process of shedding light on this issue and measuring not only how technology enables the sexual abuse of children, but whether the tools deployed to fight back are effective.

Since our inception eight years ago, Thorn has learned that if there’s an upload button on a platform, it will be used to host child sexual abuse material. This holds true regardless of the platform’s size or intended use.

When platforms don’t report CSAM, it’s not because it isn’t there – it’s because it isn’t being detected. Additionally, a low number of reports relative to a platform’s userbase may signal that CSAM is being discovered inadvertently and reported only reactively.

With what we know today, when a platform logs a yearly increase in CSAM reports to NCMEC, it means that platform is proactively detecting, removing, and reporting abuse content – and getting better at it. When we see an overall increase in reports year over year, it may signal that more platforms are detecting CSAM than the year before – a huge win for this mission.
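For readers curious about what proactive detection looks like in practice, its most common building block is hash matching: a platform computes a fingerprint of each uploaded file and compares it against lists of hashes of known CSAM shared by organizations like NCMEC. The sketch below is purely illustrative, not Safer’s actual implementation – the hash list and function names are hypothetical, and production systems typically also use perceptual hashing to catch re-encoded or altered copies.

```python
import hashlib

# Hypothetical hash list. In practice these lists are sourced from
# clearinghouses such as NCMEC and contain millions of entries.
KNOWN_CSAM_HASHES: set[str] = set()

def md5_of_file(path: str) -> str:
    """Stream a file through MD5 and return its hex digest."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_upload(path: str) -> bool:
    """Return True if an uploaded file matches a known hash.

    A match should trigger removal of the content and a report to
    NCMEC, rather than silent deletion.
    """
    return md5_of_file(path) in KNOWN_CSAM_HASHES
```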

In some cases, a single report has led to the removal of children from harm. More reports mean more detection, and more detection can change lives.

To eliminate CSAM from the internet, we have to identify more content. That’s how we’ll build a world where every child can simply be a kid. That’s how we’ll ensure children are being identified and reached in days or hours instead of months or years.

Reports need to continue to go up until every platform is proactively detecting CSAM at scale.

The technology will change – it will get better, faster, and more efficient – and eventually how we measure progress in the fight against CSAM will look different. But today, while we are still in the early days of this fight, a yearly rise in CSAM reports is an important signal that our collective efforts are working.

We have not reached the peak of this epidemic 

At some point we will turn a corner where reports start to come down because there is universal adoption of proactive CSAM detection.

But we’re not there yet.

The CSAM epidemic is still on an upward trajectory, and we must continue to innovate and come together with shared knowledge and tools. We must continue to have difficult conversations and shed light on this issue.

If CSAM reports were declining at this stage, it would signal that platforms had stopped proactively detecting it – a dire situation for the child victims whose sexual abuse is documented and disseminated across the web.

This is why we must remain vigilant in ensuring not only that platforms are committing to defend children through the proactive detection of CSAM, but that legislative environments support that detection. We have been actively engaged in policy discussions in the European Union, where we’re working to ensure that child sexual abuse detection tools are allowed to remain in use.

We will win this battle – together

In the fight to eliminate child sexual abuse material from the internet, no one is doing it alone. We will win this battle by coming together not as single waves crashing against the rocks, but as a tidal wave of advocates who, in unison, demand an end to the online exploitation of children.

With tangible technology solutions, platforms of all sizes now have the opportunity to join the movement.


To learn more about how Thorn’s industry tool, Safer, detects CSAM, take a look at the posts below.


