
Rising Awareness and Urgency: Thorn’s Insights from NCMEC’s 2022 CyberTipline Report

June 21, 2023

4 Minute Read

Thorn is committed to driving innovation to defend children from sexual abuse. We rely on data to guide our work, and the National Center for Missing and Exploited Children’s (NCMEC) annual CyberTipline report provides crucial insights.

In 2022, NCMEC’s CyberTipline received more than 32 million reports of suspected child sexual exploitation. This staggering number is more than just a statistic—it represents millions of children actually experiencing abuse and underlines the growing need for comprehensive measures to respond to and counteract child sexual abuse.

At Thorn, we know that the true prevalence of child sexual abuse material (CSAM) and online grooming is far greater than what gets reported – and that platforms must step up their detection efforts, which will in turn drive reporting numbers higher, before we can eradicate child sexual abuse.

We have long maintained that an increase in reports of CSAM is a good thing. That’s because when platforms don’t report CSAM, it’s not because it isn’t there – it means it simply isn’t being detected. Additionally, lower numbers of reports relative to a platform’s user base may signal that CSAM is only being discovered inadvertently and reported reactively.

However, there are nuances to the most recent CyberTipline report findings, and a rise in numbers isn’t positive in every instance.

The numbers are rising. Sextortion and grooming may be partly to blame.

One of the striking revelations in this year’s report was the surge in the Online Enticement of Children for Sexual Acts category of CyberTipline reports, which more than doubled from 37,872 reports in 2020 to 80,524 in 2022.

Also in 2022, 88 million CSAM files (images and videos) were reported to NCMEC — up from 20 million files in 2017.

While there are multiple reasons that could explain this uptick, one probable driver is the increase in online grooming and sextortion, which are known to occur on every type of online platform.

Recent research from Thorn shows that 2 in 5 kids have been approached online by someone they thought was attempting to “befriend and manipulate” them. And between 2019 and 2021, the number of reports to NCMEC involving the sextortion of children or teens more than doubled. As the prevalence of this type of online abuse grows, so does the number of people and platforms reporting it.

However, it’s important to reiterate that an increase in these reports can still indicate positive progress. It likely signifies that more platforms are becoming aware of the issue and are actively detecting and reporting suspected abuse on their sites.

In addition, we know that kids themselves are increasingly proactive when it comes to taking charge of their online safety, understanding the dangers of the digital landscape, and flagging/reporting harmful conversations on platforms. Our recent research shows that more than half of kids believe online grooming is a common experience for kids their age.

As the people who have a stake in protecting children – alongside the platforms and children themselves – continue to grow more vigilant and empowered, we will begin to see positive change.

The tech industry can do more when it comes to reporting.

Finally, it’s worth addressing a somewhat concerning figure: 4% of CyberTipline reports submitted by the tech industry in 2022 contained such limited information that it was impossible for NCMEC to pinpoint the offense location or the appropriate law enforcement agency to receive the report. 

While 4% may sound small, at the scale of tens of millions of reports it represents a large number of cases in which CSAM is being detected but critical user information isn’t being included in the report – leaving NCMEC unable to route it to the appropriate law enforcement agency.

The ecosystem must work together.

As is true every year when the CyberTipline report is released, these findings are not just numbers; they are a call to action. We must continue to evolve our strategies, advance our technology, and expand our partnerships to ensure the digital world is a safe space for the children who now spend so much time in it – and to stop the revictimization caused by the viral spread of CSAM.

As Thorn works to equip those on the frontlines of protecting children, we know that no single entity can change the trajectory of child sexual abuse on its own – and that the number and frequency of reports filed depend on all of us doing our unique part to defend children.

For Thorn, that means working to bring more tech companies to the table and equipping them with the tools they need to detect, review, and report CSAM effectively, as well as working directly with law enforcement to provide the tools and resources they need to identify victims as efficiently as possible. It means bringing key partners together – those with the interest and ability to protect children – and developing the best solutions to work toward our goal of defending children from sexual abuse.

It is an ongoing process — but we know that together, we can make substantial strides toward a world where every child is free to simply be a kid.
