
What the 2024 NCMEC CyberTipline Report says about child safety

May 16, 2025

5 Minute Read

Each year, the child safety ecosystem looks to the National Center for Missing & Exploited Children’s CyberTipline Report for a snapshot of the online exploitation landscape. The data—how many reports were made by platforms and the public, how many files of suspected child sexual abuse were shared, and what percentage depicted toddlers versus teens—offers one of the few indicators we have of the scale and nature of technology-facilitated abuse.

And every year, we ask: What do the numbers mean? Are we making progress, or falling behind?

This year, the answer is yes… to both. We are making progress. We are losing ground. And this remains only the tip of the iceberg.

One thing we know for a fact is that the scale of abuse is still staggering. In 2024, the CyberTipline received 20.5 million reports, including nearly 63 million files—images, videos, and other materials related to child sexual exploitation.

Each report, each file, each incident reflects a child who has been harmed. So while the numbers may be lower than what we saw in last year’s data, they remain unacceptably high—and they must be addressed through continued vigilance, innovation, and cross-sector collaboration.

The impact of technology and awareness

We’re seeing growing evidence that both technological innovation and public awareness are influencing the pipeline of reporting in ways that improve detection and prevention, while new technologies also introduce new challenges for child safety:

  • Bundling: One of the notable declines in this year’s reporting may be explained by NCMEC’s introduction of report “bundling,” which consolidates duplicate tips tied to a single viral incident.
  • Platform changes: Updates like default end-to-end encryption (E2EE) and revised content policies are likely altering what content is detected and how it’s reported. These changes matter—they reflect evolving approaches to privacy, safety, and trust & safety design. 
  • Policy momentum: The REPORT Act, enacted in 2024, now mandates that platforms report cases of online enticement and child sex trafficking. That policy shift likely contributed to the spike in online enticement reports—showing that child safety legislation paired with platform compliance can improve visibility into specific types of harm.
  • Public detection and response: As we’ve seen with sextortion, public recognition of emerging threats can play a pivotal role in surfacing harm that platforms miss. This year’s surge in public reports tied to violent online groups highlights both a growing willingness to report—and ongoing gaps in detection and disruption by platforms.

Key findings from the 2024 NCMEC CyberTipline Report

A closer look at this year’s data reveals several important trends and notable shifts in the child safety landscape:

  • 20.5 million reports of suspected child sexual exploitation were submitted to NCMEC in 2024—a 43% decrease from the 36.2 million reports in 2023. However, when adjusted for incidents (to account for bundled reports), the number is 29.2 million distinct incidents, still reflecting a staggering scale of harm.
  • 62.9 million files were included in 2024 reports—33.1 million videos, 28 million images, and nearly 2 million other file types. Every one of these files is evidence, tied to the suspected abuse or exploitation of a child.
  • Online enticement (crimes involving an adult communicating with a child for sexual purposes) reports rose 192%, reaching more than 546,000 tips. This dramatic increase is likely due in part to the new REPORT Act, which requires companies to report online enticement and child sex trafficking for the first time.
  • Reports involving generative AI surged by 1,325%, climbing from 4,700 in 2023 to 67,000 in 2024. While this remains a small percentage of total reports, it’s a clear signal that AI-generated child sexual abuse material (AIG-CSAM) is growing—and demands proactive safety interventions like Safety by Design, ethical AI development, and robust transparency reporting.
  • NCMEC also saw more than 1,300 reports tied to violent online groups, representing a 200% increase from 2023. These groups promote sadistic forms of abuse, including self-harm, sibling exploitation, and animal cruelty. Strikingly, 69% of these reports came from members of the public—such as parents or caregivers—underscoring a high-stakes gap in detection by platforms.
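The year-over-year percentages above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch using the figures quoted in the bullets (the `pct_change` helper is ours, not NCMEC's); small discrepancies, such as 1,326% versus the reported 1,325%, come down to rounding of the underlying counts.

```python
def pct_change(old: float, new: float) -> float:
    """Percent change from an old value to a new value."""
    return (new - old) / old * 100

# Total reports: 36.2 million (2023) -> 20.5 million (2024)
print(round(pct_change(36.2e6, 20.5e6)))  # -43

# Generative-AI reports: 4,700 (2023) -> 67,000 (2024)
print(round(pct_change(4_700, 67_000)))   # 1326
```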

Protecting the children behind the numbers

No number of suspected exploitation reports is acceptable in the world we want for our children, and 63 million suspected abuse files are far too many.

Behind each file and report is a child—someone experiencing abuse, coercion, or exploitation. That’s the reality we cannot lose sight of.

And while changes in reporting systems, technologies, and policies can all shift the numbers year over year, what remains constant is the urgent need for a smarter, more unified response. Lower numbers don’t necessarily mean less abuse. In some cases, they mean less visibility into it.

That’s why Thorn continues to champion a broader, more resilient approach to child safety that includes things like:

  • Adapting technologies and platform design to mitigate risks from increased use of E2EE and updated content policies, which may impact what’s detectable—alongside a new generation of technology companies stepping up to proactively address these risks through responsible reporting and intervention.
  • Transparency reporting from online platforms, helping the entire child protection ecosystem and the general public understand what platforms are detecting, how they approach child safety, and what they may be missing.
  • Safety by Design principles that tech companies can follow and adopt early in technology development, so platforms are built with child safety in mind from the outset. 
  • Robust detection tools, including AI-powered classifiers that help identify, review, and report abusive content before it spreads. 
  • Support services for victims and survivors, who often experience revictimization each time their abuse material resurfaces online. 
  • Well-resourced law enforcement, equipped with the tools and staffing needed to identify more child victims faster. 
  • Original, youth-centered research to surface emerging threats and ensure we understand how abuse evolves in digital spaces. 
  • Cross-sector collaboration, because no single actor—platform, policymaker, or nonprofit—can solve this issue alone. 

Final thoughts

No matter how the numbers change year over year, Thorn’s mission remains steadfast: To transform the way children are protected from sexual abuse and exploitation in the digital age. 

Real progress requires that we supplement what we learn from these numbers by listening to what children are experiencing today, investing in the systems that can protect them tomorrow, and addressing the threats hiding beneath the surface of the data. 

Read the full 2024 CyberTipline Report here, and visit thorn.org to learn more about how we are building a safer world for children.

