
Child sexual abuse material (CSAM)

What is CSAM?

Child sexual abuse material (CSAM) refers to sexually explicit content involving a child. Visual depictions can include photographs, videos, or computer-generated images that are indistinguishable from depictions of an actual minor.

There are various terms used across the globe to describe this material, including:

  • CSAI — child sexual abuse imagery
  • CSEI — child sexual exploitation imagery
  • IIOC — indecent images of children

While U.S. federal law still refers to this material as “child pornography,” CSAM is now the preferred term, and efforts are underway in many jurisdictions to update the terminology in legal guidelines.

 

CSAM has a profound, lasting, and far-reaching impact on the depicted victims.

 

CSAM serves as a record of a child’s abuse for their abuser, and as material to fulfill a fantasy for other collectors worldwide. Abusers frequently use CSAM to revictimize and stalk victims long after the original producer has been identified, and the material can be leveraged to groom other children for future abuse.


The scale of CSAM

Decades ago, distribution of a child’s abuse imagery was slower and less expansive, often limited to direct exchange within offender networks, illegal storefronts, or the mail. However, the explosion of the internet in the ‘90s and early 2000s removed many of the barriers to access.

As technology developed and advanced, abuse imagery could be uploaded and shared with others around the globe in moments.

The full scale of online child sexual abuse and the volume of resulting abuse imagery today is hard to quantify, with estimates only reflective of the material we have discovered.

32 million

incidents of suspected child sexual exploitation were reported to NCMEC’s CyberTipline

88 million

files related to CSAM were reported by registered Electronic Service Providers

Levels of abuse & victim demographics

CSAM is the result of a spectrum of abuse types, but it very often depicts extreme violence and brutality involving young children, some so young they are still preverbal. The types of abuse depicted can be described in four categories:

 

1. Nudity or erotic posing with no sexual activity

2. Non-penetrative sexual activity between children, or between adults and children, or masturbation

3. Penetrative sexual activity between adults and children

4. Sadism or bestiality


Unfortunately, there has been an apparent trend toward more violent and extreme material over time.

 

Data on NCMEC’s actively traded cases show the percentage of cases that fall within each abuse category over time. The category for a case is determined by the highest level of abuse depicted in a series of images. The increased prevalence of cases categorized as 3 (penetrative sexual activity between adults and children) and 4 (sadism or bestiality) indicates an increase in the amount of violent and extreme material in circulation.

 

Data sourced from Thorn


Trends in age & gender

CSAM depicts the exploitation and abuse of children of all ages and genders. Based on available data, CSAM in circulation is more likely to show prepubescent children than older minors.

67%
of all identified child victims with CSAM in circulation online are prepubescent or younger (infants and toddlers).

CSAM is more likely to depict girls than boys, according to the 2022 records of identified victims known to NCMEC’s Child Victim Identification Program (CVIP). However, according to a study conducted by INTERPOL & ECPAT, male victims are more likely to be seen in imagery with higher levels of violence than female victims.


Access & opportunity

Sadly, the majority of CSAM is created by people with legitimate access to the child, such as parents, uncles, neighbors, and family friends.

2 in 3

victims are abused by someone known to them in their offline communities.

Data sourced from NCMEC

While offline access is the primary way abusers find the children they abuse and record in CSAM, the increasing role of technology and the internet in our lives has opened up new ways to target, groom, and exploit young people.

Now, someone seeking to abuse a child does not need to be in the same room with them. They simply need to be connected via the internet.


Self-generated child sexual abuse material (SG-CSAM)

CSAM produced without a clear offender present in the images or physically present to take them is considered SG-CSAM.

These images or videos result from multiple sources, such as:

  • A romantic exchange with a friend from school
  • A child being groomed and extorted to take explicit images of themselves at the demand of an offender online
  • A screen capture of a live stream

Regardless of how these materials are produced, they are classified as CSAM. Once distributed, these images can be weaponized to manipulate the child, potentially for obtaining more images, arranging a physical meeting, or extorting money. Additionally, these images can circulate widely, catering to those seeking specific fantasies. Predators might also use these images to groom other potential victims.


According to Thorn’s 2022 report on Youth Perspectives on Online Safety:

 

1 in 6

minors aged 9-17 have shared their own SG-CSAM.


1 in 4

view this as normal behavior for kids their age.

While these behaviors are more common among teenagers, 1 in 7 kids aged 9-12 still say they have shared their own SG-CSAM.

It is slightly more common for SG-CSAM to be shared as part of an offline relationship. However, roughly 40% of kids who have shared SG-CSAM say they have done it with someone they only know online.

In addition, 42% of minors reported they believed the person they had shared SG-CSAM with was over 18, compared with 66% who believed that person to be under 18.


AI-generated child sexual abuse material (AIG-CSAM)

As technology continues to evolve, so too do the avenues of online child sexual abuse and the production of CSAM. Generative AI technologies are the latest example of this.

This technology’s use in everyday life is still in its early stages, but it has quickly become a mechanism through which to abuse kids.

While currently responsible for a small amount of CSAM, generative AI models are already being manipulated to produce custom CSAM. This includes generating new material depicting survivors of past abuse, as well as custom material of specific kids created from benign imagery, for the purposes of fantasy, extortion, and, in some cases, peer-based bullying within schools and communities.

What’s next?

CSAM, in all its forms, is often the only clue available to locate a child and identify their abuser. Sadly, these images may not be detected by law enforcement until months or years after being recorded, leaving the child at risk for continued abuse.

Exploring the pathways leading to these images can offer critical insights and potential opportunities for intervention before CSAM is created.

 
