Child sexual abuse material (CSAM) refers to sexually explicit content involving a child. Visual depictions can include photographs, videos, or computer-generated images indistinguishable from depictions of an actual minor.
Various terms are used across the globe to describe this material.
While U.S. federal law still refers to this material as “Child Pornography,” CSAM is now the preferred language, and efforts are underway in many jurisdictions to update the terminology in legal guidelines.
CSAM serves as a record of a child’s abuse for their abuser, and as material to fulfill a fantasy for other collectors worldwide. Abusers frequently use CSAM to revictimize and stalk victims long after the original producer has been identified, and the material can be leveraged to groom other children for future abuse.
Decades ago, distribution of a child’s abuse imagery was slower and less expansive, often transmitted via direct exchange among networks, illegal storefronts, or through the mail service. However, the explosion of the internet in the ‘90s and early 2000s removed many of the barriers to access.
As technology developed and advanced, abuse imagery could be uploaded and shared with others around the globe in moments.
The full scale of online child sexual abuse and the volume of resulting abuse imagery today is hard to quantify, with estimates only reflective of the material we have discovered.
CSAM is the result of a spectrum of abuse types but very often depicts extreme levels of violence and brutality involving young children — sometimes so young, they are still preverbal. The types of abuse depicted can be described in four categories:
1. nudity or erotic posing with no sexual activity
2. non-penetrative sexual activity between children or between adults and children, or masturbation
3. penetrative sexual activity between adults and children
4. sadism or bestiality
This table showcases the percentage of NCMEC’s actively traded cases that fall within each abuse category over time. The category for a case is determined by the highest level of abuse depicted in a series of images. The increased prevalence of cases categorized as 3 (penetrative sexual activity between adults and children) and 4 (sadism or bestiality) indicates an increase in the amount of violent and extreme material in circulation.
Data sourced from Thorn
CSAM exists to depict the exploitation and abuse of children of all ages and genders. Based on available data, CSAM in circulation is more likely to show prepubescent children than older minors.
CSAM is more likely to depict girls than boys, according to the 2022 records of identified victims known to NCMEC’s Child Victim Identification Program. However, according to research conducted by INTERPOL & ECPAT, male victims are more likely to appear in imagery with higher levels of violence than female victims.
Sadly, the majority of CSAM is created by people with legitimate access to the child, such as parents, uncles, neighbors, and family friends.
Most victims are abused by someone known to them in their offline communities.
Data sourced from NCMEC
While offline access is the primary way abusers find the children they abuse and record in CSAM, the increasing role of technology and the internet in our lives has opened up new ways to target, groom, and exploit young people.
Now, someone seeking to abuse a child does not need to be in the same room with them. They simply need to be connected via the internet.
CSAM produced without any clear offender present in the images, or present to take them, is considered self-generated CSAM (SG-CSAM).
These images or videos can result from multiple sources.
Regardless of how these materials are produced, they are classified as CSAM. Once distributed, these images can be weaponized to manipulate the child, potentially for obtaining more images, arranging a physical meeting, or extorting money. Additionally, these images can circulate widely, catering to those seeking specific fantasies. Predators might also use these images to groom other potential victims.
A notable share of minors aged 9-17 report having shared their own SG-CSAM, and many view this as normal behavior for kids their age.
While these behaviors are more common among teenagers, 1 in 7 kids aged 9-12 still say they have shared their own SG-CSAM.
It is slightly more common for SG-CSAM to be shared as part of an offline relationship. However, roughly 40% of kids who have shared SG-CSAM say they have done it with someone they only know online.
In addition, 42% of minors reported they believed the person they had shared SG-CSAM with was over 18, compared with 66% who believed that person to be under 18.
As technology continues to evolve, so too do the avenues of online child sexual abuse and the production of CSAM. Generative AI technologies are the latest example.
This technology’s use in everyday life is still in its early stages, but it has quickly become a mechanism through which to abuse kids.
While currently responsible for a small share of CSAM, generative AI models are already being manipulated to produce custom CSAM. They are being used to generate new abuse material depicting historical survivors, and custom material of specific kids created from benign imagery, for the purposes of fantasy, extortion, and, in some cases, peer-based bullying within schools and communities.
CSAM, in all its forms, is often the only clue available to locate a child and identify their abuser. Sadly, these images may not be detected by law enforcement until months or years after being recorded, leaving the child at risk for continued abuse.
Exploring the pathways leading to these images can offer critical insights and potential opportunities for intervention before CSAM is created.
The issue of CSAM is massive. But every day we work to accelerate hope.