In 2023, your unwavering support was pivotal as we faced rapid advancements in powerful technologies, new dangers to children online, and the continued spread of child sexual abuse material (CSAM) across the internet. Through it all, you remained at the heart of Thorn’s mission to defend children from sexual abuse.
With your generosity, Thorn stayed nimble and resolute, responding to emerging threats to children. Your role in protecting kids has been profound, and today I’m proud to share the impact we’ve achieved together.
Because of your support, we achieved the milestones highlighted throughout these pages. As you read this report, I hope you feel our gratitude for your help in driving these incredible accomplishments.
The reality is that child sexual abuse is happening everywhere, in our communities and to children we know — real kids with big hopes and dreams.
But you and I, along with our incredible staff at Thorn, are improving the way the world responds to child sexual abuse. Thank you for your dedication to creating a much brighter future: a world where every child is free to simply be a kid.
Julie Cordua, CEO
Total child sexual abuse material (CSAM) files detected
Total files processed
Average triage time saved by investigators using the CSAM Classifier
Research reports published
NoFiltr social engagements
Parents signed up for conversation tips
For kids today, the internet is a place for self-discovery, socializing, and building meaningful connections. But it also brings significant risks. Talking about these risks with youth — early and often — can make a tremendous difference in keeping kids safe online. Thorn’s youth and parent education programs do exactly that, fostering vital conversations in safe, open, nonjudgmental ways.
In 2023, our youth-centered program, NoFiltr, received over one million engagements on its social media — reaching youth with critical prevention and support messaging in a fun and informative way. Additionally, NoFiltr’s peer-to-peer conversations continued to elevate youth voices around the issues they face online every single day.
Prevention content pieces published
Engagements across NoFiltr social channels
Prevention-related livestream views
Educational quizzes submitted
NoFiltr was such a transformative experience for me! It’s unique in that it empowers youth to advocate for digital safety. […] I’m so grateful for the networking, professional development, and internet safety education I got.
Last year, the NoFiltr Youth Innovation Council connected with tens of millions of youth on the community app Discord. Partnering with Discord for Safer Internet Day, they discussed how to build safe and healthy digital habits.
“My experience collaborating with Discord not only grew my knowledge about digital literacy, [but] it was [also] very fulfilling to know that such a large company actually wants to hear and implement the ideas of younger generations.” —Cayden, 16
Discussing online risks — like sharing nudes — can be pretty awkward and tough for many parents and caregivers. Fortunately, Thorn for Parents is there to help. And in 2023, over 10,000 parents visited our resource hub for tips on conversation starters, as well as other tools to help them have judgment-free — and potentially lifesaving — conversations with their kids to prevent abuse before it starts.
Your support helps us create these invaluable resources.
Parents signed up for tips
Visitors to our parent resource hub
Once I learned about the reality of abuse so many children face every day, I couldn’t turn away from it. I’m honored to support Thorn and to be part of the life-changing work they’re doing. I hope everyone who knows about this issue stands with me.
Thorn doesn’t only support youth and parents navigating these everyday challenges. In 2023, we also helped those on the front lines of defending children from sexual abuse: law enforcement officers.
When investigating child sexual abuse cases, law enforcement officers face daunting challenges, including the overwhelming task of sifting through mountains of digital evidence. The forensic review of those files can be time-consuming and traumatic for officers exposed to such content. That’s where Thorn’s CSAM Classifier is a game changer. In 2023, our technology helped investigators significantly reduce the time it took to identify victims, so they could remove those children from harm.
Average triage time saved by users of our CSAM Classifier
Law enforcement agencies using Thorn tools
Manual review of digital evidence in child sexual abuse cases can take months and span phones, laptops, and hard drives. Our CSAM Classifier automates and speeds up this forensic review. Using predictive AI, the classifier detects new and previously unreported CSAM, material that has not yet been verified and catalogued. Finding new CSAM helps officers identify child victims and solve cases faster.
The classifier also helps protect the well-being of officers by reducing their exposure to harmful content.
I learned how prevalent child sexual abuse is in the U.S., and I felt called to defend all children (including my own) from predators. I do that by donating to Thorn, an organization that even law enforcement relies on to rescue and protect victims.
Getting the CSAM Classifier into as many law enforcement agencies as possible is critical to accelerating their investigations. In 2023, we proudly launched a beta partnership with Griffeye, the Sweden-based world leader in digital media forensics for child sexual abuse investigations. Our CSAM Classifier is now available in Griffeye Analyze, a platform used by law enforcement worldwide.
The viral spread of CSAM has compounding effects, from revictimizing survivors to normalizing this horrific behavior. That’s why Thorn has an audacious goal: eliminate CSAM from the internet.
Even after a child is rescued, images and videos of their abuse can circulate online for years, continuing the cycle of trauma. At Thorn, we equip tech platforms with advanced tools, insights, and connections to halt the spread of abusive content and prevent revictimization. In 2023, more companies than ever partnered with us in these efforts.
Total files processed
Up 70%+ from 2022
Companies using Thorn’s CSAM detection products
CSAM files detected
Up 365%+ from 2022
Files classified as potential CSAM
CSAM files matched
Last year, our comprehensive CSAM detection solution empowered tech platforms to detect more CSAM files than ever before.
Our team of experts offers guidance to platforms on developing child safety policies, on-platform interventions, new safety features, and prevention strategies — and in 2023, we began child safety red teaming for AI models.
I’m so thankful for Thorn’s team. It really does restore a bit of hope in this world to know there are people dedicated to defending children who can’t defend themselves.
Despite the huge volume of content uploaded to Flickr daily, the company has always prioritized safety.
By deploying Safer’s CSAM Image Classifier, Flickr’s team could detect new and previously unreported CSAM images they likely wouldn’t have discovered otherwise.
One classifier hit led to the discovery of 2,000 previously unverified images of CSAM and an investigation by law enforcement.
Thorn’s unique position at the intersection of technology and the fight against child sexual abuse is supported by our groundbreaking research, which sheds light on this dark and highly complex issue.
Since 2019, Thorn has undertaken research initiatives that deepen our understanding of the experiences and risks young people face online. Through this work, we gain powerful insights that allow us to stay nimble in the ever-accelerating digital landscape and respond to emerging threats quickly. Our findings inform the programs we create and provide critical insights for the broader child safety ecosystem.
Original research reports published
Report media impressions
LGBTQ+ Youth and Online Risks
We examined the heightened risks of sexual exploitation online for LGBTQ+ youth and how this group responds to harmful digital experiences.
Read the Report
In a collaborative initiative, we conducted an extensive study on the risks that AI poses in child sexual exploitation, publishing some of the earliest research on the issue.
Read the Report
Our annual research with youth showed continued trends in sharing nudes and in online sexual interactions with adults, as well as an ongoing need for youth-friendly safety tools.
Read the Report
Through a collaborative study, we identified demographic, technology-use, and social factors associated with youth sharing self-generated sexual content.
Read the Report
We surveyed 1,000 minors about their perspectives on disclosing harmful online sexual interactions, including their use of reporting and safety tools.
Read the Report
Thorn’s deep expertise in the ever-changing digital threats to youth means we also take our efforts to Capitol Hill, Brussels, and beyond, defending children from sexual abuse and exploitation at the policy level.
Thorn works with politicians globally — especially in the U.S. and EU — to help them understand why it’s critical to protect children from harm, and why technology must be part of the solution.
Policy discussions
Technical briefings
Speaking engagements at summits
Last year, we were instrumental in EU child safety policy and regulation discussions, advising key members of government, industry, and nongovernmental organizations (NGOs). Our coalition, the European Child Sexual Abuse Legislation Advocacy Group (ECLAG), hosted an event in Brussels to discuss the ongoing EU Child Sexual Abuse Regulation. Thorn’s director, Cathal Delaney, spoke on a panel highlighting the importance of this regulation for children’s safety online. Here at home, we advanced our efforts across U.S. legislation to bring an end to sexual harms against children online.
Leading up to the AI Safety Summit, Thorn participated in multiple UK government panels alongside Policing Minister Chris Philp.
We joined industry roundtables on child safety policy with leaders from Discord and WhatsApp.
Our teams conducted technical briefings for members of the European Parliament, representatives of the 27 EU member states, and NGOs.
In the U.S., we partnered with the End Online Sexual Exploitation and Abuse of Children (OSEAC) Coalition to help advance critical pieces of legislation.
Thorn is doing some of the most important work in the world. This issue affects every person, whether directly or indirectly, hidden in the shadows where we work and live. I’m beyond grateful to help Thorn build a brighter, safer future for all kids.
In 2023, we continued to deepen our understanding of emerging technologies, influence the global conversation on child safety, and collaborate with other leaders to strengthen our collective ability to create a safer world for kids.
Already, the era of generative AI is introducing threats to children. To truly understand its implications, Thorn partnered with researchers at the Stanford Internet Observatory and, in 2023, released Generative ML and CSAM: Implications and Mitigations, a leading-edge report detailing how AI can be misused to further sexual harms against children.
Thorn took the main stage at the AWS re:Invent conference, a significant moment bridging cutting-edge technology and social responsibility.
In 2023, Thorn staff were interviewed as thought leaders by seven national publications.
Everything we achieve is only possible because of our generous donors. By believing in our work and supporting it financially, our Thorn community propels our mission to defend children forward. In this way, our supporters — like you — are true heroes, helping us build a world filled with childhood joy.
We need your help to defend children from sexual abuse. When you make a gift to Thorn, you take a stand to protect kids and help create a safer, brighter future for them.
Together, we are unstoppable.
Your support enables us to make a difference for children every day.
Financial data is unaudited.
We envision a world where kids experience the joy of childhood free from sexual abuse. And where child sexual abuse material is eliminated from the internet.
Thorn won’t stop until we get there. And we’re grateful to have you by our side. Thank you for your support in 2023 and 2024, and for the years to come.