
New Thorn Research Examines Youth Experiences and Attitudes about Online Grooming

June 6, 2022

4 Minute Read

Many of us remember being taught about “stranger danger” as kids. It seemed fairly straightforward at the time: Be careful around strangers. Avoid talking to people you don’t know.

But in this digital age, socializing has completely changed – and risks have, too. Many young people make new friends online every day. They don’t know these individuals, but they don’t consider them strangers, either.

So where does this leave caregivers, educators, and platforms when it comes to preventing abuse online? What happens when online interactions cause the line between “stranger” and “friend” to disappear?

The first step to answering these questions is to understand the full scope of the problem.

WHAT DOES THE RESEARCH TELL US?
New research from Thorn – Online Grooming: Examining risky encounters amid everyday digital socialization – finds that 4 in 10 kids online (40%) have been approached by someone they thought was attempting to “befriend and manipulate” them.

It also shows that roughly one in three young people said friends they make online are among their closest confidants.

OTHER KEY FINDINGS INCLUDE:

  • Young people view flirting with or dating adults online as common.
    In fact, 1 in 4 kids aged 9-12 sees it as normal for kids their age to date adults aged 18-20.
  • Minors are regularly approached online by someone they believe is attempting to “befriend and manipulate” them.
    4 in 10 minors (40%) said they have been approached online by someone they believe was attempting “to befriend and manipulate” them, with 47% of teen girls saying they have experienced this.
  • Young people are frequently asked for nudes by contacts they only know online.
    40% of minors have experienced cold solicitations for nudes online, including more than 1 in 4 (28%) of 9-12-year-olds.
  • Online-only contacts often ask young people to move conversations from public platforms to private chats, increasing vulnerability and creating opportunities for abuse.
    Nearly 2 in 3 minors (65%) said an online-only contact has invited them “to move from a public chat into a private conversation on a different platform.”

You can read more details about the research here.

WHAT SHOULD PARENTS AND CAREGIVERS KNOW?
Parents have a unique opportunity to help kids navigate online well-being and safety. But at Thorn, we know that these conversations aren’t easy to have. And no caregiver is alone in this, because we’re here to help.

In September 2021, we launched Thorn for Parents, a digital resource hub designed to assist parents and caregivers in having earlier, more frequent, and judgment-free conversations with kids about digital safety. 

There is no perfect way to navigate these tricky situations. Like most other things in parenthood, there really is no manual for getting it “just right.” But with the right resources and tools, we can make strides in helping defend children from harm online – together.

WHAT IS THE ROLE OF PLATFORMS IN PREVENTING AND ADDRESSING ONLINE GROOMING?
Platforms, including messaging services, must continue to improve and prioritize reporting functionality and their ability to respond quickly.

As things stand, the burden falls on children to defend themselves and to report suspicious and uncomfortable behavior. We need to think collectively about shifting that burden to the platforms themselves so that kids are free to safely explore online spaces.

This critical work is already in motion – but we have a long way to go before children are safe online.

The National Center for Missing and Exploited Children (NCMEC) recently released its annual overview of the number of reports of child sexual abuse material (CSAM) it received in 2021. The findings are encouraging, especially when looking closely at the role platforms played in reporting CSAM last year.

The NCMEC report shows that 230 companies across the globe are now deploying tools to detect CSAM. That’s a remarkable 21% increase since 2020.

There is still much more work to be done – but the significant uptick in the number of platforms detecting CSAM has led to more reports being filed and more CSAM hashes created, making the fight against the viral spread of abuse material ever more effective.

WHERE DO WE GO FROM HERE?
On the internet, we’ve created a place where people explore freely — not only because of increased access but also because of the internet’s promise of anonymity and privacy. This is a good thing. 

However, the reality is that individuals continue to weaponize this technology to harm kids. We can, and must, be fully aware of this reality as we design environments that proactively minimize risk, deliver relevant programs that empower young people to explore safely, and create scalable response systems to protect kids when they need help.

Thorn is committed to building a community that is child-centered and child-supported, globally connected, proactive, and accountable.

Together, we can make the internet safer and defend all children.

For media inquiries, contact Cassie Coccaro at cassie.coccaro@wearethorn.org.


