New Technology to Help Companies Keep Young People Safe

June 21, 2023

5 Minute Read

In response to the growing threat of inappropriate relationships between adults and children on digital platforms, Thorn and the Tech Coalition, an alliance of global tech companies working together to combat child sexual exploitation and abuse online, are pleased to announce a new initiative that equips the tech industry to respond to this issue.

In partnership with multiple Tech Coalition members committed to this effort, Thorn works closely with trust and safety and engineering teams to tailor, train, and evaluate technical solutions that identify and address attempts to groom and exploit young people.

Most often, companies develop novel technology to enforce their own platform’s child safety policies and terms of service, tailored to how harm manifests on their specific services. By developing technical solutions that are useful and usable across a range of platforms offering text-based communication subject to enforcement measures, Thorn and the Tech Coalition can help more companies keep young people safe.

Online grooming is extremely common. Children are increasingly vulnerable because they regularly connect with people they know only online yet don’t consider strangers, even when that online friend is an adult.

In a recent report about grooming, Thorn discovered that:

  • 1 in 3 (32%) young people said that the friends they make online are among their closest friends. Only 14% of minors categorized their online-only contacts as “strangers.”

  • 2 in 5 kids online (40%) have been approached by someone they thought was attempting to “befriend and manipulate” them.

  • 1 in 4 minors stayed in contact with online-only connections despite being made to feel uncomfortable.

  • 40% of minors have been contacted online by someone they had never engaged with before who immediately solicited nude images, including roughly 3 in 10 (29%) of 9-12-year-olds.

Likewise, the National Center for Missing and Exploited Children (NCMEC) saw an 82% increase in “Online Enticement” reports from 2021 to 2022, including scenarios of grooming and related harms such as financial sextortion.

The concept of grooming is not new, but technology and the Internet have changed how it manifests in daily life. Through online gaming, livestreaming, metaverse platforms, instant messaging, social platforms, and more traditional photo- and video-sharing platforms, it has never been easier for adults seeking to abuse children to gain access, build trust, and manipulate children into harmful situations on a global scale. These platforms create a complex ecosystem where harm may begin on one platform and then move to another, and where the communities that minors form with each other can be targeted for exploitation and abuse.

Rapid advances in perpetrator tactics require tech companies to innovate even faster to address the threat. Companies face distinct challenges in identifying when this harm occurs in text-based exchanges. The sheer volume of text on a chat platform makes it difficult to sift through and find instances in which content violates a platform’s child safety policies. Similarly, the volume of user reports can make triaging and sorting out false positives infeasible for trust and safety teams. That’s why, over the past several years, Thorn’s team has been building an NLP (natural language processing) classifier: a machine learning model that detects when messages fall into defined “classes” related to grooming (such as exposure to sexual material or seeking an in-person meetup with a minor) and produces an overall score for how closely a conversation relates to grooming.

Here’s how it works:

Grooming Risk: 90%

  • User 2: u got any other girls my age u chat with? [Age: 19.3%]

  • User 1: one

  • User 2: yeah

  • User 2: where does she live?

  • User 1: shes in ny [PII: 98.9%]

  • User 2: how old is she? [Age: 98.5%]

  • User 1: shes 11 but she looks mature in her profile pic  [Age: 87.6%, PII: 39%]
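To make the structure of this output concrete, here is a minimal sketch of how a platform might represent per-message class scores and the conversation-level risk score, and flag conversations for trust and safety review. All names and thresholds here are hypothetical illustrations, not Thorn’s actual API:

```python
from dataclasses import dataclass, field

# Hypothetical data structures for illustration; Thorn's actual output format
# is not public. The class scores mirror the bracketed labels in the example above.

@dataclass
class MessageScores:
    sender: str
    text: str
    class_scores: dict[str, float] = field(default_factory=dict)  # e.g. {"age": 0.985}

@dataclass
class ConversationResult:
    grooming_risk: float           # conversation-level score in [0, 1]
    messages: list[MessageScores]  # per-message class predictions

def flag_for_review(result: ConversationResult,
                    risk_threshold: float = 0.8,
                    class_threshold: float = 0.5) -> list[MessageScores]:
    """If the conversation exceeds the risk threshold, return the messages to highlight."""
    if result.grooming_risk < risk_threshold:
        return []
    return [m for m in result.messages
            if any(s >= class_threshold for s in m.class_scores.values())]

# The example conversation above, scored at 90% grooming risk:
conversation = ConversationResult(
    grooming_risk=0.90,
    messages=[
        MessageScores("User 1", "shes in ny", {"pii": 0.989}),
        MessageScores("User 2", "how old is she?", {"age": 0.985}),
    ],
)
print([m.text for m in flag_for_review(conversation)])  # ['shes in ny', 'how old is she?']
```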

Thorn has taken advantage of recent advances in artificial intelligence to develop comprehensive solutions for grooming detection. Rather than waiting for users to report harm after it has occurred, Thorn’s ensemble of classifiers enables real-time detection and prioritization of conversations where a child may be in immediate danger, including grooming situations, pressure on children to send illicit imagery, and other forms of inappropriate contact between adults and minors. A company can use these classifiers in unencrypted spaces where users can expect their communications, even one-on-one exchanges with another user, to be subject to the company’s policies and terms of service.

At the base of the ensemble is a language model that learns grooming-specific language patterns. Additional classifiers are layered on top of the language model to predict the different grooming-related categories for each message in a conversation, as well as a risk score for the conversation as a whole. Trust and safety teams can then use this information to quickly identify cases and prioritize them for review by a trained team member. Problematic conversations are promptly surfaced to these teams, with the parts of the conversation that violate platform policies highlighted for review.
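As an illustration only, and not Thorn’s published architecture, an ensemble like this might be sketched in PyTorch as a set of per-category classification heads over message embeddings produced by the base language model, plus a conversation-level risk head. The category names, hidden size, and mean-pooling choice below are all assumptions:

```python
import torch
import torch.nn as nn

# Illustrative sketch only: category names, dimensions, and pooling are
# assumptions, not Thorn's actual model. Per-message embeddings are assumed
# to come from the base language model described above.
CATEGORIES = ["sexual_content", "age", "pii", "meetup_request"]

class GroomingClassifierHeads(nn.Module):
    def __init__(self, hidden_dim: int = 768):
        super().__init__()
        # One binary classifier per grooming-related category, scored per message.
        self.category_heads = nn.ModuleDict(
            {name: nn.Linear(hidden_dim, 1) for name in CATEGORIES}
        )
        # A conversation-level risk head over the mean of the message embeddings.
        self.risk_head = nn.Linear(hidden_dim, 1)

    def forward(self, message_embeddings: torch.Tensor):
        # message_embeddings: (num_messages, hidden_dim)
        per_message = {
            name: torch.sigmoid(head(message_embeddings)).squeeze(-1)
            for name, head in self.category_heads.items()
        }
        risk = torch.sigmoid(self.risk_head(message_embeddings.mean(dim=0)))
        return per_message, risk.squeeze()

# Example: score a 7-message conversation (random embeddings stand in for encoder output).
heads = GroomingClassifierHeads()
per_message_scores, conversation_risk = heads(torch.randn(7, 768))
```

Keeping the classification heads separate from the shared encoder lets each category be tuned and thresholded independently, while the grooming-specific language representations are learned once in the base model.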

The goal of the joint initiative between Thorn and the Tech Coalition is to build tools that enable those tasked with enforcing a platform’s child safety policies and terms of service to better identify, prevent, and mitigate harm done to children. With shared goals of developing best-in-class technology, lowering barriers to adoption, and facilitating collaboration through partnerships, Thorn and the Tech Coalition are proud to use this technical solution to disrupt online grooming and prevent harm to children in digital spaces.

What you can do:

To understand more about the state of online grooming and considerations for detection, response, and prevention, see the Tech Coalition’s latest research.

————

Originally published: June 20, 2023 on technologycoalition.org

Rob Wang, Staff Data Scientist, Thorn

Dr. Rebecca Portnoff, Director of Data Science, Thorn

Lauren Tharp, Tech Innovation Lead, Tech Coalition


