54% of children across the world have experienced at least one harmful sexual encounter online.
That’s one of many startling statistics from WeProtect’s most recent Global Threat Assessment, based on a survey of 5,000 respondents now aged 18–20. On the heels of WeProtect’s report, the National Center for Missing and Exploited Children (NCMEC) revealed that it had surpassed 100 million received reports of child sexual exploitation. Almost all of these reports relate to images and videos of children being sexually abused circulating online, and nearly 80% of those files depict children under the age of 12, according to the Canadian Centre for Child Protection.
The more we learn about the risks children face online, the clearer it becomes that no pocket of the internet, no pocket of the world, is immune. It’s a global, pervasive problem—and what happens next in the European Union (EU) may make the difference in responding with the urgency and scale this issue demands.
At Thorn, our mission is to eliminate child sexual abuse from the internet. To get there, the EU will play a pivotal role in combating the spread of child sexual abuse material (CSAM) online. The effects of what the EU does will be felt in the United States and beyond.
We stand at a critical moment as the right to privacy—something we strongly believe in at Thorn—is often falsely contrasted with tools that help platforms to detect, remove, and report CSAM.
The truth is that we can have both: we can protect the right of every user to privacy while defending children from sexual abuse. And we need the EU’s leadership to set a global example that will move us closer to universal adoption of the proactive detection, reporting, and removal of CSAM.
Here’s how lawmakers in the EU can help us defend children across the world:
Make detection, removal, and reporting mandatory
In the United States, platforms are legally required to report CSAM when they find it on their services. In the EU, platforms can simply block or delete CSAM without reporting it, meaning perpetrators aren’t held accountable and victims who would otherwise be identified and removed from harm may continue to be abused.
Detecting and reporting CSAM to the relevant law enforcement or reporting bodies should be mandatory for all EU-based internet service providers and platforms. To accomplish this, we need a clear system that avoids duplicate reports across jurisdictions and enables effective collaboration between enforcement bodies at a global scale.
The EU can set new standards for the safety of children, providing a clear legal framework for tech companies that encourages collaboration and proactive solutions.
Create an EU center that fits into the global child safety ecosystem
The EU needs a centralized entity to streamline and protect the most sensitive data—the documentation of a child’s sexual abuse—while working in partnership with platforms and law enforcement to develop best practices and effective tools to stop the spread of CSAM.
The European Union essentially needs its own version of the U.S.’s NCMEC. This entity must fit seamlessly into the global ecosystem while acting as a regional hub. The efficiency such an entity would provide could greatly increase the removal and reporting of CSAM, disrupting survivors’ cycle of trauma while accelerating the identification of victims in potential danger.
The flow of data is critical in ensuring collaboration across borders to identify child victims and prosecute criminals. This new center will need to ensure systems are able to talk to each other so data can be shared securely with law enforcement entities globally.
Encourage innovation through legislation
Legal certainty on the use of smart, secure technologies such as hash-matching, classifiers, and anti-grooming tools is vital and should be considered fundamental. We must also ensure that innovation is not only legally protected but actively encouraged, so that tools can keep pace with the rapidly evolving threats in this space.
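As a rough illustration of the first of these techniques, hash-matching compares a digital fingerprint of an uploaded file against a list of hashes of previously verified CSAM supplied by a clearinghouse. The sketch below is a simplified, hypothetical example: production systems typically use perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, whereas a cryptographic hash like SHA-256 only catches byte-identical copies; the hash list and file contents here are placeholders.

```python
import hashlib

# Hypothetical hash list of known, verified material, as would be
# distributed by a clearinghouse. (This entry is the SHA-256 digest
# of the placeholder bytes b"test", used purely for demonstration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears in the hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_hash(b"test"))   # True: digest is in the list
print(matches_known_hash(b"other"))  # False: no match
```

Because only hashes are compared, a platform can check uploads against the list without retaining or inspecting the underlying imagery beyond computing the fingerprint, which is part of why such tools are compatible with strong user privacy.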
Any new legislation from the EU should allow innovative tools to be created, tested, and used. We need transparency and safeguards—but not in a form that renders the technology useless by handing perpetrators a roadmap to reverse-engineer or circumvent solutions.
The interim derogation to the ePrivacy Directive adopted in the EU last summer struck a good balance, putting in place prior consultation and stronger reporting standards without imposing requirements so burdensome that they prevent technologies from being created, tested, and implemented. This balance is important and should be maintained in any long-term legislation.
Global systems as they are today aren’t doing enough to defend children from sexual abuse online. We need the EU’s leadership now to make a long-term commitment, backed with thoughtful legislation, to building a safer internet—one where every child can simply be a kid.
For media/press inquiries, contact Caroline Schröder at Caroline.Schroeder@fgh.com.