Today, Google announced plans to begin rolling out end-to-end encryption for Android Messages. As experts working at the intersection of technology and child protection, we understand that the real-world impact of this decision will be a step backward for child safety. The minute end-to-end encryption is enabled, it will become impossible for Google to detect, remove, and report the illegal and harmful images and videos of child sexual abuse shared on this service.
The trading of child sexual abuse material (CSAM, known legally in the U.S. as child pornography) on the Internet is a well-documented crisis. In 2019, over 69 million files of CSAM were reported to authorities, representing growth of more than 15,000% in this horrific content over the past 15 years. And this is only what has been detected; we know there is far more that goes undetected.
As more companies move to encryption, we continue to ask: How will you maintain the progress of the past decade in detecting, removing, and reporting the millions of child sexual abuse files that get shared daily across the internet?
Last year, Google filed nearly half a million reports of child sexual abuse material across some of their platforms. These reports to the National Center for Missing and Exploited Children are critical to the removal of abuse content across the internet, and can lead to the recovery of a child in immediate need. Removing abuse content from the web also disrupts the cycle of trauma for survivors who are re-victimized with every share, and reduces the demand for content from abusers.
Google has demonstrated that a balance between privacy and child protection is possible. The industry needs their continued leadership in this space.
Thorn supports privacy. Privacy for every person, including the children whose abuse is circulating online. While some have framed a false binary between privacy and child safety, it is not a zero-sum game. We can have both.
Google’s announcement has us asking: How are they going to ensure that their platforms are not being used for the illicit trade of child sexual abuse material?
Google’s plans come just one year after Facebook announced its own intention to encrypt Messenger. Now, as the largest industry actors move in this direction, we must name the trade-offs of this trend. If tech companies continue to choose encryption without solutions in place for detecting child sexual abuse material, the world stands to lose 99% of its intelligence on CSAM. This means that abusers will be able to share illegal and harmful child sexual abuse material undetected on the same platforms that we, and our children, use every day.
Early in the new year, Thorn will convene technical, product, and policy experts from within industry to build and establish middle-ground solutions that both ensure privacy and protect children. We urge Google to join us in that effort.