Today we commend Apple for taking a critical step: announcing its commitment to identify and report CSAM within a privacy-forward environment, and detailing the technology it will use to do so.
Apple has outlined a series of advancements that have the potential to reduce online risk for children, including “innovative new technology [that] allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques….”
This is a significant milestone that creates the potential for systemic change, bringing us a step closer to a world where the cycle of trauma is disrupted for survivors, every platform with an upload button is proactively detecting CSAM, and every child has the opportunity to simply be a kid.
Balancing privacy and child safety
At Thorn, we work every day towards our long-term goal of eliminating child sexual abuse material from the internet. As we do this work, we believe in privacy for all, including for children whose sexual abuse is documented and distributed online without their consent. We know that discovering how to defend children from sexual abuse while maintaining user privacy is difficult, and that it will require ingenuity, creativity, and collaboration.
As our VP of External Affairs, Sarah Gardner, said in her recent TEDx Talk about the urgent need to balance online privacy with digital safety for children: “It’s hard and it’s never been done before. But we’re going to have to do it.”
We consistently advocate for the will and resources of some of the brightest minds in technology to be directed toward creating a future free of online child sexual abuse. This is not a battle of one, but a battle of many. There is a growing community of those who demand an end to the spread of child sexual abuse material (CSAM); who say yes to solving the world's most difficult challenges; who recognize that we, the builders, creators, and defenders of the internet, choose to use technology for good.
And to be successful we need companies like Apple — and many others — to continue to collectively turn their innovation and ingenuity to this issue, creating platforms that prioritize both privacy and safety.
It’s something we have to do — for every child victim and every survivor whose most traumatic moments have been disseminated across the internet. And as is often the case when doing this work, someone has to take the first step.
Taking the next steps
It’s difficult to overstate the importance of this moment, yet it’s still a first step. We look forward to continuing to work with Apple to understand the impact of this technology, and to continue to optimize and build upon it.
As technology continues to move towards private environments, Thorn has asked the question before: how will platforms maintain the progress of the past decade in detecting, removing, and reporting the millions of child sexual abuse files that get shared daily across the internet?
We now have one answer, and we will continue to advocate for and work with global technology leaders to channel more will, resources, and ingenuity toward this issue. We must keep collaborating across industry, law enforcement, NGOs, and policy to ensure the children who need us the most are defended with the best and most innovative ideas we can collectively conjure and build.
Today we are one step closer to the internet we believe in, one where every child can be safe, curious, and happy. I look forward to taking many more steps, together, to make that world a reality.
—Julie Cordua, Thorn CEO