We’re disappointed in Apple’s decision to pause its work to implement child safety features without providing a new roadmap or timeline for implementation.
We expect Apple to publish a detailed timeline with clear deliverables, demonstrating how it will uphold its commitment to improving child safety and implement scalable detection of child sexual abuse material (CSAM) in iCloud Photos.
We can create solutions in which the privacy rights of all people – adults and children alike – are balanced and respected. If this pause means Apple can deliver an even stronger solution, we look forward to working with the company to strengthen its efforts.
Inaction is not an option.
Every child whose sexual abuse continues to be enabled by Apple’s platforms deserves better.