
Child Safety Solutions: Hackathon Elevates Image Hashing

May 24, 2016


Philip Hölzenspies is a Software Engineer at Facebook, where he helps maintain the platform’s PhotoDNA infrastructure, which accelerates the identification, removal and reporting of child abuse imagery.

In the (online) Child Safety space, tech companies get to forget about being companies for a bit and focus on what really matters. People who work in this space at tech companies know this, at least in the abstract. It is invigorating, though, to experience it so acutely during a cross-company Child Safety Hackathon. This type of event provides a rare networking opportunity to meet and share insights with others working in the same space. While conferences on child safety cover a much wider range of highly critical areas (law enforcement, signals for social workers, post-traumatic pastoral care, and so on), they’re far removed from the day-to-day job of a software engineer. A hackathon like this provides the opportunity to share war stories, personal drivers and innovative solutions tied to this work.

My job at Facebook is to build and maintain our PhotoDNA infrastructure. Microsoft has made this technology available to many companies for the purpose of fighting Child Exploitation Imagery (CEI). Recognising that start-ups are not always in a position to implement such technology, they even make it available as a cloud service.

Strengthening the Approach to Image Hashing

At a hackathon, you typically try to solve a problem that you don’t work on every day. How often can you sit in a room with the points-of-contact for PhotoDNA infrastructure at Google, Facebook and Microsoft and work with passionate freelancers, people from the National Center for Missing and Exploited Children (NCMEC) and Thorn?

Since the answer to that question is the rather obvious “not very often,” we took the opportunity to work out kinks in PhotoDNA. Like any hashing technology, it has room for improvement. When you bring experts from different companies together to brainstorm about strengthening the industry standard, you learn new things. That said, everybody had run into the most common problems. For example, some images are harder to recognize than others.
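To make “harder to recognize” concrete, here is a minimal sketch of a generic perceptual hash in Python. It is an ordinary average hash, not PhotoDNA (whose algorithm isn’t described here), and the filenames in the usage comment are hypothetical. It only illustrates the general idea behind robust image hashing: small edits to an image should move its hash by only a few bits, so near-duplicates can still be matched, while heavier edits push the distance up.

from PIL import Image


def average_hash(path, hash_size=8):
    # Shrink and desaturate the image so the hash ignores scale and colour changes.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the image-wide mean.
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(h1, h2):
    # Number of bits in which two hashes differ.
    return bin(h1 ^ h2).count("1")


# Hypothetical usage: a lightly re-saved copy usually stays within a few bits
# of the original, while heavy crops or overlays push the distance up.
# distance = hamming_distance(average_hash("original.jpg"), average_hash("resaved.jpg"))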

Another challenge in improving an industry-standard image hashing approach is that you simply cannot change the hashing itself. NCMEC maintains databases of hashes of known child exploitation imagery; if you change how a hash is computed for a photo, it can no longer be compared to the hashes in those databases. We managed to solve a few common problems not by changing the hashes, but by changing how those hashes are compared. The full impact, in terms of an increase in CEI detection, is still to be determined, but I would call the hackathon a great success on the strength of the resulting work alone. That might seem premature, but a chance to share insights like this across platforms is rare.
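To illustrate what a comparison-side change can look like, here is a hedged sketch (again in Python, with nothing assumed about PhotoDNA’s real hash format, distance metric or thresholds). It treats hashes as fixed-length numeric vectors matched against a database by a distance threshold, and widens detection purely on the comparison side, for example by also comparing hashes computed from simple transforms of the candidate image such as its mirror.

def l1_distance(a, b):
    # Sum of absolute element-wise differences between two hash vectors.
    return sum(abs(x - y) for x, y in zip(a, b))


def matches(candidate_hashes, known_hashes, threshold):
    # `known_hashes` are the stored database entries and are never modified;
    # `candidate_hashes` holds the hash of the candidate image plus hashes of
    # simple transforms of it (for example its mirror image), so detection is
    # widened purely on the comparison side.
    return any(
        l1_distance(c, k) <= threshold
        for c in candidate_hashes
        for k in known_hashes
    )

Because only the matching logic changes in a sketch like this, the stored hashes, and therefore compatibility with NCMEC’s databases, stay untouched.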

If you’d like to share your talent and time to stop online child abuse, and get involved in the next Thorn event, please submit a Digital Defenders survey.


