An April 3 deadline could create a dangerous gap in child safety across Europe
April 3, 2026
6 Minute Read
April 3, 2026 update: The legal exemption has expired, leaving a crucial child safety gap in Europe. To find out more about what this means for platform safety, please visit our explainer blog.
——–
On April 3rd, a temporary legal exemption allowing platforms to detect child sexual abuse material in Europe is set to expire — and the consequences could be significant. Thorn’s Head of Communications, Cassie Coccaro, sat down with Director of Policy Emily Slifer to break down what’s happening, why it matters beyond Europe, and what’s at stake for children if this legal gap takes effect.
Transcript
Cassie Coccaro, Thorn
Okay, there is something happening in Europe right now that is not getting nearly enough attention, and it should. On April 3rd, platforms could actually lose the legal ability to detect child sexual abuse material across the EU. Not because anyone wants this, but because policymakers couldn’t reach an agreement in time. So I’m sitting down today with Emily Slifer. She’s Thorn’s Director of Policy, she’s based in Brussels, and she’s been watching this situation unfold very closely.
She’s going to break down for us what this means, why the stakes are so high, and why, even though this is happening in Europe, the whole world needs to care. So thank you so much for joining me to have this chat, Emily.
Emily Slifer, Thorn
Thanks so much, Cassie, for having me.
Cassie Coccaro, Thorn
Okay, so first, if you could just set the scene, what’s happening in Europe right now that has you worried?
Emily Slifer, Thorn
So yeah, you said it at a high level, but come April 3rd, there will no longer be a legal basis that allows companies to detect child sexual abuse material. And like you said, it’s not because that’s what people want; in a way, politics got in the way here. It should have been a quick, easy fix: all policymakers had to do was extend a piece of legislation. Instead, on April 3rd, that legislation will expire, and there will be no legal basis for detection.
Cassie Coccaro, Thorn
Okay, so from what I understand from our conversations, we’ve actually seen this before, right? What happened the last time there was a gap? Can you walk us through that a little bit?
Emily Slifer, Thorn
Yes, so this was about five years ago, in 2021, when they first had to draft this legislation as a fix. There was about a seven-month gap in which there was no legal basis for the detection of CSAM. Most companies decided to take that risk on, a couple chose not to, and the result was a 58% reduction in files reported to NCMEC.
That equates to roughly 2.5 million pieces of abuse material. So quite a lot of material wasn’t found, and as you know, Cassie, it’s not just files. Those are children. That is abuse material that is not being taken down, removed, and reported to law enforcement.
Cassie Coccaro, Thorn
Yeah, this sounds like a pretty serious situation. So we’ve seen it before, but I’ve heard you say that this time it might actually be worse. Why is that?
Emily Slifer, Thorn
I think there are two things. One, the technology is different. We’re in a different era at this point. We’re not only seeing greater volumes of CSAM, we’re seeing it become more violent and more aggressive. And then on top of that, on the politics side, because of this expiration there isn’t really a clear path forward on how to fix it. It’s going to take quite a lot of work and quite a lot of time to find a new legislative solution. So this could go on for much longer than seven months this time.
Cassie Coccaro, Thorn
Wow, okay. The thing that gets me is that, working at Thorn, I know the companies actually want to do this, even with all the news lately about safety issues on tech platforms. This isn’t necessarily a story about tech refusing to act, right?
Emily Slifer, Thorn
Not at all. This is very much a policy problem. As I’ve been following this, we’ve seen the companies come out very publicly and say that they want to continue doing what they’re doing. Nobody wants CSAM on their platforms, and they want to be able to do what is necessary. They’re the ones who know how to innovate fastest and create solutions, and they want to be able to do that. So this isn’t about a lack of will from the companies. It’s about a lack of political will.
Cassie Coccaro, Thorn
Okay, so we decided to go out there and talk to the world about this, to tell people it’s happening. But help me understand the global piece a little bit. I’m worried that when people hear this, they’re going to say, “that’s a European law,” and put it in the back of their minds. Why should someone in the US, or anywhere else, care about this when it’s happening in Europe?
Emily Slifer, Thorn
Well, first and foremost, the data is all connected. You can’t silo data to just one geographic part of the world anymore. But even more so, this abuse doesn’t happen in a vacuum in one singular place. That is one of the things that comes with the internet, right? It could be a European child who’s being abused and live-streamed to viewers in the US, or images of an American child taken in the US but sent to European users. And we’re going to lose that. We’re not going to be able to see it, because platforms won’t be able to detect it anymore.
So again, it’s not just a European problem, it’s a global problem, as is all of the work we do at Thorn.
Cassie Coccaro, Thorn
Yeah. What about AI-generated content specifically: AI-generated child sexual abuse material, AI-facilitated harms, things like that. Why does that make this so much more urgent?
Emily Slifer, Thorn
Yeah, I think the biggest thing we’ve seen with AI-generated material is how it can scale the harm. No one wants their product to be used this way, but unfortunately it has been used to create AI-generated CSAM. It could be completely innocent images of a child that are turned into CSAM, or existing CSAM of a child who has already been rescued being used to create more abuse images of that child. AI is very much an enabler: it scales the problem at a rate we didn’t experience five years ago, during the last gap.
Cassie Coccaro, Thorn
That’s really scary. I have a feeling people are going to hear this and, hopefully, think more about it and be worried. So what do you want people, politicians, and others following this to do with all of this information right now?
Emily Slifer, Thorn
I would say, if you’re a member of the general public, if you’re an EU citizen, you need to talk to your policymakers and tell them to go back to the drawing board. We need a solution; we can’t allow this gap to happen. They still have a couple of days to act. And if we do end up reaching April 3rd with a gap, we still need to keep the pressure on. We still need them to come up with a solution.
Cassie Coccaro, Thorn
Okay, thank you so much. I’m assuming you’ll keep us all posted on what happens in the coming weeks.
Emily Slifer, Thorn
Absolutely. Thank you so much.