
Thorn celebrates the passage of the Take It Down Act

May 19, 2025

5 Minute Read

One of the latest threats targeting teens and children is deepfake nudes. These AI-generated images depict real people in sexually suggestive or explicit situations and can be nearly indistinguishable from authentic photos.

Policymakers play a key role in protecting children, including from the dissemination of deepfake nudes. Among many other interventions, that means drafting and passing legislation designed to protect children from these threats.

That’s why Thorn supports the Take It Down Act, an important new law that will close a key legal gap by criminalizing the knowing distribution of intimate visual depictions of minors—whether authentic or AI-generated—when shared with intent to harm, harass, or exploit. 

The new law also strengthens protections against threats of disclosure used for intimidation or coercion, ensuring that those who exploit and extort children online are held accountable.

Why Thorn Supports the Take It Down Act

We support the Take It Down Act and applaud Sen. Ted Cruz, Sen. Amy Klobuchar, Rep. Maria Salazar, and Rep. Madeleine Dean for driving forward this critical piece of legislation as a step toward protecting kids from online exploitation, including deepfake nudes.

Our latest research at Thorn found that 31% of teens are already familiar with deepfake nudes, and 1 in 8 personally knows someone who has been targeted. These manipulated images can be used for harassment, blackmail, and reputational harm, causing significant emotional distress for victims.

As deepfake technology grows more accessible, we have a critical window of opportunity to understand and combat this form of digital exploitation—before it becomes normalized in young people’s lives—and to act on their behalf to defend them from threats.

Specifically, the law makes it a crime to knowingly distribute intimate visual depictions of minors, whether authentic or AI-generated, when shared with intent to harm, harass, or exploit. Importantly, it also extends penalties to threats of disclosure used for intimidation or coercion, providing stronger protections for child victims.

The Take It Down Act represents a crucial step toward our collective ability to keep pace with evolving threats and ensure that those who exploit children online are held accountable.

About the Take It Down Act: What You Need to Know

What are the key components of the Take It Down Act?

  1. The Take It Down Act introduces criminal penalties for any person who knowingly publishes intimate visual depictions of an identifiable adult without consent and with intent to cause harm. This includes both authentic imagery and AI-generated imagery (deepfakes).
  2. The Take It Down Act introduces criminal penalties for any person who knowingly publishes intimate visual depictions of minors, with intent to humiliate, harass, or degrade the minor; or sexually arouse any person. This includes both authentic imagery and AI-generated imagery (deepfakes). These criminal penalties do not apply to imagery that is considered child pornography, since that is already criminalized under 18 U.S. Code § 2256 and 18 U.S. Code § 1466A.
  3. The Take It Down Act introduces criminal penalties for any person who intentionally threatens to distribute intimate visual depictions of minors or adults, as described above, for the purpose of intimidation, coercion, extortion, or to create mental distress.
  4. The Take It Down Act requires covered platforms to establish a “notice and removal process” to remove non-consensual intimate visual depictions, including AI-generated digital forgeries, and any copies of them, within 48 hours of notice.

What does this law mean for combating child sexual exploitation and abuse?

This new law has several implications for combating online child sexual exploitation and abuse. 

First and foremost, the Take It Down Act’s introduction of criminal penalties for the knowing publication of intimate visual depictions of minors fills an important legal gap around nude and exploitative images of children. These are images that would be considered offensive but do not meet the legal definition of child pornography, and thus are not criminalized in the same way that child sexual abuse material (CSAM) is. This presents a barrier to prosecution in some cases. By closing this gap and criminalizing both authentic and AI-generated nude and exploitative images of children, the law enables prosecutors to better pursue offenders in child exploitation cases and ensure justice for all child victims.

Secondly, the Take It Down Act’s addition of criminal penalties for threatening to disclose intimate visual depictions of minors for the purpose of intimidation or extortion is a critical step toward addressing the growing crisis of sextortion in this country. Our recent sextortion research indicates that the National Center for Missing and Exploited Children (NCMEC) receives 812 reports of sextortion weekly, with more than two-thirds involving financial demands.

Lastly, the Take It Down Act requires covered platforms to establish a process to remove intimate visual depictions, both authentic and AI-generated, within 48 hours of being notified by the victim. This process provides an important avenue for remedy for both child and adult victims. 

The swift progress of the Take It Down Act through Congress and to the President’s desk indicates that preventing the online sexual exploitation and abuse of children is a serious priority. We applaud the legislators who championed this bill and worked diligently to ensure that these important protections for children were signed into law. 

Learn More

Together, we can continue to defend children from sexual abuse and exploitation.
