The Take It Down Act: A Federal Solution to Revenge Porn, Sextortion, and Deepfakes
Take It Down Act Makes NCII a Federal Crime
In a rare moment of political unity, the federal government recently passed the Take It Down Act. While most states already have similar laws, this act makes it a federal crime to publish, share, or upload nonconsensual intimate imagery (NCII): nude, partially nude, or sexually explicit content not produced for broad sharing. The new law adds a layer of protection for those at risk of falling victim to nonconsensual intimate images stemming from “revenge porn[1],” sextortion, and deepfakes. This entry outlines the details of the new law, as well as some potential negative consequences.
The Take It Down Act, signed into law on May 19, 2025, stands for “Tools to Address Known Exploitation By Immobilizing Technological Deepfakes on Websites and Networks Act.” It makes it a federal crime to knowingly publish another person’s intimate images without their consent. Under the new law, someone could be charged with a federal offense for uploading a past lover’s intimate photos, even if those photos were once shared consensually between partners. This example reflects the growing problem of “revenge pornography,” wherein a person publishes online intimate content of an ex-partner. A perpetrator could face up to three years in prison and a fine for violating the act.
New Ways to Combat Nonconsensual AI-Generated Content
The law makes it unlawful for a person to use a computer service to knowingly publish an intimate visual depiction or digital forgery of an identifiable victim. It defines an “intimate visual depiction” as an image of an identifiable person engaged in “sexually explicit conduct.” A digital forgery covers instances where a sexual image is created or altered using artificial intelligence or other technological means. The latter provision captures incidents in which a perpetrator uses software to “undress” a clothed subject in a photograph. New technology also allows someone to manipulate the posture of a photo’s subject, potentially placing the person in sexually suggestive positions. Such examples fall under the broad category of image doctoring known as “deepfakes.” This forged content is an increasing problem among youth, who alter images of their peers to produce sexual content that is then disseminated among larger groups of young people. Under the new law, a victim does not need to be identifiable in their entirety; a partial likeness, such as a face, tattoo, birthmark, or other distinguishing physical feature, is sufficient.
Tech Platforms Required to Act Fast on Removing Nonconsensual Intimate Imagery
As part of the law, technology companies are now legally responsible for removing such content from their platforms as soon as possible, and no later than 48 hours after notification. Companies that host user-generated content, such as social media platforms, fall within the act’s purview; those whose platforms do not permit user commentary and publishing are exempt.
Additionally, covered companies must establish processes that enable victims to report NCII. These processes must allow an individual to notify the company of the illegal content in writing and must include a means of identifying it. Notably, the law shields a technology company from liability when, acting in good faith, it removes content that ultimately proves to be legal. Technology companies have one year to comply with the law.
Digital Rights Groups Caution Against Overreach
On the surface, the Take It Down Act seems to be a step in the right direction for online safety. However, some opponents point to the law’s amorphous language as opening the door to restrictions on free speech. For instance, the Electronic Frontier Foundation, a nonprofit digital rights group, contends that the law applies to a broader domain of sexual content than just NCII. In its analysis, images involving any sexual or intimate content could be subject to takedown, and it is unclear from the law’s wording how a company should determine whether the material was published consensually.
Under this reading, a bad actor could use the well-intended law to scrub the internet of content they do not personally approve of. For example, someone disapproving of expressions of LGBTQ intimacy could invoke the vaguely worded law to have such content taken down despite its being consensually produced. Opponents fear that individuals may hijack the reporting process to purge material they find morally objectionable. Flooded with takedown requests, companies may simply acquiesce rather than dedicate extensive resources to evaluating reported material and engaging in the ethically challenging work of content moderation.
NCMEC Offers Simple Steps to Remove Intimate Content
As stated, the law makes publishing NCII a federal crime, but such laws already exist at the state level. In turn, many technology companies have already developed offices dedicated to addressing such issues. A list of these companies can be found on the National Center for Missing and Exploited Children’s (NCMEC) website. For example, if your images were uploaded to the social media site Discord, you can consult the website for step-by-step instructions on how to file a report.
NCMEC also has an impressive feature that lets those whose intimate content was created when they were under 18 address the issue directly. By visiting NCMEC’s Take It Down service (no relation to the federal law of the same name), one can follow an easy set of instructions to have their content flagged and protected. In brief, the process involves assigning a hash value, or digital fingerprint, to an individual’s intimate image, video, or file. Once this hash value has been cataloged, internet companies and platforms can use it to identify the intimate media and remove it from the internet. What’s great about NCMEC’s approach is that hash values can be assigned without needing to upload the content in question. The process is straightforward and can be completed in as few as four steps.
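To make the “digital fingerprint” idea concrete, here is a minimal Python sketch using a standard cryptographic hash. This is illustrative only: NCMEC’s service generates its hashes locally in the browser, and the exact algorithm it uses may differ from the one shown here; the file name in the example is hypothetical.

```python
# Minimal sketch of the "digital fingerprint" concept behind hash-based
# matching. NCMEC's Take It Down service computes hashes on the user's
# own device; this example uses SHA-256 purely to illustrate the idea.
import hashlib

def fingerprint(path: str, chunk_size: int = 65536) -> str:
    """Return a SHA-256 hex digest for the file at `path`.

    The digest identifies the file's exact bytes, so it can be shared
    with a matching service without ever uploading the image itself.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files need not fit in memory.
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical file name: the printed value, not the image, is what a
# platform would compare against its catalog of flagged hashes.
print(fingerprint("my_photo.jpg"))
```

One caveat worth noting: a cryptographic hash like SHA-256 matches only exact, bit-for-bit copies of a file, which is one reason real-world matching systems often rely instead on perceptual hashes that tolerate resizing or re-encoding.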
Ultimately, like any technological advancement, the effectiveness of the Take It Down Act will depend on how it is implemented. I, for one, hope that the law becomes another instrument in society’s larger toolbox for combating online exploitation. It is also essential to note that the effectiveness of any security measure grows with the number of protective layers it employs. In other words, while efforts by the federal government and NCMEC are invaluable, they alone will not stop inappropriate or illegal online sexual behavior. Undoubtedly, a key part of any safety intervention involves talking to young people before a mishap occurs. From this perspective, it is never too early to discuss online safety and good practices with youth; ideally, such discussions should precede a child’s introduction to the internet and technology.
[1] While the term “revenge pornography” is used in this blog, other phrases, like “nonconsensual sharing of intimate images,” are increasingly being used instead. To understand the shift in language, one must consider the power of words and how specific phrases can perpetuate this harmful dynamic. First, the word “revenge” implies that the victim did something warranting having their private content shared, implicitly justifying the perpetrator’s actions. Second, the word “pornography” inaccurately describes the dynamic: pornography involves consensual sexual content produced with the explicit intent of widespread sharing. Sharing a personal, intimate image without the subject’s consent is something else entirely.