Trump Signs Bipartisan Law Criminalizing Deepfakes and Revenge Porn

President Donald Trump signed the Take It Down Act into law on Monday, establishing the first federal criminal penalties for sharing nonconsensual explicit images and requiring social platforms to remove such content within 48 hours. The bipartisan legislation targets both real and AI-generated explicit imagery, including revenge porn and deepfakes.

Key Points

  • First federal law criminalizing revenge porn and explicit deepfakes with penalties including imprisonment
  • Platforms must remove flagged content within 48 hours and delete duplicates
  • Bipartisan effort sponsored by Senators Cruz (R) and Klobuchar (D), with First Lady Melania Trump's support

The new law marks an important shift in how online platforms must handle nonconsensual intimate content. Until now, victims have relied on a patchwork of state laws with varying degrees of protection. Tech companies have largely set their own policies for handling such material.

"This law gives victims real recourse," said Senator Ted Cruz (R-Texas), who co-sponsored the bill with Senator Amy Klobuchar (D-Minn.). Cruz mentioned that he was motivated to act after learning about a case where Snapchat allegedly took nearly a year to remove an AI-generated deepfake of a 14-year-old girl.

The full title—Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act—reflects its comprehensive approach to both traditional revenge porn and newer AI-generated content. Violators can now face federal criminal charges, including fines, imprisonment, and mandatory restitution to victims.

For tech platforms, the 48-hour takedown requirement represents a new compliance challenge. The law defines "covered platforms" as any public website, online service, or application that primarily provides a forum for user-generated content. These platforms must establish clear processes for victims to report content and must also remove duplicate instances of the same material.

First Lady Melania Trump, who lobbied for the legislation, called it "a critical step toward protecting dignity in the digital age." The law makes important distinctions between adult and minor subjects. For adults, the publication must be intended to cause harm. For minors, the bar is lower—content is prohibited if intended to harass or to gratify sexual desire.

The bipartisan support for the law shows growing consensus around the need for federal regulation of harmful digital content. Previous legislative attempts to regulate online platforms have often stalled due to concerns about free speech or technical feasibility.

For tech companies, particularly social media platforms, the law creates new operational requirements. They'll need to develop or enhance content moderation systems capable of quickly identifying and removing flagged content across their networks.
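The law's requirement to remove duplicate instances of flagged material is, in engineering terms, a content-matching problem. As an illustration only (the statute does not prescribe any technical mechanism, and the class name and methods below are hypothetical), a minimal sketch of one common building block is a registry of cryptographic hashes of removed content, used to block byte-identical re-uploads:

```python
import hashlib


class TakedownRegistry:
    """Illustrative sketch: track hashes of content removed after a
    takedown request so byte-identical re-uploads can be caught."""

    def __init__(self):
        self._flagged = set()

    def flag(self, content: bytes) -> str:
        # Record the SHA-256 digest of content removed under a request.
        digest = hashlib.sha256(content).hexdigest()
        self._flagged.add(digest)
        return digest

    def is_duplicate(self, content: bytes) -> bool:
        # An exact hash match means the upload is byte-identical
        # to previously removed material.
        return hashlib.sha256(content).hexdigest() in self._flagged


registry = TakedownRegistry()
registry.flag(b"example-removed-image-bytes")
print(registry.is_duplicate(b"example-removed-image-bytes"))  # True
print(registry.is_duplicate(b"unrelated-bytes"))              # False
```

Exact hashing only catches identical files; production moderation systems typically layer on perceptual hashing (e.g., systems in the vein of PhotoDNA) to match re-encoded or slightly altered copies, which is considerably harder and outside the scope of this sketch.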

Legal experts note that the law's passage may signal a shift in Washington's approach to regulating tech platforms. Rather than broad reforms to Section 230—the law that generally shields platforms from liability for user content—Congress appears to be carving out specific exceptions for particularly harmful content.

The law took effect immediately upon signing, leaving platforms a narrow window to bring their takedown processes into compliance with the 48-hour requirement.

Chris McKay is the founder and chief editor of Maginative. His thought leadership in AI literacy and strategic AI adoption has been recognized by top academic institutions, media, and global brands.
