President Donald Trump has signed the Take It Down Act into law, making it a federal crime to post nonconsensual intimate images, including those created with AI.
The new law addresses the rising crisis of revenge porn and AI-generated deepfake abuse, particularly targeting women. Whether real or AI-generated, any intimate image shared without consent could now lead to up to three years in prison.
“Anyone who shares intimate images without consent will face real consequences,” Trump said at the White House Rose Garden signing ceremony.
This legislation was supported by both parties and shows a rare moment of unity in Congress. It’s also a major policy win for First Lady Melania Trump, who championed the bill as part of her revived Be Best initiative.
The law requires online platforms to quickly remove flagged content, closing a long-standing loophole that allowed harmful material to circulate unchecked.
The law responds to a disturbing surge in AI tools being used to create fake nude images, often to harass and humiliate women. Victims have ranged from celebrities like Taylor Swift to everyday teens targeted by classmates.
“It’s a national victory,” said one speaker at the event. “This gives families and parents a real tool to fight back.”
Still, not everyone is cheering. Groups like the Electronic Frontier Foundation have raised concerns, calling the law a potential “dangerous censorship tool” that could restrict free speech if misused.
Others say the law is long overdue. AI ethicist Renee Cummings called it a “significant step,” stressing the need for strong enforcement as the technology keeps evolving.
One mother whose daughter was a victim of deepfake abuse put it plainly: “Now I have a legal weapon in my hand. No one can ignore this anymore.”
Technology may evolve, but so must the laws that protect people from its misuse.