The newly signed Take It Down Act, designed to curb revenge porn and deepfake abuse, has been lauded as a long-overdue victory for victims of nonconsensual image distribution.
Despite its good intentions, the law has raised serious concerns among digital rights advocates and free speech watchdogs, who fear it could lead to overreach, vague enforcement, and broad censorship.
What the Take It Down Act Does
Signed into law by President Donald Trump following a bipartisan push and his own public endorsement, the Take It Down Act criminalizes the distribution of nonconsensual explicit images, whether real or AI-generated.
The law requires social media platforms and websites to remove the offending content within 48 hours of receiving a takedown request or face potential liability.
“We want to protect victims, but the bill’s enforcement mechanisms are alarmingly vague,” said India McKinney, Director of Federal Affairs at the Electronic Frontier Foundation (EFF). “Content moderation at scale is widely problematic and always ends up with important and necessary speech being censored.”