Privacy and digital rights advocates are raising alarms over a law that many would expect them to cheer: a federal crackdown on revenge porn and AI-generated deepfakes.

The newly signed Take It Down Act makes it illegal to publish nonconsensual explicit images, whether real or AI-generated, and gives platforms just 48 hours to comply with a victim's takedown request or face liability. While widely praised as a long-overdue win for victims, experts warn that its vague language, lax standards for verifying claims, and tight compliance window could pave the way for overreach, censorship of legitimate content, and even surveillance.

"Content moderation at scale is widely problematic and always ends up with important and necessary speech being censored," India McKinney, director of federal affairs at the Electronic Frontier Foundation, a digital rights organization, told TechCrunch.

Online platforms have one year to establish a process for removing nonconsensual intimate imagery (NCII). While the law requires that takedown requests come from victims or their representatives, it asks only for a physical or electronic signature; no photo ID or other form of verification is required. That likely aims to reduce barriers for victims, but it could create an opportunity for abuse.

"I really want to be wrong about this, but I think there are going to be more requests to take down images depicting queer and trans people in relationships, and even more than that, I think it's gonna be consensual porn," McKinney said.

Senator Marsha Blackburn (R-TN), a co-sponsor of the Take It Down Act, also sponsored the Kids Online Safety Act, which puts the onus on platforms to protect children from harmful content online. Blackburn has said she believes content related to transgender people is harmful to kids.
Similarly, the Heritage Foundation — the conservative think tank behind Project 2025 — has also said that “keeping trans content away from children is protecting kids.” Because of the liability th...