The NO FAKES Act Has Changed – and It's So Much Worse

https://news.ycombinator.com/rss Hits: 9
Summary

A bill purporting to target the issue of misinformation and defamation caused by generative AI has mutated into something that could change the internet forever, harming speech and innovation from here on out. The Nurture Originals, Foster Art and Keep Entertainment Safe (NO FAKES) Act aims to address understandable concerns about generative AI-created “replicas” by creating a broad new intellectual property right. That approach was the first mistake: rather than giving people targeted tools to protect against harmful misrepresentations—balanced against the need to protect legitimate speech such as parodies and satires—the original NO FAKES simply federalized an image-licensing system.

The updated bill doubles down on that initial mistaken approach by mandating a whole new censorship infrastructure for that system, encompassing not just images but the products and services used to create them, with few safeguards against abuse. The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say-so of the person who was allegedly “replicated.” This bill would be a disaster for internet speech and innovation.

Targeting Tools

The first version of NO FAKES focused on digital replicas. The new version goes further, targeting tools that can be used to produce images that aren’t authorized by the individual, by anyone who owns the rights in that individual’s image, or by law. Anyone who makes, markets, or hosts such tools is on the hook. There are some limits—the tools must be primarily designed for, or have only limited commercial uses other than making unautho...

First seen: 2025-06-24 08:10

Last seen: 2025-06-24 16:12