A federal judge has sided with Anthropic in a major copyright ruling, declaring that artificial intelligence developers can train models on published books without authors’ consent.

The decision, filed Monday in U.S. District Court for the Northern District of California, sets a precedent that training AI systems on copyrighted works constitutes fair use. Though it doesn’t guarantee other courts will follow, Judge William Alsup’s ruling makes the case the first of dozens of ongoing copyright lawsuits to answer the fair use question in the context of generative AI.

Creatives across industries have raised that question ever since generative AI tools exploded into the mainstream, allowing users to easily produce art from models trained on copyrighted work — often without the human creators’ knowledge or permission.

AI companies have been hit with a slew of copyright lawsuits from media companies, music labels and authors since 2023. Artists have signed multiple open letters urging government officials and AI developers to constrain the unauthorized use of copyrighted works. In recent years, companies have also increasingly reached licensing deals with AI developers to dictate terms of use for their artists’ works.

Alsup ruled on a lawsuit filed in August by three authors — Andrea Bartz, Charles Graeber and Kirk Wallace Johnson — who claimed that Anthropic ignored copyright protections when it pirated millions of books and digitized purchased books to feed into its large language models, which helped train them to generate humanlike text responses.

“The copies used to train specific LLMs were justified as a fair use,” Alsup wrote in the ruling. “Every factor but the nature of the copyrighted work favors this result. The technology at issue was among the most transformative many of us will see in our lifetimes.”

His decision said Anthropic’s use of the books to train its models, including versions of its flagship AI model Claude, was “exceedingly...