The deal settles part of the suit filed by publishers
Amazon-backed artificial intelligence company Anthropic has inked an agreement with Universal Music, ABKCO and Concord on “guardrails” to prevent Anthropic’s chatbot Claude from infringing copyrighted lyrics, Reuters reported.
The deal settles part of the publishers’ lawsuit, which accused Anthropic of misusing lyrics from songs by Beyoncé and the Rolling Stones, among others, in Claude’s training. Under the agreement, Anthropic consented to maintaining its current guardrails, to applying them to future Claude models, and to letting the court resolve any disputes over the measures.
US District Judge Eumi Lee approved the deal at the end of December 2024. The judge has yet to rule on the publishers’ request for a preliminary injunction to stop Anthropic from using their lyrics to train its AI.
The publishers said in a statement published by Reuters that while they considered the agreement “a positive step forward,” the suit, filed in October 2023, would continue. In it, the publishers are seeking to block Anthropic from generating their lyrics in responses to user prompts, arguing that doing so constitutes copyright infringement.
Reuters reported in 2023 that this suit seemed to be the first case filed over the use of song lyrics in AI training.
“Claude isn’t designed to be used for copyright infringement, and we have numerous processes in place designed to prevent such infringement. Our decision to enter into this stipulation is consistent with those priorities,” a spokesperson from Anthropic said in a statement published by Reuters.
The case is Concord Music Group Inc v. Anthropic PBC, U.S. District Court for the Northern District of California, No. 3:24-cv-03811. The publishers are represented by Matt Oppenheim, Nick Hailey, and Jenny Pariser of Oppenheim + Zebrak, and by Richard Mandel, Jonathan King, and Richard Dannay of Cowan, Liebowitz & Latman.
Joe Wetzel, Andy Gass, Sy Damle, and Alli Stillman of Latham & Watkins are acting for Anthropic.