Court Rejects Dismissal Motion in Concord v. Anthropic

Let the years-running Concord v. Anthropic tug of war continue: A federal judge has denied the AI giant’s latest motion to toss the major music publishers’ copyright suit. Meanwhile, the litigants “have lost the ability to” effectively manage discovery, per the court.

Judge Eumi Lee rejected Anthropic’s dismissal push yesterday, six months and change after granting a separate motion to dismiss. That earlier ruling made way for an amended complaint – a second amended complaint also appears to be on the way – and the updated filing is evidently faring better in the face of the Claude developer’s arguments.

Beginning with the contributory infringement claim, the court reiterated that it “must accept the Publishers’ non-conclusory allegations as true” at the motion-to-dismiss stage.

And in the judge’s view, the plaintiffs, by emphasizing Anthropic’s implementation of Claude guardrails, “have pled a plausible claim” regarding the company’s knowledge of users’ alleged lyrics infringement via the chatbot’s outputs.

Shifting to the vicarious infringement claim – which survived the initial dismissal motion – Judge Lee once again determined that the publishers effectively “allege more than a financial interest in copyright infringement generally.”

“Instead,” the court explained, “Publishers allege that Anthropic directly benefits from the infringement of Publishers’ works specifically, as required by the case law. … Moreover, it is plausible that the availability of Publishers’ lyrics draws customers to use Claude because, as Publishers contend, Claude would not be as popular and valuable as it is but for ‘the substantial underlying text corpus that includes Publishers’ copyrighted lyrics.’”

Finally, regarding Anthropic’s alleged removal of copyright management information (CMI) in violation of the DMCA, “the Court infers that Anthropic had a reasonable basis to know that its intentional removal of CMI from the datasets used to train Claude would conceal its alleged infringement.”

“Thus, it is undisputed, for purposes of this motion, that Anthropic distributed Publishers’ lyrics without including CMI,” the court indicated. “The critical question is whether Anthropic did so while knowing, or having a reasonable basis to know, that it would conceal its own infringement.”

As for where the case goes from here – following a venue change, AI-hallucinated legal citations, and other twists – a trial is still a ways off.

Most immediately, the parties are set to appear at a discovery hearing this Friday; apparently, the marathon process isn’t going very well, and the participating attorneys must therefore “be prepared to remain at court the entire business day.”

The litigants “have lost the ability to [handle the] management [of] discovery in an efficient manner” and will have “to show cause as to why both sides should not be sanctioned” during the hearing, Magistrate Judge Susan van Keulen wrote in an order dated Sunday, October 5th.

Besides the discovery debacle, the publishers’ aforementioned desire to file another amended suit is worth keeping in mind.

Amid a growing focus on how AI developers secured protected materials for training – not necessarily the training process itself – the publishers in August said they would add claims “based on new evidence” regarding Anthropic’s alleged “use of BitTorrent to download and upload…works without authorization.”

Also in August, the publishers and the AI developer jointly requested and received an amended scheduling order that, among other things, tentatively teed up a trial for September 28th, 2026.