When Claude Hallucinates in Court: The Latham & Watkins Incident and What It Means for Attorney Liability
There is a particular kind of irony that the legal profession rarely gets to witness in such pristine form. In May 2025, Latham & Watkins, a firm that routinely bills over $2,000 an hour for its partners and counts Anthropic among its clients, filed a court declaration in Concord Music Group v. Anthropic that contained fabricated citation details. The citations weren't invented by a sleep-deprived associate pulling an all-nighter. They were generated by Claude, the very AI model that Latham & Watkins was in court defending.

Sit with that for a moment. The lawyer arguing that Claude is not a copyright infringement machine used Claude to format a legal citation in an active case, and Claude got the authors wrong, the title wrong, and nobody caught it until opposing counsel started digging.

The irony isn't just delicious. It's instructive. Because what happened inside that filing is a near-perfect X-ray of the structural problem that AI poses for legal practice.
