CHEYENNE, Wyo. (Legal Newsline) - The judge who was handed a mistake-laden brief crafted by a plaintiffs firm's AI tool is issuing more penalties, this time for putting him and jurors through a pointless nine-day trial.
Wyoming federal judge Kelly Rankin on March 17 ordered the parties in Stephanie Wadsworth's lawsuit against Walmart and Jetson Electric Bikes to pay more than $15,000 for waiting until jurors were deliberating to reach a settlement.
The case, which alleges a hoverboard's battery caused a fire, became notable earlier this year when Rankin discovered that a brief submitted by Morgan & Morgan, which calls itself the nation's largest personal injury firm, cited fictional cases generated by AI.
The discovery got attorney Rudwin Ayala fined $3,000 and kicked off the case on Feb. 24, ahead of a March 3 trial start. Morgan & Morgan sent Josh Autry of its Lexington, Ky., office to replace him.
Rankin told attorneys for both sides at a Feb. 28 hearing that they had until the next day to reach a settlement or face the imposition of jury costs.
That didn't happen, and the trial lasted from March 3-13. The case settled for an undisclosed amount an hour after the jury began deliberations, and now both sides must pay $15,444.71 in jury costs.
In addition to Ayala's fine, two other plaintiff lawyers were each fined $1,000 over the AI brief - Mike Morgan, the head of the firm's product safety department, and local counsel Taly Goody of Goody Law Group.
Ayala prepared the motion and said his co-counsel had little to no involvement in drafting it.
He used the firm's AI tool to find additional case support, uploading his motion with instructions to add that information. The tool complied, adding nine cases that don't exist.
"This was the first time in my career that I ever used AI for queries of this nature," he wrote Feb. 13.
"My reliance on the query results was misplaced, and I failed to verify that the case citations resulted were in fact accurate as I expected them to be. As a result, I have come to learn the term 'AI hallucinations' and take full and sole responsibility for the resulting misinformation to this Court, as unintentional as it was."
Rankin cited three other cases in which courts have required lawyers to explain errors caused by AI. A federal appeals court a year ago punished a lawyer who used ChatGPT and had the same problem, dismissing her client's case and subjecting her to a possible $8,000 penalty.
Another New York lawyer was ordered to pay $5,000 for citing a nonexistent case in 2023.
Before the digital age, attorneys had to cross-reference case citations in books to make sure those cases were still good law. Rankin wrote that the process has since been simplified through database signals.
"Yet one still cannot run a natural language or 'Boolean' search through a database and immediately cite the highlighted excerpt that appears under a case," he wrote.
"The researcher must still read the case to ensure the excerpt is existing law to support their propositions and arguments. After all, the excerpt could very well be a losing party's argument, the court explaining an overruled case, dicta, etc.
"As attorneys transition to the world of AI, the duty to check their sources and make a reasonable inquiry into existing law remains unchanged."