CHEYENNE, Wyo. (Legal Newsline) - With "great embarrassment," the nation's largest plaintiff firm is trying to explain why it is using made-up cases to bolster its arguments.
The culprit is an internal artificial intelligence tool used at Morgan & Morgan, the firm says in recent federal court documents in Wyoming. There, Judge Kelly Rankin has ordered it to show why it shouldn't be punished for citing cases that never existed.
In a routine Jan. 22 motion in its suit against Walmart and Jetson Electrics over a hoverboard battery that allegedly caught fire, causing property damage and injuries, Morgan & Morgan cited nine cases to back its arguments.
"The problem with these cases is that none exist, except United States v. Caraway," Judge Rankin wrote Feb. 6.
"The cases are not identifiable by their Westlaw cite, and the Court cannot locate the District of Wyoming cases by their case name in its local Electronic Court Filing System."
Rankin cites three other (presumably real) cases in which courts have asked lawyers to explain why AI led to such errors. A federal appeals court a year ago punished a lawyer who used ChatGPT with the same result, dismissing her client's case and subjecting her to a possible $8,000 penalty.
Another New York lawyer was ordered to pay $5,000 for citing a nonexistent case in 2023.
Two attorneys at Morgan & Morgan and their local counsel were ordered to explain how their mistakes occurred. Mike Morgan wrote Feb. 10, "This matter comes with great embarrassment and has prompted discussion and action regarding the training, implementation and future use of artificial intelligence within our firm."
It took only two months for the AI tool to become a problem; the firm conducted its initial training in November.
Morgan reached out to opposing counsel to apologize, convened a team of the firm's ethics counsel and added a "click box" for attorneys using AI that requires acknowledgement of the "limitations of artificial intelligence."
The ethics team issued a written statement sent to every lawyer in the firm, which says it has offices in all 50 states.
Morgan & Morgan lawyer Rudwin Ayala prepared the motion, he said, with little to no involvement from his co-counsel.
He used the AI tool for finding additional case support and uploaded his motion to it, with instructions to add information.
"This was the first time in my career that I ever used AI for queries of this nature," he wrote Feb. 13.
"My reliance on the query results was misplaced, and I failed to verify that the case citations resulted were in fact accurate as I expected them to be. As a result, I have come to learn the term 'AI hallucinations' and take full and sole responsibility for the resulting misinformation to this Court, as unintentional as it was."
Local counsel Taly Goody of Goody Law Group says she never saw the motion before it was submitted to the court. She had referred the case to Morgan & Morgan because her husband has an interest in trying a case with its lead trial attorney.
Goody doesn't use AI, she says.
"Had the motion been sent for review, it is my hope that I would have noticed the oddity of seven District of Wyoming cases cited in the motion and would have inquired about the same," she said, asking that no sanctions be entered against her.