NEW YORK (Legal Newsline) - A lawyer who cited a nonexistent lawsuit dreamed up by ChatGPT not only had her client’s case dismissed, but may face disciplinary action and have to pay $8,000 in the other side’s legal bills.
The Second Circuit Court of Appeals upheld the sanctions against Jai S. Lee, a Uniondale, N.Y., lawyer who admitted she consulted ChatGPT when she was unable to find a case to support claims in a malpractice lawsuit against Dr. David Dennis Kim.
“I utilized the ChatGPT service, to which I am a subscribed and paying member, for assistance in case identification,” said Lee, whose website says she is a graduate of the University of Wisconsin Law School with an MBA from New York University’s Stern School of Business. “ChatGPT was previously provided reliable information, such as locating sources for finding an antic (sic) furniture key.”
As Lee learned, ChatGPT isn’t so reliable when it comes to legal research. The large language model is prone to producing what researchers call “hallucinations” — responses that seem to answer the question asked with great precision but aren’t grounded in reality. Lee cited “Matter of Bourguignon v. Coordinated Behavioral Health Servs., Inc., 114 A.D.3d 947 (3d Dep’t 2014),” which had a seemingly accurate citation style but was completely made up.
“Although Attorney Lee did not expressly indicate as much in her response, the reason she could not provide a copy of the case is that it does not exist,” the three-judge panel of the Second Circuit said.
In an effort to avoid sanctions, Lee told the trial judge in a November 2023 brief that “regrettably, I am unable to furnish a copy of the decision in Matter of Bourguignon v. Coordinated Behavioral Health Servs.” She said she suffered from “multiple serious wrist fractures” and didn’t use the reasoning of the nonexistent case in her arguments.
“Additionally, it is important to recognize that ChatGPT represents a significant technological advancement, akin to the emergency (sic) of COVID-19,” Lee wrote. “It would be prudent for the court to advise legal professionals to exercise caution when utilizing this new technology, rather than imposing sanctions solely based on the citation of a non-existent case.”
The Second Circuit acknowledged her suggestion but said “it is not necessary [to] inform a licensed attorney, who is a member of the bar of this Court, that she must ensure that her submissions to the Court are accurate.” The court said it would refer her to its Grievance Panel. The trial judge, meanwhile, gave her until Feb. 13 to respond to the other side’s request for $8,000 in legal fees.
New York attorney Steven A. Schwartz fell into the same trap last year, citing nonexistent cases in a personal injury suit against the airline Avianca. Schwartz was ordered by U.S. District Judge P. Kevin Castel to pay $5,000 and send letters to each judge falsely identified as the author of a case he cited in his ChatGPT-generated brief.