
Atlanta Attorney Fined $2,500 for Submitting AI-Generated Bogus Cases in Divorce Brief

Published on July 06, 2025
Source: Unsplash/Steve Johnson

An Atlanta attorney has been fined $2,500 by the Georgia Court of Appeals over the apparent use of artificial intelligence that led to bogus cases being cited in a divorce case brief. According to the judges, attorney Diana Lynch's motion included multiple non-existent cases, half of which were deemed to be "'hallucinations' made up by generative-artificial intelligence," as reported by FOX 5 Atlanta. The other cases cited had no relevance to the client's divorce proceedings.

The situation surfaced when the ex-wife of Lynch's client brought attention to the false references. Judge Jeff Watkins, addressing the issue, stated that Lynch's use of fictitious cases had deprived the opposing party of the opportunity to appropriately respond to her arguments. In addition, the judges vacated the previous court order and returned the case to a lower court for reconsideration. The ruling aligns with the concerns of Supreme Court Chief Justice John Roberts, who, in his 2023 Year-End Report on the Federal Judiciary, cautioned against this very issue of AI-induced "hallucinations" in legal work.

Fellow lawyer Christopher Timmons, of the firm Knowles, Gallant and Timmons, referred to the penalty as a reflection of the court's disapproval, telling Channel 2 Action News, "It’s embarrassing for the judges involved, it makes the lawyer who cited the fake cases look bad." His remarks underscored the rise of AI usage in legal practice and the essential need for attorneys to verify any AI-generated content. "AI is basically a product that... it’s as good as your inputs," Timmons said.

The increasing integration of AI into legal proceedings is not without its perils. Timmons warns that as the technology becomes more common, so will instances of court filings containing fake or irrelevant cases, adding, "we’re gonna see more cases where attorneys are relying on ChatGPT, and that ChatGPT is going to end in hallucination." Channel 2 Action News' attempts to reach Lynch for comment through calls, social media, and emails have so far gone unanswered.