Federal judges using AI filed court orders with false quotes, fake names

2025-10-29

Copyright The Boston Globe


Both faulty court documents were docketed and had to be hastily retracted after defendants alerted the judges to the errors. Neither judge explained the cause of the errors until the committee contacted them.

The use of generative artificial intelligence has become more common in the US judicial system, and Wingate and Neals join scores of lawyers and litigants who have been rebuked for using AI to produce legal filings strewn with errors. Legal groups are still catching up. The Administrative Office of the United States Courts, which supports the federal court system, issued interim guidance in July suggesting that users “consider whether the use of AI should be disclosed” in judicial functions. It has also established a task force to issue additional guidance on AI use in federal courts.

Grassley said Monday that federal courts need to establish rules on AI use in litigation. “I call on every judge in America to take this issue seriously and formalize measures to prevent the misuse of artificial intelligence in their chambers,” he said.

Wingate and Neals said in their letters that they took corrective measures after being alerted to the mistakes and will implement additional reviews of court filings before they are submitted. Neals said he established a written policy in his chambers prohibiting the use of generative AI in legal research or the drafting of court filings. Wingate did not immediately respond to a request for comment. Neals’s office declined to comment.

Wingate, whom President Ronald Reagan appointed to the court in 1985, was overseeing a case brought by the Jackson Federation of Teachers and other advocacy groups against the Mississippi State Board of Education and other state bodies. The suit challenged a state law banning public schools from teaching “transgender ideology” and “diversity training” on topics of race, gender, and sexual orientation. On July 20, Wingate granted a temporary restraining order that blocked the state from enforcing parts of the ban.

Two days later, in a motion to clarify, Mississippi attorneys said Wingate’s order was replete with errors. The order named several plaintiffs and defendants, including a college sorority, a Mississippi parent, students, and government officials, who were not parties to the case, according to the Mississippi attorneys’ response. The order described allegations that did not appear in the plaintiffs’ complaint and misquoted the legislation being blocked, the attorneys noted. The order also cited declarations, supposedly submitted in support of a restraining order, that did not exist. Wingate’s office issued a corrected restraining order that evening and told the parties to disregard the previous one. The case is ongoing; Wingate granted a preliminary injunction against the legislation in August, which Mississippi attorneys appealed.

Neals, who was appointed by President Joe Biden in 2021, issued an opinion with errors in a federal securities class-action lawsuit against CorMedix, a pharmaceutical company, over allegations that it misled investors about a medical product. On June 30, Neals denied a CorMedix motion to dismiss the lawsuit. About a month later, attorneys for CorMedix wrote that Neals’s opinion contained fabricated cases and nonexistent quotes from real cases cited in support of his ruling. The opinion misstated the outcomes of cases, including whether motions to dismiss had been granted, and it attributed false quotes to CorMedix, according to the letter.
Neals’s opinion was also submitted as “supplemental authority” in support of another class-action lawsuit, whose defendants raised the same issues with his filing, the letter said. Neals said the opinion was entered in error and removed it from the court docket. The case is ongoing.

The mistakes in both judges’ orders were similar to those caused by AI hallucinations, in which generative AI (which produces text by predicting which words follow one another based on an analysis of written content) confidently invents facts and false citations, and observers quickly speculated that the errors had come from AI use. At first, facing questions from lawyers and litigants, neither judge admitted that the errors were AI-related. Grassley, in his Monday speech, called their “lack of transparency … breathtaking.”

The Senate Judiciary Committee wrote to Neals and Wingate in early October inquiring about the mistakes, it said. Both judges said in their responses that the errors were attributable to AI but that the filings were drafts mistakenly published before review. A law clerk in Wingate’s office used the Perplexity AI tool as a “foundational drafting assistant” to synthesize publicly available information on the court docket, Wingate wrote. A law school intern for Neals used ChatGPT to perform legal research, Neals wrote. (The Washington Post has partnerships with Perplexity and ChatGPT’s creator, OpenAI.)

“I manage a very busy docket and strive to maintain the public’s trust by administering justice in a fair and transparent manner,” Wingate wrote. “Given that I hold myself and my staff to the highest standards of conduct, I do not expect that a mistake like this one will occur in the future.”
