Federal judges using AI filed court orders with false quotes and fake names

2025-10-29

Copyright Anchorage Daily News

Two federal judges in New Jersey and Mississippi admitted this month that their offices used artificial intelligence to draft factually inaccurate court documents that included fake quotes and fictional litigants, drawing a rebuke from the head of the Senate Judiciary Committee. “I’ve never seen or heard of anything like this from any federal court,” Sen. Chuck Grassley (R-Iowa) said in a Senate floor speech Monday.

The committee announced Thursday that the judges, Henry T. Wingate of the Southern District of Mississippi and Julien Xavier Neals of the District of New Jersey, admitted that their offices used AI in preparing the mistake-laden filings over the summer. They attributed the mistakes to a law clerk and a law school intern, respectively, according to letters the judges sent in response to a Senate inquiry. Both faulty court documents were docketed and had to be hastily retracted after defendants alerted the judges to the errors. Neither judge explained the cause of the errors until the committee contacted them.

The use of generative artificial intelligence has become more common in the U.S. judicial system. Wingate and Neals join scores of lawyers and litigants who have been rebuked for using AI to produce legal filings strewn with errors.

Legal groups are still catching up. The Administrative Office of the U.S. Courts, which supports the federal court system, issued interim guidance in July suggesting that users “consider whether the use of AI should be disclosed” in judicial functions. It has also established a task force to issue additional guidance on AI use in federal courts.

Grassley said Monday that federal courts need to establish rules on AI use in litigation. “I call on every judge in America to take this issue seriously and formalize measures to prevent the misuse of artificial intelligence in their chambers,” he said.
Wingate and Neals said in their letters that they took corrective measures after being alerted to the mistakes and will implement additional reviews of court filings before they are submitted. Neals said he established a written policy in his chambers prohibiting the use of generative AI in legal research or drafting court filings. Wingate did not immediately respond to a request for comment. Neals’s office declined to comment.

Wingate, whom President Ronald Reagan appointed to the court in 1985, was overseeing a case brought by the Jackson Federation of Teachers and other advocacy groups against the Mississippi State Board of Education and other state bodies. The suit challenged a state law banning public schools from teaching “transgender ideology” and “diversity training” on topics of race, gender and sexual orientation. On July 20, Wingate granted a temporary restraining order that blocked the state from enforcing parts of the ban.

Two days later, in a motion to clarify, Mississippi attorneys said Wingate’s order was replete with errors. The order named several plaintiffs and defendants, including a college sorority, a Mississippi parent, students and government officials, who were not parties to the case, according to the Mississippi attorneys’ response. The order described allegations that did not appear in the plaintiffs’ complaint and misquoted the bill it blocked, the attorneys noted. The order also cited nonexistent declarations from individuals purportedly supporting the restraining order.

Wingate’s office issued a corrected restraining order that evening and told the parties to disregard the previous one. The case is ongoing; Wingate granted a preliminary injunction against the bill in August, which Mississippi attorneys appealed.
Neals, who was appointed by President Joe Biden in 2021, issued an opinion with errors in a federal securities class-action lawsuit against CorMedix, a pharmaceutical company, over allegations that it misled investors about a medical product. On June 30, Neals denied a CorMedix motion to dismiss the lawsuit.

About a month later, attorneys for CorMedix wrote that Neals’s opinion contained fabricated cases and nonexistent quotes from real cases it cited in support of his ruling. It misstated the outcomes of cases and whether appeals of motions to dismiss were granted. It also attributed false quotes to CorMedix, according to the letter. Neals’s opinion was also submitted as “supplemental authority” in support of another class-action lawsuit, whose defendants also raised the issues with his filing, the letter said. Neals said the opinion was entered in error and removed it from the court docket. The case is ongoing.

The mistakes in both judges’ orders resembled those caused by AI “hallucinations,” in which generative AI, which produces text by predicting what words follow each other based on an analysis of written content, confidently invents facts and false citations. Observers quickly speculated that the errors had come from AI use.

At first, facing questions from lawyers and litigants, neither judge admitted that the errors were AI-related. Grassley, in his Monday speech, called their “lack of transparency … breathtaking.”

The Senate Judiciary Committee wrote to Neals and Wingate in early October inquiring about the mistakes, it said. Both judges said in their responses that the errors were attributable to AI but that the filings were drafts that were mistakenly published before review. A law clerk in Wingate’s office used the Perplexity AI tool as a “foundational drafting assistant” to synthesize publicly available information on the court docket, Wingate wrote. A law school intern for Neals used ChatGPT to perform legal research, Neals wrote.
(The Washington Post has partnerships with Perplexity and ChatGPT’s creator, OpenAI.)

“I manage a very busy docket and strive to maintain the public’s trust by administering justice in a fair and transparent manner,” Wingate wrote. “Given that I hold myself and my staff to the highest standards of conduct, I do not expect that a mistake like this one will occur in the future.”

“While my experience in the CorMedix case was most unfortunate and unforeseeable, I hope that, at the very least, it will inform the AO Task Force’s continuing work and ultimately lead to new meaningful policies for all federal courts,” Neals wrote.
