In one of the highest-profile examples to date of hallucinated material being submitted to court, Sullivan & Cromwell has apologised after a motion it filed with the Bankruptcy Court in New York was found to contain inaccurate citations and other errors. Sullivan & Cromwell's opposing counsel, Boies Schiller Flexner, flagged the errors, which include inaccurate regulation and case law citations.

In a letter dated 18 April to Chief Judge Martin Glenn, the co-head of Sullivan & Cromwell’s global restructuring group, Andrew Dietderich, confirmed that the inaccuracies include AI hallucinations.

“We deeply regret that this has occurred,” says Dietderich, adding: “The Firm maintains comprehensive policies and training requirements governing the use of AI tools in legal work. These safeguards are designed to prevent exactly this situation. The Firm’s policies on the use of AI were not followed in connection with the preparation of the Motion. In addition, the Firm has general policies and training requirements for the proper review of legal citations. Regrettably, this review process did not identify the inaccurate citations generated by AI, nor did it identify other errors that appear to have resulted in whole or in part from manual error.”

Dietderich says that the firm has undertaken immediate remedial measures, including a full review of the circumstances leading to these errors and a re-review of all filings in this matter. “We can confirm that the other filings do not include any AI-related errors,” he says.

The case is In re Prince Global Holdings Limited et al, No. 26-10769. Commenting on the letter on Reddit, one observer notes that most AI tools now include citations with hyperlinks, making it easy to check whether the information returned is accurate. The letter does not say what technology produced the inaccuracies.

This is hugely embarrassing for Sullivan & Cromwell, and in his letter Dietderich is at pains to stress that the firm’s policies governing AI use are clear and rigorous. “Access to AI tools is conditioned on completion of mandatory training,” he says, adding: “Before any firm lawyer is granted access to generative AI tools, the lawyer must complete two required training modules, completion of which is tracked and verified.”

The post Sullivan & Cromwell apologises for hallucinated material in court filing appeared first on Legal IT Insider.