AI hallucinations strike again


A barrister has found himself in hot water after a judge ruled that some of his appeal-drafting was the work of artificial intelligence (AI).

In a case involving two Honduran sisters seeking asylum from gang-related persecution, senior upper tribunal judge Mark Blundell found that their barrister, Chowdhury Rahman, had relied on AI to help draft the grounds of appeal and, importantly, had failed to carry out adequate checks afterwards.

The judge found that several of the authorities cited in the appeal were either “entirely fictitious” or, where they did exist, did not support the arguments advanced.

In at least one instance, he accepted that the authority did not exist. In relation to the authorities that did exist, the judge found that Rahman “appeared to know nothing” of them. The judgment also reveals that Rahman returned to court after being given several hours to gather the authorities he had cited — but was unable to produce all of them or identify relevant passages.

According to the judge, Rahman’s explanations shifted throughout the hearing: at times he claimed citation errors, at others he insisted that different authorities had been intended. When pressed on how the errors arose, Rahman told the tribunal he had used “various websites” — including BAILII — to assist with research.

Rahman maintained that any inaccuracies were the result of his “drafting style”, but admitted there might have been some “confusion and vagueness” in how he had expressed himself.

However, Rahman’s inability to sufficiently explain his citations led the judge to conclude that he had not only used generative AI to prepare the grounds of appeal, but had also “attempted to hide that fact from me during the hearing”.

The judge also expressed concern that Rahman appeared not to grasp the “gravity of the situation” and failed to appreciate how his grounds of appeal may have misled the tribunal, leading to a significant waste of the tribunal’s time.

This isn’t the first time a lawyer has been criticised by a judge for using AI. Last month, Legal Cheek reported that a barrister had been referred to the Bar Standards Board after citing a non-existent case generated by ChatGPT during immigration tribunal proceedings. He later claimed that he was himself “a victim” of the AI technology.

The post Judge finds barrister relied on ‘entirely fictitious’ AI-generated cases appeared first on Legal Cheek.
