AI-generated fictitious case laws are increasingly landing lawyers and authorities in trouble. Multiple recent incidents show that case citations and judgments produced by Artificial Intelligence (AI) tools often refer to cases that do not exist, leading to serious consequences in legal proceedings.
Lawyers Warn of Unreliable Research Outputs
Criminal lawyer Asif Naqvi said he uses AI only for drafting, never for research. “When you ask the tool for case law, it often provides a fictitious citation. When we later search for the actual judgment, it doesn’t exist. Hence, we cannot depend on it,” Naqvi said.
Bombay High Court Flags AI-Induced Errors
The problem was highlighted before the Bombay High Court during a hearing on an appeal against an Income Tax assessment order. The court found that all the case laws cited by the Income Tax Officer in the notice were fictitious and had been generated using AI.
While setting aside the order, the High Court observed: “In this era of AI, one tends to place much reliance on the results thrown up by the system. However, when exercising quasi-judicial functions, such results must be cross-verified. Otherwise, mistakes like the present one creep in.”
GST Appeal Also Hit by Fabricated Citations
A Borivali-based lawyer faced a similar situation after authorities found that all case laws cited in his GST appeal were fictitious. He had prepared his appeal brief and supporting documents using AI tools.
Speaking on condition of anonymity, the lawyer told the FPJ, “AI picked random information from a real judgment and presented it as a summary.”
On deeper examination, he found that AI had distorted facts, merged different cited case laws, and even altered basic details. “One case I relied on was from 2018, but AI gave the year as 2015 because one of the cases referred to in the judgment was from 2015,” he said. AI had also mixed up names and legal principles drawn from the judgment, producing a fictitious final output.
Experts Call for Caution, Not Blind Reliance
Another lawyer, Tushar Lavhate, said he uses AI tools regularly. “AI is a tool; one can use it for drafting. It relies on available information. The output is not entirely fictitious, but one needs to be cautious,” he said.
Verification Makes AI Research Redundant, Say Lawyers
Another lawyer said he had experimented with AI for drafting and research but found much of the material unreliable. “Information gathered through AI must be verified before use. It may be accurate, but without cross-checking against actual data, one cannot depend on it. This verification makes AI almost redundant for legal research,” he added.