AI And Law: When Hallucinated Judgements Enter Courtrooms, Justice Bears The Cost

Indian courts have flagged a troubling rise in AI-generated fake judgements being cited in petitions, forcing judges to verify even basic precedents. While AI can aid research and efficiency, misuse is increasing judicial burden and risking legal integrity, prompting calls for strict accountability and safeguards.

FPJ Web Desk | Updated: Thursday, February 19, 2026, 09:14 PM IST
Courts raise alarm as AI-generated fake case laws surface in legal filings, adding pressure on an already burdened judiciary | Representational Image

Earlier this week, the Supreme Court expressed shock at the growing practice of lawyers filing AI-drafted petitions that cite non-existent judgements. Many of these petitions quote paragraphs that do not appear in any reported judgement.

Unfortunately, such hallucinations are not one-off cases; they have repeatedly surfaced in court hearings across the country, where cited precedents turned out to be fabricated, or rather, AI-generated.

The courts, already severely short of judges, now have the added task of verifying the very existence and wording of every cited precedent, even the genuine ones. This amounts to a forensic audit of pleadings, adding hours of clerical verification to dockets already groaning under arrears.

Judicial burden and ethical breach

As Justice BV Nagarathna of the Supreme Court rightly noted, such a practice imposes an additional burden on the judges. The courts cannot afford to verify each petition individually, yet the practice forces exactly that, shifting scarce judicial time from adjudication to authentication.

This also violates the Advocates Act and Bar Council rules, which treat misleading the court as a breach of ethics, irrespective of whether the lie was crafted by a junior, a clerk or an algorithm.

The Bombay High Court has already imposed costs on a litigant for employing AI tools to ‘generate’ a non-existent judgement, making it clear that petitions resting on unverified AI output will henceforth attract stiff penalties.

AI as tool, not authority

One must also point out that the courts themselves are turning to AI for transcription, translation and search, while insisting it must not “overpower judicial decision-making”. At the same time, AI tools can be used to conduct research, democratising access to case law for smaller chambers and trial courts that lack libraries or paid databases.

Hence, to reduce the burden on the courts, the onus of verifying the authenticity of every petition should rest with the lawyers, and every citation in a pleading should be treated as a personal professional guarantee.

Need for stronger systems and safeguards

Alongside these efforts, courts need stronger internal systems: carefully managed digital archives maintained by experts; citation tools powered by public legal records; and rigorous training in artificial intelligence that goes beyond treating it as a secondary skill.

There are practices worldwide that Indian courts can draw on when formulating norms for lawyers' use of AI tools. For example, South Carolina’s Supreme Court has issued a policy under which lawyers remain fully responsible for the accuracy of all filings and may not treat AI output as legal authority or rely on it without independent verification.

At the same time, no confidential information should be entered into public AI tools unless security, privacy and client consent issues are fully addressed.

Drawing the line

If India wants to be seen as a champion of AI, it needs to draw a clear line: while technology can speed up justice, it cannot be allowed to manufacture the law itself.
