Case: Buckeye Trust v. PCIT-1 Bangalore
Citation: ITA No. 1051/Bang/2024
Date: December 2024
Jurisdiction: Bangalore, India
Court: Income Tax Appellate Tribunal (ITAT)
Facts: The case concerned the tax treatment of a transfer of partnership interest to a trust, valued at ₹699 crores. The trust's legal team argued that a partnership interest is not 'property' under the tax law. The tribunal equated partnership interests with shares and taxed the transfer, contrary to usual precedent.
AI failure: The tribunal's order cited three fictional Supreme Court judgments and a fabricated Madras High Court judgment. These non-existent cases were reportedly generated by ChatGPT and mistakenly included in the ruling. Additionally, four judgments referenced elsewhere in the order could not be found in official archives.
Court's finding: The ITAT failed to verify the citations and incorporated fake authorities into its own judgment, demonstrating an absence of due diligence when using AI outputs.
Sanctions: No explicit sanctions were noted. The ruling itself became erroneous through its reliance on hallucinated authorities, damaging the tribunal's credibility.
Takeaway: AI outputs must be cross-checked; judicial bodies cannot rely blindly on generative AI without verification.
Case: Wadsworth v. Walmart Inc.
Citation: Case No. 2:23-CV-118-KHR
Date: February 24, 2025
Jurisdiction: United States
Court: U.S. District Court
Facts: The plaintiffs sued Walmart, alleging that a defective hoverboard caught fire and destroyed their home. Attorney Ayala drafted motions using an AI platform ("MX2.law") and inserted AI-generated citations without verifying them.
AI failure: Of nine cases cited, eight did not exist. Ayala had uploaded the motion into an AI system to add supporting cases automatically, then filed it without providing copies to co-counsel.
Court's finding: Judge Rankin held that attorneys remain responsible for their filings regardless of the AI tools used; AI can be beneficial only with proper verification.
Sanctions: Ayala's pro hac vice admission was revoked and he was fined $3,000; Morgan was fined $1,000; Goody (local counsel) was fined $1,000. Policies were ordered to prevent recurrence.
Takeaway: Even where AI drafting is delegated, a signature carries responsibility: attorneys must verify every authority before filing.
Case: Mata v. Avianca
Citation: 22-cv-1461 (S.D.N.Y. 2023)
Date: 2023
Jurisdiction: United States
Court: U.S. District Court, Southern District of New York
Facts: The plaintiff sued Avianca Airlines for personal injury. His attorneys submitted a brief citing six non-existent cases, complete with quotations and internal citations fabricated by ChatGPT.
AI failure: The lawyers relied entirely on ChatGPT for case authority; the tool even 'assured' them the cases were authentic, and the attorneys failed to verify independently.
Court's finding: The court called the brief 'bogus', containing fake decisions, fake quotes, and fake citations, and stressed that judicial integrity requires human-checked references.
Sanctions: A $5,000 sanction was imposed on the attorneys and their firm, who were also ordered to notify the judges falsely credited with the fabricated opinions.
Takeaway: Technology cannot excuse legal malpractice; AI hallucinations, relied upon blindly, destroy professional credibility.
Case: Lacey v. State Farm Gen. Ins. Co.
Citation: No. 2:24-cv-05205-FMO-MAA, 2025 WL 1363069 (C.D. Cal. May 5, 2025)
Date: May 5, 2025
Jurisdiction: United States
Court: U.S. District Court for the Central District of California
Facts: Jackie Lacey, former Los Angeles District Attorney, sued State Farm over a denied professional liability policy claim. During discovery, her counsel's filings contained fabricated cases and incorrect quotations; the attorneys had used AI tools including CoCounsel, Westlaw Precision Drafting, and Google Gemini.
AI failure: Fake case citations and misquoted law were submitted. Even the revised filings, produced after the court's warnings, repeated the same mistakes, showing reckless disregard.
Court's finding: The Special Master held the conduct 'tantamount to bad faith': AI cannot replace human judgment, and Rule 11 demands verification of legal citations.
Sanctions: The faulty brief was stricken and the discovery motion denied; the firms Ellis George and K&L Gates were jointly assessed $31,100 in fees.
Takeaway: Courts are willing to impose heavy penalties for repeated AI-caused misinformation, even without intent.
Case: Patricia Bevins v. Colgate-Palmolive Co. and BJ's Wholesale Club
Citation: Case No. 2:25-cv-00576
Date: April 10, 2025
Jurisdiction: United States
Court: U.S. District Court, Eastern District of Pennsylvania
Facts: In this product liability action, Attorney Palazzo was sanctioned for submitting briefs containing erroneous and fabricated citations.
AI failure: The referenced cases either did not exist or were altered with inaccuracies, likely because the briefs were drafted with AI and filed without verification.
Court's finding: The court emphasized that many attorneys wrongly treat AI as a replacement for legal research; accuracy must come first.
Sanctions: Sanctions were imposed and additional judicial scrutiny was applied.
Takeaway: AI hallucinations are becoming common; structural safeguards and ethical training are needed in legal practice.