Mata v. Avianca, Inc.
| Mata v. Avianca, Inc. | |
| --- | --- |
| Court | United States District Court for the Southern District of New York |
| Full case name | Roberto Mata, Plaintiff, v. Avianca, Inc., Defendant |
| Decided | June 22, 2023 |
| Docket nos. | 1:22-cv-01461 |
| Citation | 678 F. Supp. 3d 443 |
| Holding | Attorneys sanctioned for using fake case law citations generated by ChatGPT. |
| Judge sitting | P. Kevin Castel |
Mata v. Avianca, Inc. was a U.S. District Court for the Southern District of New York case in which the Court dismissed a personal injury case against the airline Avianca and issued a $5,000 fine to the plaintiff's lawyers, who had submitted fake precedents generated by ChatGPT in their legal briefs.[1]
Background
In February 2022, Mata filed a personal injury lawsuit in the U.S. District Court for the Southern District of New York against Avianca, alleging that he was injured when a metal serving cart struck his knee during an international flight. The plaintiff's lawyers used ChatGPT while preparing a court filing, which cited numerous fake legal cases involving fictitious airlines, complete with fabricated quotations and internal citations.[2][3][4]
Avianca's lawyers notified the Court that they had been "unable to locate" several of the legal cases cited in the filing. The Court could not locate the cases either and ordered the plaintiff's lawyers to provide copies of them. Mata's lawyers provided copies of documents purportedly containing all but one of the legal cases, after ChatGPT assured them that the cases "indeed exist" and "can be found in reputable legal databases such as LexisNexis and Westlaw."[1][5]
Opinion
In June 2023, Judge P. Kevin Castel dismissed the personal injury case against Avianca and ordered the plaintiff's attorneys to pay a $5,000 fine.[1]
Judge Castel noted numerous inconsistencies in the opinion summaries, describing one of the legal analyses as "gibberish."[6] Judge Castel held that Mata's lawyers had acted with "subjective bad faith" sufficient for sanctions under Rule 11 of the Federal Rules of Civil Procedure.[1]
Impact
In July 2024, the American Bar Association issued its first formal ethics opinion on the responsibilities of lawyers using generative AI (GAI). The 15-page opinion outlines how the ABA Model Rules of Professional Conduct apply to the use of GAI in the practice of law.[7][8]
Experts caution that lawyers cannot reasonably rely on the accuracy, completeness, or validity of content generated by GAI tools.[9]
Given the continued use of GAI in the practice of law, Mata has been described by legal professionals as a landmark case;[10][11] it is frequently cited by courts in cases where the use of GAI during proceedings leads to the creation and citation of nonexistent case law.[12][13][14]
References
- ^ a b c d "Mata v. Avianca, Inc.". casemine.com. June 22, 2023. Retrieved May 14, 2025.
- ^ Goswami, Rohan (May 30, 2023). "ChatGPT cited 'bogus' cases for a New York federal court filing. The attorneys involved may face sanctions". CNBC. Archived from the original on May 30, 2023. Retrieved May 30, 2023.
- ^ Neumeister, Larry (June 8, 2023). "Lawyers blame ChatGPT for tricking them into citing bogus case law". Associated Press. Archived from the original on November 8, 2023. Retrieved November 8, 2023.
- ^ "'Use with caution': How ChatGPT landed this US lawyer and his firm in hot water". ABC News. June 24, 2023. Archived from the original on November 9, 2023. Retrieved November 9, 2023.
- ^ Maruf, Ramishah (May 27, 2023). "Lawyer apologizes for fake court citations from ChatGPT | CNN Business". CNN. Retrieved April 25, 2025.
- ^ Brodkin, Jon (June 23, 2023). "Lawyers have real bad day in court after citing fake cases made up by ChatGPT". Ars Technica. Archived from the original on January 26, 2024. Retrieved February 18, 2024.
- ^ "ABA issues first ethics guidance on a lawyer's use of AI tools". www.americanbar.org. Retrieved May 15, 2025.
- ^ Merken, Sara (July 29, 2024). "Lawyers using AI must heed ethics rules, ABA says in first formal guidance". Reuters. Retrieved April 24, 2025.
- ^ "Reviewing Generative Artificial Intelligence False Citations". natlawreview.com. Retrieved May 15, 2025.
- ^ Charlotin, Damien. "AI Hallucination Cases". damiencharlotin.com. Retrieved June 4, 2025.
- ^ Curlin, James (May 2025). "ChatGPT Didn't Write This . . . or Did It? The Emergence of Generative AI in the Legal Field and Lessons from Mata v. Avianca". Arkansas Law Review. 78 (1): 130–132. Retrieved June 4, 2025.
- ^ "Gauthier v. Goodyear Tire & Rubber Co". Casemine. Retrieved June 4, 2025.
- ^ "Benjamin v. Costco Wholesale Corporation". FindLaw. Retrieved June 4, 2025.
- ^ "United States v. Hayes" (PDF). CourtListener. Retrieved June 4, 2025.