In the first decision of its kind, a New York federal judge has sanctioned two lawyers and their law firm for submitting fabricated case law generated by ChatGPT, a generative artificial intelligence chatbot built on a large language model.1 The judge also dismissed the underlying lawsuit as time-barred under Article 35 of the Montreal Convention.2
The lawsuit in the U.S. District Court for the Southern District of New York arose from an alleged personal injury during an international flight from El Salvador to New York. The defendant foreign air carrier filed a motion to dismiss the action as time-barred because it was not commenced within two years of the incident as required by Article 35 of the Montreal Convention, which exclusively governed the case. Anticipating that the plaintiff would argue that the air carrier’s commencement of a voluntary bankruptcy proceeding had tolled the limitations period, the air carrier argued that case law uniformly holds that courts refuse to apply tolling rules to extend the time for filing claims arising under the Montreal Convention.
In opposition, the plaintiff cited a purported federal “case” that would have supported the plaintiff’s position because it ostensibly held that a bankruptcy stay tolls the Article 35 limitations period. However, defense counsel could not locate the case by citation or case name. The plaintiff’s brief contained several other internal citations and lengthy quotations from another case that could not be located, and many of the other cases cited in the brief similarly appeared to be non-existent.
After the air carrier informed the court and plaintiff’s counsel that it could not find most of the cases cited in the opposition brief, District Judge P. Kevin Castel issued an order directing the plaintiff’s attorney to submit copies of those cases to the court. In response, the plaintiff’s attorney submitted an affidavit that contained partial copies of the purported cases, including the “case” mentioned above that, at first blush, appeared to be directly on point in support of plaintiff’s position.
The “case” contained certain attributes that suggested it was real: it displayed a caption and a citation to the Federal Reporter; it identified the judges who sat on the appellate panel, including a judge from another court supposedly sitting by designation; and it contained an introductory section and a lengthy discussion of the plaintiff’s international travel. Most importantly, it included a quotation, purportedly from another case, which was, on its face, directly on point and fully supported plaintiff’s position.
But these details were not real, the quote was not real, and the case was not real. Five of the other cases attached to the plaintiff’s attorney’s affidavit were similarly fabricated. Judge Castel thus issued an order directing the plaintiff’s attorney and his law firm to show cause why he should not be sanctioned for citing non-existent cases in the brief and for submitting “bogus judicial decisions with bogus quotes and bogus internal citations.”3
The plaintiff’s attorney submitted a written response admitting that he had conducted his legal research using ChatGPT. He stated that ChatGPT provided the fabricated citations and cases and that he “was unaware of the possibility that its content could be false.”4 The court then held a hearing to obtain more information about how plaintiff’s attorneys conducted their legal research and why they did not acknowledge the mistakes after the issue was raised by both the air carrier and the court.
The court has now issued a decision, finding that plaintiff’s attorneys acted in bad faith “when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question.”5 The court accordingly imposed sanctions intended to deter similar conduct.
In a separate opinion issued simultaneously, the court granted the air carrier’s motion (which started the chain of events leading to the sanctions order) and dismissed the action on its merits.6 As discussed above, Article 35 of the Montreal Convention provides a two-year limitations period in which to commence an action arising under the Convention. The alleged incident occurred on August 27, 2019, yet the lawsuit was not commenced until February 2, 2022, more than two years after the incident. The court agreed with the air carrier that the two-year limitations period is a strict condition precedent to bringing a claim. Therefore, unlike a statute of limitations, the two-year time-bar is not subject to equitable tolling. Accordingly, the automatic bankruptcy stay that had been in place during part of the period between August 27, 2019, and February 2, 2022, did not toll the two-year limitations period or extend the time for filing an action under the Montreal Convention.
The sanctions order is a cautionary lesson regarding attorney conduct and legal ethics, and the dismissal order reaffirms the longstanding principle that the Montreal Convention was adopted to promote uniformity among its signatory States, and that application of local rules, such as tolling principles, would undermine that goal.
Disclaimer: This publication is made available for educational purposes only and is not intended as legal advice. If you have questions about any matters in this publication, please contact the authors directly. General inquiries may be directed to email@example.com
1 Mata v. Avianca, Inc., No. 22-cv-1461, 2023 WL 4114965 (S.D.N.Y. June 22, 2023). The authors of this Client Bulletin successfully represented the air carrier.
2 Mata v. Avianca, Inc., No. 22-cv-1461, 2023 WL 4138427 (S.D.N.Y. June 22, 2023).
3 Mata v. Avianca, Inc., No. 22-cv-1461 (S.D.N.Y.), Dkt. 31.
4 Id., Dkt. 32-1.
5 Mata v. Avianca, Inc., 2023 WL 4114965, at *1.
6 Mata v. Avianca, Inc., 2023 WL 4138427, at *1.