AI for Lawyers: It’s an Accelerator and a Danger

Artificial intelligence is rapidly transforming legal practice, offering lawyers powerful tools for research, drafting, and efficiency. AI platforms can summarize case law, generate first drafts of briefs, and identify legal arguments in seconds—tasks that traditionally required hours of billable time. Used properly, these tools can enhance productivity and allow attorneys to focus on strategy and client advocacy. However, the risks are significant, particularly when lawyers rely on AI outputs without independent verification.
The most well-known example is Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023). In that case, plaintiff’s counsel submitted a brief containing multiple non-existent judicial decisions generated by ChatGPT. When opposing counsel and the court could not locate the cited authorities, the attorneys doubled down, even submitting fabricated “opinions.” Judge P. Kevin Castel ultimately sanctioned the lawyers, finding bad faith and imposing monetary penalties.
A second example involves a Massachusetts attorney sanctioned for similar conduct. There, the lawyer filed pleadings containing fictitious case citations produced by an AI tool. The court imposed sanctions and emphasized that attorneys have a non-delegable duty to verify legal authorities before submitting filings. Such incidents are not isolated: courts across jurisdictions have increasingly encountered AI-generated “hallucinations,” with dozens of reported cases involving fabricated citations and growing judicial frustration.
These cases share a common pattern: lawyers used AI to accelerate research, failed to verify the results, and then presented false authorities to the court. Judges have made clear that reliance on AI does not excuse violations of Rule 11 or professional responsibility duties. As one court noted, citing a fake opinion cannot constitute a good-faith legal argument.
The lesson is straightforward. AI can be a valuable assistant, but it is not a substitute for legal judgment. Competent representation now requires not only understanding the law, but also understanding the limitations of the tools used to practice it.