The promise of artificial intelligence in the legal system is speed, access, and efficiency. Its peril is something far older: blind trust in authority without verification.
A recent California decision makes that distinction unmistakably clear. A California judge fined lawyer Amir Mostafavi $10,000 after he filed an appeal with fabricated legal authorities generated by ChatGPT.
Of the 23 case quotations cited in his opening brief, 21 did not exist. The judge’s opinion was unsparing, emphasizing that no technology relieves a lawyer of their most basic professional obligation: to personally verify every citation placed before the court. The sanction appears to be the largest imposed by a California court for AI-generated legal fabrications. It also reflects a growing judicial impatience with what one judge described as “arguments based on total fabrication.”
The court did not prohibit the use of AI. Instead, it reinforced a principle every lawyer learns early: tools do not think, lawyers do. The opinion stressed that generative AI is no different from a junior associate, articling student, or law clerk. The output may be helpful. It may be efficient. But it must be checked. Failing to do so is not a technological error – it is professional misconduct and solicitor malpractice.
Courts across Canada and the United States have now confronted lawyers for citing nonexistent cases, fabricated quotations, and misrepresented authorities generated by AI systems. What was once viewed as an anomaly has become a pattern. In response to the accelerating misuse of AI in court filings, courts in both countries have moved to regulate the use of artificial intelligence in court proceedings. While the details are still emerging, the direction is clear: disclosure, accountability, and verification will be central themes.
This mirrors a broader trend. Judges are no longer merely cautioning lawyers about AI misuse – they are sanctioning it. What makes the Mostafavi sanction particularly striking is the contrast between how courts view lawyers and self-represented litigants. Across both countries, self-represented litigants are increasingly using AI tools as a form of virtual lawyer or law clerk. In some cases, that use has produced meaningful access to justice. But the same tools that empower can also mislead. AI hallucinations (i.e., fabricated cases and false quotations) are on the rise.
Courts have shown some leniency toward self-represented litigants, though even they are not immune from warnings and sanctions. Lawyers, however, are held to a higher standard. This creates an inherent tension: the same fabricated citation can lead to very different outcomes, a stern warning for the self-represented litigant versus, for the lawyer, a finding of professional misconduct and potentially the loss of a licence to practise law.
AI is not going away. AI can draft faster, summarize more broadly, and brainstorm more creatively than any single human. Courts know it. Lawyers know it. Clients know it. Self-represented litigants know it.
In the coming years, lawyers who outsource judgment to algorithms will be sanctioned, embarrassed, and, increasingly, regulated out of the practice of law.
The future of law may be augmented by machines. Professional responsibility, however, remains stubbornly human.
Steve Benmor, B.Sc., LL.B., LL.M. (Family Law), C.S., Cert.F.Med., C.Arb., FDRP PC, Acc.D.C., is a full-time Divorce Mediator/Arbitrator and principal lawyer of Benmor Family Law Group, a boutique matrimonial law firm in downtown Toronto. He is a Certified Specialist in Family Law, a Certified Specialist in Parenting Coordination and was admitted as a Fellow to the prestigious International Academy of Family Lawyers. Steve is regularly retained as a Divorce Mediator/Arbitrator and Parenting Coordinator. Steve uses his 30 years of in-depth knowledge of family law, court-room experience and expert problem-solving skills in Divorce Mediation/Arbitration to help spouses reach fair, fast and cooperative divorce settlements without the financial losses, emotional costs and lengthy delays from divorce court.