Enterprise tools drive efficiencies and insights, but AI-generated filings cause chaos: lawyers
Putting off learning about AI is like trying to jump onto a moving train after it has picked up speed, and estates lawyers who wait risk being left behind, says Doug Higgins.
Higgins, an estates litigator at Hull & Hull in Toronto, says his files now demand AI fluency. Wills and estates work has always been document-heavy, but he no longer trusts manual review alone. “You’re trying to distill the entirety of a person’s life,” he says, and that now means wrestling with email chains, incomplete financial records, medical files, and draft pleadings that must be organized, checked, and stress-tested before they ever reach a courtroom or a mediation table.
At intake, Higgins uses AI as a blunt tool to expose holes, not as a ghostwriter. He loads the client’s data into his firm’s enterprise tool and asks it to search ruthlessly for omissions. “It really is a super helpful second set of eyes that we can use to say, okay, ‘Here’s what we have for our client file, where are the missing pieces here?’” he says. Overlooked accounts, unexplained gaps in correspondence, and contradictory details surface much earlier, allowing him to decide quickly which new records he still needs instead of discovering those gaps only when a deadline is looming.
Once the basics are set, he uses AI to help develop the litigation strategy by analyzing the entire file. Rather than have juniors build chronologies and outlines from scratch, he runs pleadings, evidence, and drafts through AI and asks it to attack the theory of the case. “Where this actually gets interesting is when we’re using an AI system for its inferential analysis,” he says. He tells the system what each piece of evidence is supposed to prove, then asks, “What am I missing here? What’s actually not explicitly stated in these records that might be important for us at mediation?” He wants the machine to flag the same weak links that a good opponent would, only months earlier in the file’s life.
He applies the same pressure to massive records that once overwhelmed human review, including medical and financial histories at the end of life. Instead of reading thousands of pages straight through, he asks AI where the clusters of events, anomalies, and unexplained changes actually sit, using it to map the file before diving into the details. “It’s going to go in, and I get a really nice heat map of, okay, here’s where I want to start out focusing my attention,” he says. Peripheral details that would be easy to miss in a linear read become visible when the system pulls them together across time.
Higgins says the same discipline applies to solicitor estates work, where AI can preemptively flag inconsistencies before documents are filed. He cites a certificate of appointment as an example: “You might get something back from the registrar saying, ‘Hey, sorry. There’s a mismatch between the date of birth on the application and some of the deceased’s records.’ And a small thing like that can set an estate back easily eight to 10 weeks.” Those delays, he notes, are avoidable if an AI system catches the mismatch before the registrar does.
With those time savings, Higgins says estate lawyers can invest more effort in client communication, a chronic weak point in many practices. Rather than letting efficiency gains vanish into higher file volumes, he argues they should be spent on calls, reporting, and explanations that keep families informed and reduce the risk that clients feel shut out of the process.
Halifax estates lawyer Benjamin Carver agrees that AI can help lawyers be more efficient, but he sees a dark side to the public’s access to consumer AI tools. He says more self-represented parties are relying on AI to generate filings in bitter family disputes, believing that polished volume equals strength. “We’ve had a couple of situations here where it’s been very apparent that opposing, self-represented parties are using AI, not just to generate their court filings, but also to make written legal arguments and also prepare affidavits, setting out evidence,” he says.
The result, Carver says, is a surge of material that looks professional but demands painstaking verification. When he receives briefs from counsel, he can usually assume the cases cited are real and jurisdictionally relevant. With AI-assisted self-represented litigants, he has seen submissions that mix hallucinated case law with genuine decisions and lean on superficially plausible authorities from foreign courts that have little to do with Nova Scotia. “It creates an additional burden to review and verify the accuracy and validity of what’s being filed,” he says, and that extra work translates directly into cost for his clients.
The courts are exposed to the same risk, with little infrastructure to respond. In Nova Scotia, where most of his estate litigation takes place, “there are no restrictions, no specific requirements relating to AI use,” he says. In one of his matters, a judge insisted that a self-represented litigant disclose the full extent of his AI use under cross-examination. Yet, even then, the cost award “represented only a fraction of our client’s actual expense,” he says, underscoring how little recourse parties have when AI-driven filings explode the size of the record.
Only the Provincial Court in Nova Scotia, which handles certain criminal and family matters, requires disclosure where AI is used. The Nova Scotia Supreme Court – where most estate litigation takes place – has issued guidance urging caution. Still, there are no certification requirements and no formal mechanisms for challenging AI-generated material. The Probate Court does not currently have any AI disclosure requirements, leaving judges to manage these disputes with analog tools in a digital fight.
Carver also warns his clients about risks outside the courtroom. He strongly discourages people from pasting privileged correspondence, legal advice, or strategy into public chatbots, citing cases in which materials generated by a commercial AI platform were held not to be privileged and became fully discoverable. “There have been some really embarrassing disclosures of chat logs in other jurisdictions,” he says, and he expects similar disputes to reach Canadian courts.
He is now just as concerned about what AI does to the evidence base in a field that already depends heavily on documents and memories. Generative tools make it cheap and easy to alter images, fabricate convincing visuals, and reshape timelines. Carver says litigators are increasingly skeptical of the authenticity of the exhibits being tendered. In estate disputes, where the key witness is dead and events are often years in the past, a clean-looking document or photo can be persuasive long before anyone has the time or budget to test it forensically.
AI-drafted wills and powers of attorney are also proliferating quietly, and Carver argues that their surface quality hides deeper flaws. “Most people’s circumstances are more complex than they appreciate,” he says. Tax questions, blended families, business assets, and beneficiaries with special needs rarely fit into generic prompts, and consumer tools “do not always know all the right questions to ask,” he says. Documents that read well but ignore those realities set the stage for the next wave of litigation, as courts are asked to make sense of plans that never properly accounted for the client’s real life.
Both Higgins and Carver agree that estate lawyers cannot sit out these changes and hope to catch up later. Higgins treats AI literacy as part of basic competence in modern estates practice. Carver, who says some colleagues think he is overly optimistic about the long-term benefits of AI to legal practice, still stresses one rule that does not change with the technology: “Verify everything. Every case citation, every statutory reference, every factual assertion. The technology does not come with professional judgment built in.”