New AI guidelines are only a starting point for Canadian legal professionals

Licensees must remain abreast of broader developments in artificial intelligence

Tara Raissi

Five Canadian law societies have issued guidance for lawyers using generative AI in their practice. The law societies of Alberta, Manitoba, Saskatchewan, British Columbia, and Ontario offer resources on licensees’ professional and ethical obligations when using AI. The guidance ranges from a white paper flagging risks practitioners face to practical tips for implementing AI-driven tools.

Although these resources are a good starting point for lawyers, they are not exhaustive. AI continues to evolve at an unprecedented pace. Gaps in our understanding of the technology, and the fact that its integration into legal practice is still in its early days, make it nearly impossible to anticipate every issue that can arise from its use. To harness AI’s benefits, lawyers must supplement this guidance by keeping up with AI’s evolution, legislative developments, and court findings that address its proper use.

As this technology develops and integrates into legal practice, AI regulatory guidance will grow. In the meantime, lawyers using AI must stay informed, follow the technology’s progress, and fill in the gaps. This will promote AI use that safeguards clients’ interests, upholds lawyers’ professional obligations, and is in line with regulators’ expectations.

The intersection of licensee obligations with AI

AI is not a substitute for lawyers exercising their professional judgment. When used responsibly, AI is a valuable tool that can benefit both lawyers and clients. Automating some of the repetitive tasks that lawyers perform saves time that can be spent on nuanced activities requiring human skill. It falls to lawyers to balance this potential against the technology’s limitations. AI can be unpredictable, glitchy, and incorrect. Depending on its source data, it can hallucinate and invent non-existent output or regurgitate biased or discriminatory responses.

The core of the regulators’ guidance addresses the risk that lawyers will fail to comply with their professional responsibilities when embracing AI. Licensees are reminded of their duties of competence, confidentiality, candour, supervision and delegation, the reasonableness of fees and disbursements, and the duty not to mislead a tribunal. Companion documents and checklists recommend practical steps to help meet those obligations.

The resources available to lawyers emphasize AI’s shortcomings, including the accuracy and reliability of the content it generates. Lawyers can build on this guidance by learning about large language models (LLMs) to understand how AI works. This will better position counsel to anticipate the risks of AI use in their practice. Some of AI’s weaknesses result from the data it is trained on. Aside from hallucinatory responses, AI does not know what it does not know. This becomes obvious when considering its use for legal research. If the dataset it is trained on is limited to a particular jurisdiction, relevant extraterritorial data will be omitted from its output. For example, an Ontario-based lawyer using AI to review a contractual clause for GDPR or EU compliance must first confirm that the tool is trained on GDPR and EU data protection laws.

The statutory framework around AI is an instructive resource for lawyers. In June 2022, the Government of Canada tabled the Artificial Intelligence and Data Act (AIDA) as part of Bill C-27, the Digital Charter Implementation Act. If passed, AIDA would be Canada’s first AI-specific legislation, regulating the technology’s development and adoption. It seeks to mitigate the risks of harm associated with AI systems and emphasizes the significance of human oversight. A review of this proposed legislation provides insight into AI vendor management and practical considerations for lawyers representing both vendors and purchasers of this technology.

Directions from the court

The regulatory guidance speaks to the duty to disclose AI use to clients and its impact on their interests. Lawyers are encouraged to inform clients about the use of such technology and to explain how it was used, the associated risks, and the steps taken to mitigate those risks.

Practice directions issued by the courts shed light on licensee blind spots. Courts and tribunals now require that lawyers disclose whether AI generated any part of their submissions. The Federal Court recently issued an updated notice regarding the use of AI in proceedings. This notice requires that any document prepared for litigation containing AI-generated content include a declaration in its first paragraph providing details about that content. Similarly, court rulings and disciplinary findings involving improper AI use expose the potential harm to lawyers and their clients.

Conclusion

AI can offer many benefits to lawyers, such as greater efficiency and innovation. However, it can also pose challenges and risks that require careful and responsible use. Guidance from legal regulators, though a good starting point, is not enough for lawyers using AI in their daily practice. Licensees must remain abreast of developments in AI, including new releases, the statutory framework around it, and the direction provided by the courts.
