Artificial intelligence hallucinations in legal research may become a thing of the past

Tools such as retrieval-augmented generation can address fears about using AI

Monica Goyal

Most readers now know about the lawyer who turned to ChatGPT to do legal research. Steven Schwartz, a New York lawyer, admitted he used the AI tool, which generates realistic text from user prompts, to help with his legal research. He did not verify the cases ChatGPT provided and unknowingly cited several non-existent cases in his brief. Opposing counsel and the court caught the fabrication and brought the fake citations to Mr. Schwartz’s attention. Instead of verifying the cases in a legal research database like Westlaw, Mr. Schwartz went back to ChatGPT and asked the tool to confirm that the cases were real. And ChatGPT, which is incapable of knowing what is true, said yes. Mr. Schwartz has since been sanctioned for misconduct.

After Mr. Schwartz’s cautionary tale, you would think lawyers would be more careful about using tools like ChatGPT and would surely double-check the veracity of their work. Still, lawyers continue to make this mistake. Recently, opposing counsel in a BC case found fake citations in a lawyer’s submissions. The Law Society of BC is currently investigating the lawyer in question.

The issues for Mr. Schwartz arose because he used ChatGPT believing it was like a Google internet search. However, unlike a search engine, ChatGPT is a mathematical model that emulates how people generate text (generative AI), so it will occasionally make up facts, such as case citations. This tendency is known as hallucination.

Generative AI technologies could nevertheless be beneficial for legal work. Lawyers generate a great deal of text. In fact, at our law firm, we use several solutions that assist with legal work and business legal operations. But we are careful and train our lawyers to use these tools properly.

The ChatGPT cases have sparked debate about the ethical and professional implications of using generative AI tools in legal practice and in court. Some courts now require that any use of AI be accompanied by a certification from the lawyer confirming that all cited cases have been verified. The Chief Justice of the BC Supreme Court issued a directive telling judges not to use AI. State bars and law societies have said that lawyers should be careful and responsible when incorporating technology into their work. Recently, The Florida Bar approved Ethics Advisory Opinion 24-1, which says that lawyers may ethically use generative AI if they can guarantee compliance with their ethical obligations, including the duty of client confidentiality. I agree with many commentators that generative AI tools could be helpful and innovative if used correctly and with adequate training.

I have met many lawyers who are wary of generative AI because of cases like Mr. Schwartz’s and the problem of hallucinations. However, AI techniques that address the issue are quickly emerging. Retrieval-augmented generation (RAG) is one such solution. RAG is a technique that enhances the output of large language models (LLMs), like GPT, by retrieving relevant information from external sources. RAG aims to overcome hallucinations by providing LLMs with additional context and facts that improve the accuracy, reliability, and trustworthiness of their responses. Think of RAG as a research assistant for an LLM, fetching relevant case law and precedents before the model writes. Systems that employ RAG can cite their sources, so users can click through and confirm that the generated text is correct, reducing inaccuracies and improving trust in generative AI.
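To make the idea concrete, here is a minimal sketch of a RAG pipeline in Python. The corpus, case names, and keyword-overlap scoring are invented for illustration and do not represent any real legal database or vendor product; production systems use vector embeddings and a commercial LLM, but the grounding principle is the same.

```python
# Minimal RAG sketch. All case names and summaries below are hypothetical;
# the LLM call is left as a prompt printout rather than a real API request.

from dataclasses import dataclass

@dataclass
class Document:
    citation: str   # a citation the user can independently verify
    text: str       # the passage the model may draw on

CORPUS = [
    Document("Example v. Sample, 2023 ONSC 0000",
             "Summary judgment granted where no genuine issue required trial."),
    Document("Demo v. Placeholder, 2022 BCSC 0000",
             "Costs awarded against a party who relied on fabricated authorities."),
]

def retrieve(query: str, corpus: list[Document], k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query.
    Real systems use vector embeddings, but the point is the same:
    fetch genuine source text before the model writes anything."""
    terms = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(terms & set(d.text.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[Document]) -> str:
    """Ground the model: instruct it to answer only from the retrieved
    passages and to cite them, so every claim traces to a checkable source."""
    context = "\n".join(f"[{d.citation}] {d.text}" for d in docs)
    return (
        "Answer using only the sources below. Cite the bracketed citation "
        "for every proposition; if the sources are silent, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )

if __name__ == "__main__":
    question = "When is summary judgment granted?"
    grounding = retrieve(question, CORPUS)
    prompt = build_prompt(question, grounding)
    print(prompt)  # in a real system, this grounded prompt is sent to the LLM
```

The key design point is that the model's answer is constrained to retrieved, citable sources, which is what allows a user to click through and verify the output instead of taking the model's word for it.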

Several legal tech solutions incorporating RAG are already emerging, and they greatly reduce the problem of hallucinations. This technology will likely have a significant impact on legal research and document drafting, both of which can be very time-consuming. Imagine instead drafting a legal research memo that pulls in your evidence and case law with accurate citations.

Lexis+ and Paxton.AI are examples of legal tech solutions incorporating RAG technology. As AI technology evolves quickly, it may soon render attempts by courts and law societies to limit AI use irrelevant. It appears that technology will solve its own problems.
