Workplace AI use in Canada speeding up while laws lag behind: report

'You need to have some sort of a compliance program around the use of AI,' says lawyer

The adoption of generative AI tools such as ChatGPT in the workplace has grown quickly since ChatGPT’s public launch on November 30, 2022.

The number of people who use generative AI at work grew 16% in the last six months, which represents a 32% annual growth rate, according to a new report by KPMG International.
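(For readers checking the arithmetic: the annual figure appears to be a simple doubling of the six-month rate, an inference on our part rather than a method stated in the report; compounding the same six-month growth would give a slightly higher number.)

\[
2 \times 16\% = 32\% \quad \text{(linear annualization)}, \qquad (1 + 0.16)^2 - 1 \approx 34.6\% \quad \text{(compounded)}
\]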

Other measures also rose significantly between May and October, including workers’ perceptions of generative AI. For example, 90% of respondents said generative AI has enhanced the quality of their work, up from 84% in May, and 72% said generative AI is essential to addressing their workload, up from 65% in May.

Federal laws around AI

While this is good news for productivity, it comes with caveats, said David Krebs, national co-leader of Miller Thomson’s privacy and cybersecurity practice.

Bill C-27, which is currently making its way through the House of Commons, proposes to amend the federal Personal Information Protection and Electronic Documents Act (PIPEDA) and to introduce the Artificial Intelligence and Data Act (AIDA), placing limits on how AI is used in Canadian workplaces. When those laws are adopted, Krebs said, it will be a “big deal” for organizations that are already using AI improperly.

“As an employer, as a business using these technologies, you have to keep an eye on what's coming down the pike,” Krebs told HRD. “One big concern, generally for any department, but I think especially for an HR department, would be the disclosure of confidential information, disclosure of personal employee information.”

Workplace AI use outpacing rules

Among the respondents currently using generative AI in the workplace, the KPMG report found that 76% use public generative AI tools, while 24% use privately developed generative AI platforms.

Also of concern: 13% have entered private financial data about their companies into generative AI tools.

Further, 56% of the more than 4,500 Canadians surveyed reported “failing to verify outputs that may be inaccurate, misleading, or biased”, the report said.

Bill C-27, currently under discussion in the House of Commons, will likely place restrictions on “high-impact” uses of AI, and employment decisions are considered high impact, Krebs said.

“You need to have basically some sort of a compliance program around the use of AI,” he said.

“There's workplace monitoring always, but if you're then taking that information … using that data in a tool that is powered by AI, that will then give you certain outputs that have an effect on employment, that will be considered high impact. And with high impact, there's accountability, there's transparency, there has to be good human oversight.”

Human oversight a main factor in upcoming workplace AI laws

In a December 7, 2023 statement, the Office of the Privacy Commissioner of Canada (OPCC) laid out its principles for the responsible use of generative AI, advising that organizations using generative AI in their activities, while not yet subject to AI-specific legislation, still fall under Canada’s existing privacy laws.

It stressed that organizations are responsible for their own privacy compliance, highlighting the following points:

  • “Know that accountability for decisions rests with the organization, and not with any kind of automated system used to support the decision-making process.
  • Ensure that impacted individuals are provided with an effective challenge mechanism for any administrative or otherwise significant decision made about them. This includes maintaining and providing on request sufficient information for that person to be able to understand how a decision was reached, and allowing them the opportunity to request human review and/or re-consideration of the decision.
  • If the outputs of a generative AI system are not meaningfully explainable, consider whether the proposed use is appropriate.”

The OPCC also made special mention of the “unique impact on vulnerable groups” that generative AI use can have, pointing out that organizations using the technology share responsibility for preventing risks and bias against historically vulnerable groups.

Start regulating AI use in the workplace now

Although Bill C-27 would not come into force until at least 2024, the limits it would introduce are substantial, so it behooves HR professionals to begin preparing for those changes now, said Krebs.

“The tools will keep developing more quickly than the legislation will be in force – that's going to keep happening because we're not going to have a finalized law enforced in Canada in 2024, but a lot is probably going to happen in terms of the adoption and the potential use cases, for not just generative AI, but other AI tools,” he said, adding that the laws will become more specific around compliance practices.

For now, he recommends that organizations put robust policies and procedures in place that address the appropriate use of generative AI, including limits and restrictions.

“I think businesses should act as though this was all coming into force,” he said. “A policy might say, ‘We do not use ChatGPT here at all for work-related projects’ or ‘We will only use it as a research tool internally, as a trial-and-error kind of tool to figure out how we can use this going forward.’”

Also, HR professionals should keep in mind that while PIPEDA currently does not contain any express provisions dealing with automated decisions, its principles can be applied to those uses. Furthermore, both Bill C-27 and Quebec’s new privacy law, Law 25, contemplate requirements that would apply to such automated decisions.

“There has to be transparency, so if somebody says, ‘Why was this decision made?’, you have to be able to look behind the technology to tell that individual how it was made. That decision can't just be ‘Well, the computer spit it out,’” said Krebs.
