Civil society groups call on feds to protect privacy, free expression in imminent online harms bill

Ottawa plans to table legislation as soon as possible, says Heritage Canada spokesperson


As Ottawa says it hopes to table online-safety legislation as soon as possible, 13 civil society organizations, including the Canadian Civil Liberties Association (CCLA), have released a joint statement calling on the government to ensure the proposal protects privacy and freedom of expression. 

In their letter, the CCLA, the Arab Canadian Lawyers Association, the Canadian Internet Policy and Public Interest Clinic, and ten other groups working on internet policy, civil liberties, and human rights urged Minister of Canadian Heritage Pablo Rodriguez to ensure his proposal does not allow for proactive monitoring of online content, infiltration of private encrypted messages, or blocking of websites without judicial authorization. The groups also told the minister that the bill should not require “mandatory takedown windows for most illegal content,” nor introduce new definitions of “targeted harmful content” beyond what is already defined in Canadian law.

The imminent online-safety legislation follows public consultations and the work of an expert advisory group, both of which have informed the government’s approach.

“The Government’s objective is to promote a safe and inclusive online environment for all Canadians,” says David Larose, media relations at Canadian Heritage. “The Government of Canada wants to take the time needed to get this right and draw upon the valuable feedback received throughout its engagement in 2022. It will help inform a legislative and regulatory response that supports an inclusive, free and safe online space for all Canadians.”

“The Government hopes to table legislation as soon as possible,” he says.

In its initial proposal, the federal government focused on five types of online harm: child sexual exploitation, revenge porn, hate speech, terrorism propaganda, and incitement to violence. Some of these are straightforward, while others, such as hate speech, can be highly contextual, says Cara Zwibel, director of the Fundamental Freedoms Program at the CCLA. She says the CCLA and other organizations were concerned that Ottawa would take the same approach as other jurisdictions and impose short timelines within which platforms must remove content flagged by users. This could incentivize platforms to “err on the side of removal” and lean too heavily on taking down content whenever there is a complaint, she says.

Following consultations and the expert advisory group, Zwibel says Ottawa appears now to be leaning toward a “risk-based approach,” rather than concentrating on content removal.

“Instead of implementing a bunch of takedown requirements, they'd be focused on asking the platform to assess and evaluate the risks associated with their platform and the business they run and explain how they're going to mitigate some of those risks and what steps they're going to take.”

In the letter, the signatories list seven actions that the legislation must not allow because they would pose a serious threat to freedom of expression and privacy. These include not requiring platforms to issue reports to police or national security agencies, aside from instances of child exploitation or where there is a risk of imminent violence. The groups say the legislation must not permit government to compel platforms to collect, intercept, or share private communications without a court order; must not authorize website blocking without a court order or appeal rights; and must not mandate “short and inflexible” timeframes for taking down content, except for content related to child exploitation, which poses a risk of imminent harm to a person, or revenge porn.

The groups want the legislation to require transparency from platforms in how they moderate content, to require that platforms maintain an appeal process for removed content, and to “encourage algorithmic transparency for the purposes of research and investigation.” They also want the legislation to encourage platforms to develop tools empowering users to easily block others or lock down their accounts, and to ensure that conduct standards focus on “patterns of behaviour based on reasonable risk assessments” rather than “a standard of perfection where any mistake is subject to a penalty.”

Larose notes that some of the civil-society organizations who signed the joint letter participated in the consultation process. Heritage Canada has published a report on the process.
