Algorithmic policing risks intensifying systemic racism, harming privacy and Charter rights: report

The technology is currently being used and developed in Canada, said the report

A new report on algorithmic policing technologies is calling for a moratorium on their use until governments carry out a comprehensive examination of the technologies’ human rights implications and the necessary legal reforms.

The Citizen Lab at the University of Toronto's Munk School and the International Human Rights Program at the University of Toronto's Faculty of Law released the report, “To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada,” on Tuesday. The report states that two police services – Vancouver and Saskatoon – have confirmed they are currently using or developing predictive algorithmic technologies, and other police forces have acquired technologies that provide that capability.

The report warns that these technologies risk reinforcing systemic bias against Black and Indigenous people and threaten the privacy and Charter rights of everyone. The Canadian legal system currently lacks sufficient safeguards to ensure algorithmic policing is applied constitutionally and with proper regulatory, judicial and legislative oversight, the report states.

“The top line finding is that there are algorithmic policing technologies being used and under development and consideration in Canada,” says Cynthia Khoo, a research fellow at the Citizen Lab and a technology and human rights lawyer.

“There's enough evidence to show that there's a tremendous risk of human rights violations if we're not careful about the implementation of these technologies, and in deciding whether you even use them at all.”

Algorithmic policing technologies are a variety of tools that draw inferences from mass data processing to predict potential unlawful activity or analyse data through automated surveillance. The technology complements traditional investigative methods and allows police to allocate resources more effectively. Facial recognition, automated licence plate readers and social media surveillance algorithms are forms of this technology. In general, its use is more widespread in the U.S. and the U.K. than in Canada, said the report.
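
To make the “predictive” category concrete, here is a minimal sketch of how a hot-spot scoring tool might rank neighbourhoods from historical incident records. This is a hypothetical illustration only; the report does not describe any specific vendor’s implementation, and the place names and records are invented.

```python
from collections import Counter

# Hypothetical historical incident records: (neighbourhood, incident_type).
# In a real deployment these would come from police records management systems.
historical_incidents = [
    ("north_end", "theft"), ("north_end", "assault"),
    ("north_end", "theft"), ("downtown", "theft"),
    ("west_side", "mischief"),
]

def hot_spot_scores(incidents):
    """Score each neighbourhood by its share of recorded incidents.

    Note: this scores *recorded* incidents, not actual crime -- any skew
    in where police historically looked is carried straight into the ranking.
    """
    counts = Counter(location for location, _ in incidents)
    total = sum(counts.values())
    return {location: n / total for location, n in counts.items()}

# Rank neighbourhoods for patrol allocation, highest score first.
for location, score in sorted(hot_spot_scores(historical_incidents).items(),
                              key=lambda item: item[1], reverse=True):
    print(f"{location}: {score:.2f}")
```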

The report warns that, because of the use of historical police data, historically marginalized groups may find themselves fed through a “negative feedback loop.” Past systemic bias will be multiplied because the algorithm will read the historical bias as grounds to label those groups a heightened risk.
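
The mechanics of that loop can be sketched in a few lines of code. In this hypothetical simulation (ours, not the report’s), two areas have identical true offence rates, but one starts with more recorded incidents because of historical over-policing; since patrols follow the records and new records follow the patrols, the initial disparity compounds each cycle:

```python
# Hypothetical simulation of a self-reinforcing policing feedback loop.
# Areas "A" and "B" have the SAME true offence rate, but area A starts
# with more recorded incidents due to historical over-policing.
recorded = {"A": 60, "B": 40}
TRUE_RATE = 10  # offences actually occurring per cycle in EACH area

for cycle in range(1, 6):
    # The "algorithm": send patrols to the area with the most recorded incidents.
    target = max(recorded, key=recorded.get)
    # Offences are only recorded where police are looking, so only the
    # targeted area's count grows -- the initial disparity compounds.
    recorded[target] += TRUE_RATE
    share = recorded["A"] / sum(recorded.values())
    print(f"cycle {cycle}: patrols -> {target}, area A now {share:.0%} of records")
```

After a few cycles the targeted area dominates the records entirely, even though nothing about underlying behaviour differs between the two areas.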

“There are critical questions to be answered regarding whether algorithmic surveillance technologies should be used at all given known problems regarding inaccuracy, ineffectiveness, and biases in the technologies,” says Kate Robertson, report co-author, Citizen Lab research fellow and criminal defence lawyer at Markson Law in Toronto.

An example of the technologies’ potential to intensify bias is their use of data from police stops, said the report. A 2019 report on Halifax street checks found police were six times more likely to target Black people than White people. Similar trends are found with “carding” in Ontario, and although the practice was regulated in 2017, pre-2017 data remains accessible to police and would populate the data sets on which the algorithms operate.

Citizen Lab interviewed Jonathan Rudin, program director at Aboriginal Legal Services, for the report. Rudin told Canadian Lawyer that although technologies that rely on historical police data “purport to develop more objective and precise predictive policing,” their use will just perpetuate historical injustices experienced by the Indigenous community, “while giving them a pseudo-scientific veneer.”

“We know from the lived experience of members of the Indigenous community, from government study after government study, and from decisions of the Supreme Court, that the root causes of Indigenous criminal behaviour lie in the impact of historical and current government policies and the persistence of systemic discrimination in the justice system,” says Rudin.

The technology would also be influenced by inaccurate data, the report argued, pointing to gang databases. One researcher from a Toronto-based community service organization cited in the report said young people are entered into gang databases based on “unverified information” and “incorrect assumptions” from their school, local service providers and police. University of Toronto criminologist Dr. Scot Wortley has noted that the law-abiding friends and family of gang members can also be classified as “gang involved,” said the report.

Arrest data is also suspect. Though used in algorithmic policing technologies, arrest data are not “factually or legally accurate representations of criminal activity,” said the report. Those arrested may soon be released, have their charges dropped or be acquitted.
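
As a hypothetical illustration (the records below are invented), a few lines of code show how treating every arrest as confirmed criminal activity inflates a data set relative to what courts actually found:

```python
# Hypothetical arrest records paired with their eventual legal outcome.
arrests = [
    {"id": 1, "outcome": "convicted"},
    {"id": 2, "outcome": "charges_dropped"},
    {"id": 3, "outcome": "acquitted"},
    {"id": 4, "outcome": "released_without_charge"},
    {"id": 5, "outcome": "convicted"},
]

# A naive data set treats every arrest as evidence of crime...
print("arrest records:", len(arrests))

# ...but only a fraction reflect conduct a court actually found unlawful.
convicted = [record for record in arrests if record["outcome"] == "convicted"]
print("convictions:", len(convicted))
```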

If algorithmic surveillance technologies are used, Robertson says the current warrant provisions in the Criminal Code need to be re-examined.

“Privacy intrusions under section 8 of the Charter are not restricted to police searches of electronic devices owned by private individuals,” she says. “Online surveillance techniques from police computers can also pose significant and unique privacy threats. Safeguards would also be needed to provide sufficient notice to all third parties who are unknowingly affected by these sweeping surveillance technologies. Warrant provisions must also take into account the privacy impacts of algorithmic analysis of data that is already in a police service’s possession.”

Khoo and Robertson authored the report with Yolanda Song, a civil litigation lawyer at Stevenson Whelton LLP in Toronto and a pro bono research associate at the IHRP.
