Canadian police using tech that tries to predict potential crimes, monitor private chat rooms, report finds

University of Toronto report calls for greater oversight of so-called 'algorithmic policing'

Lights on an internet switch are lit up in an office in Ottawa in 2011. (Adrian Wyld/Canadian Press)

A new report out of the University of Toronto raises concerns about the Waterloo Region Police Service's use of surveillance technology to monitor people's online conversations.

The report specifically mentions the police service's use of an algorithmic social media surveillance tool, the ICAC Child On-line Protection System (ICACCOPS), which appears to scrape conversations from online chat rooms.
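
The report does not describe how ICACCOPS works internally, so the following is only a rough sketch of what "scraping conversations" from a chat room can mean in practice: a program that joins a channel and logs every message it sees. The server and channel names are placeholders, and the protocol (IRC) is chosen purely for illustration.

```python
# Hypothetical sketch only: not based on ICACCOPS, whose internals are not
# public. It shows the general idea of chat room scraping over a bare IRC
# connection: join a channel, log every message.
import socket

def log_channel(host: str, port: int, channel: str) -> None:
    """Connect to an IRC server, join a channel, and print every message."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(b"NICK observer\r\nUSER observer 0 * :observer\r\n")
        sock.sendall(f"JOIN {channel}\r\n".encode())
        buffer = b""
        while True:
            buffer += sock.recv(4096)
            *lines, buffer = buffer.split(b"\r\n")
            for line in lines:
                text = line.decode(errors="replace")
                if text.startswith("PING"):
                    # answer keepalives so the server doesn't drop us
                    sock.sendall(text.replace("PING", "PONG", 1).encode() + b"\r\n")
                elif " PRIVMSG " in text:
                    # an ordinary chat message; a real system would store it
                    print(text)

# log_channel("irc.example.org", 6667, "#example-room")  # placeholder server
```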

"This is a concerning example that we uncovered in our research," said Kate Robertson, who is a lawyer and research fellow.

She co-authored the report To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada.

"It's an example of one of the automated surveillance technologies that are increasingly, and troublingly, being deployed by Canadian police services, often on a secretive basis," she said. 

In a case highlighted in the report, Robertson said the WRPS reportedly used the ICACCOPS program without a warrant.

"This is a troubling prospect because, from what we've learned about the technology, it appears to enable the interception of private communications in chat rooms," including chat rooms that are password-protected, and where people likely wouldn't expect to be monitored by a police officer, she said.

That investigation also involved the Ontario Provincial Police (OPP), which did not immediately respond to requests for comment.

In the case highlighted in the report, the Crown decided to stay the charge before a court could rule on whether the police techniques were constitutional, Robertson said.

In a statement, Waterloo regional police confirmed the service began using ICACCOPS 10 years ago and now uses it "routinely."

Officers use the technology to investigate internet child exploitation and sexual abuse, and its use is supported by the province, the statement said.

"Regular consultations and ongoing training are conducted with legal counsel to ensure that investigations are conducted in a way that protects the rights of suspects as well as the safety of victims," said police spokesperson Cherri Greeno.

Greeno said judicial authorization is obtained when required in the course of an investigation.

Tech tries to predict future crimes

The chat room-scraping technology is just one of several tools deployed by police services that concern researchers.

Robertson said she and her fellow researchers found examples of police services across Canada that are already using "algorithmic policing" techniques, or that have acquired tools capable of doing so.

Broadly speaking, algorithmic policing includes technology that:

  • Automatically collects and analyzes surveillance data, or
  • Tries to predict future crimes.
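
As a toy illustration of those two categories, and not a depiction of ICACCOPS or any real police system, the sketch below flags collected messages against a keyword watchlist and "predicts" by ranking areas on past incident counts. All names and data are invented.

```python
# Toy illustration of the two categories of algorithmic policing above;
# every name and number here is invented.
from collections import Counter

def flag_messages(messages: list[str], watchlist: set[str]) -> list[str]:
    """Category 1: automatically analyze collected data (keyword flagging)."""
    return [m for m in messages if watchlist & set(m.lower().split())]

def rank_areas(past_incidents: list[str]) -> list[tuple[str, int]]:
    """Category 2: "predict" future crime by ranking areas on past counts."""
    return Counter(past_incidents).most_common()

print(flag_messages(["meet at the dock tonight", "hello"], {"dock"}))
print(rank_areas(["Ward 3", "Ward 3", "Ward 1"]))
```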

The facial recognition software Clearview AI is another example of algorithmic policing, the report said. Clearview AI's powerful technology can unearth personal information — including a person's name, phone number, address or occupation — based on nothing more than a photo.
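
Clearview AI's system is proprietary and the report does not detail its internals, but photo-based identification systems generally reduce each face to a numeric "embedding" and run a nearest-neighbour search over an index built from scraped, labelled photos. Here is a minimal sketch of that general idea, with random vectors standing in for real embeddings:

```python
# General sketch of embedding-based face lookup; not Clearview AI's actual
# method, which is not public. Random vectors stand in for real embeddings.
import numpy as np

rng = np.random.default_rng(0)
index = {                       # invented stand-ins for scraped, labelled photos
    "person_1": rng.normal(size=128),
    "person_2": rng.normal(size=128),
}

def identify(query: np.ndarray) -> str:
    """Return the indexed identity whose embedding is most similar (cosine)."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(index, key=lambda name: cosine(index[name], query))

# A "new photo" of person_2: their embedding plus some noise.
probe = index["person_2"] + rng.normal(scale=0.1, size=128)
print(identify(probe))          # -> person_2
```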

Police services across Canada – including Waterloo regional police – admitted to using Clearview AI this winter. The software is no longer available in Canada after a privacy investigation was announced by provincial and federal authorities.

Perpetuating systemic bias

Too often, Robertson said, courts only hear about these technologies after they have already been in use.

Because of the potential for algorithmic policing to infringe on people's privacy rights, Robertson said a more proactive system of oversight is needed.

She also questions whether it is possible for police officers to use data to predict future crimes at all without perpetuating systemic bias and discrimination.
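
To make that worry concrete, here is a deliberately simplified simulation, with all numbers invented: two areas offend at exactly the same rate, but the historical record over-represents one of them, and a naive model that patrols wherever past data shows the most crime keeps amplifying the initial bias, because crime is only recorded where officers are sent.

```python
# Simplified feedback loop simulation; all rates and counts are invented.
import random

random.seed(0)
TRUE_RATE = 0.3             # identical real offence rate in both areas
counts = {"A": 12, "B": 4}  # biased history: area A was over-policed

for year in range(10):
    # "Predictive" step: patrol wherever past data shows the most crime.
    patrolled = max(counts, key=counts.get)
    # Crime is only *recorded* where police are present to observe it.
    if random.random() < TRUE_RATE:
        counts[patrolled] += 1
    print(f"year {year}: patrolled {patrolled}, recorded {counts}")

# Area A's recorded count keeps growing while B's stays frozen, even
# though both areas offend at exactly the same underlying rate.
```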

She and her colleagues want to see mandatory regulations that require police to consult with the public before using a new technology.

The report noted that, at least so far, algorithmic policing is used less in Canada than in the United States and the United Kingdom.

Robertson said she thinks there's still time to consider the potential costs of these technologies before they become more widespread.

"There is still time to course correct," she said.