York, Peel police now using facial recognition technology

Police services say plan made after consultations with provincial privacy commissioner

Two Greater Toronto Area police services say they are now using facial recognition technology as part of the investigative process. (Shutterstock)

Two Toronto-area police services say they have begun using facial recognition technology as part of their investigations, but some advocates warn the tool comes with real risks to civil liberties.

York Regional Police and Peel Regional Police announced the use of the technology in separate news releases Monday afternoon, saying the company Idemia was selected as the vendor. Both forces say the move follows consultations with the province's information and privacy commissioner.

Peel police say the technology will automate parts of the force's current image comparison process.

"The new system will scan and compare against lawfully-collected digital evidence currently stored in our databases," said Peel police Deputy Chief Nick Milinovich in a statement. "This new technology will not only support our criminal investigations greatly, but it will enable us to run mugshot searches faster with less human error, increasing safety in Peel Region."

Peel police said the images in the organization's existing mugshot database have been stored in accordance with the Identification of Criminals Act, and the technology will not be used to scan or compare against footage like live video from other sources. 

In its news release, York police said the technology allows police to compare images of people identified as suspects or persons of interest with mugshots in the police database, and that images will not be gathered from social media or CCTV footage as part of the program.

Sharing a system between police forces allows for collaboration, the release said, while lowering costs.

"As we're all too aware, criminals don't limit their activity to a single jurisdiction," York police Chief Jim MacSween said in a statement. "Partnering with Peel Regional Police is cost effective and enables us to collaborate more extensively to make both communities safer."

Calgary police gave media a demonstration Monday of how their facial recognition software works. The force has used a version of the technology since 2014. (CBC)

The two forces aren't the first in the Greater Toronto Area to employ facial recognition. The Toronto Police Service used facial recognition technology from the company Clearview AI in dozens of criminal investigations over a span of three and a half months, from October 2019 to February 2020, before being ordered to stop by its police chief.

CBC News has previously reported that officers uploaded more than 2,800 photos to the U.S. company's software to look for a match among the three billion images Clearview AI extracted from public websites, such as Facebook and Instagram, to build its database.

Four Canadian privacy commissioners later determined that Clearview AI conducted mass surveillance and broke Canadian privacy laws by collecting photos of Canadians without their knowledge or consent.

The Office of the Information and Privacy Commissioner of Ontario released public guidance in February for police using facial recognition "to help mitigate potential privacy risks."

There is advice specific to joint programs, the office said in an emailed statement on Wednesday.

"While advancements in facial recognition technology can enhance policing and help police identify investigative leads more efficiently, it is critical to protect fundamental privacy rights and freedom," the statement said.

Improper implementation, the office said, "can increase privacy risks, including the potential for over-retention or misuse of personal information, bias or inaccuracy, and technological or human errors that could result in false recognitions, wrongful arrests, and other types of intrusive investigative scrutiny of Ontarians."

Case for justifying use of technologies doesn't exist: prof

Wendy H. Wong, a University of B.C. Okanagan political science professor who has researched emerging technologies, said on Monday she is surprised that the two police services are using facial recognition.

"I'm not sure the case for justifying the use of such technologies exists. At least it hasn't been articulated by the police departments in their statement. Part of me wants to know why they think they need facial recognition technology at this moment in time," Wong said. 

Wong said such technologies are not always used as intended, which has real consequences for people, and they are often deployed against marginalized people.

Policing is most often concentrated in marginalized communities, and that is where the data will come from, she said. There is also the potential for human error, and police forces have their own biases, she added.

"We know there are inherent risks to using these types of technologies," she added. "The jury is not in yet with regard to what the harms are for having facial data about people." 

Anaïs Bussières McNicoll, director of the fundamental freedoms program and interim director of the privacy, technology and surveillance program of the Canadian Civil Liberties Association, said the group's general position is that the use of facial recognition threatens individuals' privacy rights and their right to be free from unreasonable search and seizure.

She said there is no legislation in Canada that directly addresses the serious risks and challenges caused by adoption and deployment of facial recognition technology.

"Until there are clear and transparent policies and laws regulating the use of facial recognition technology in Canada, it should not be used by law enforcement agencies," she said.

With files from Lane Harrison