Facial recognition technology gains popularity with police, intensifying calls for regulation

RCMP was previously reprimanded for using the technology without the public's knowledge

Caption: In a video posted online, a Peel Regional Police officer is seen demonstrating Idemia's facial-recognition software. (Peel Regional Police)

Some police services in Canada are using facial recognition technology to help solve crimes, while other police forces say human rights and privacy concerns are holding them back from employing the powerful digital tools.
It's this uneven application of the technology — and the loose rules governing its use — that has legal and AI experts calling on the federal government to set national standards.
"Until there's a better handle on the risks involved with the use of this technology, there ought to be a moratorium or a range of prohibitions on how and where it can be used," says Kristen Thomasen, law professor at the University of British Columbia.
As well, the patchwork of regulations on emerging biometric technologies has created situations in which some citizens' privacy rights are more protected than others'.
"I think the fact that we have different police forces taking different steps raises concerns [about] inequities and how people are treated across the country, but [it] also highlights the continuing importance of some kind of federal action to be taken," she said.
Facial recognition systems are a form of biometric technology that use AI to identify people by comparing images or video of their faces — often captured by security cameras — with existing images of them in databases. The technology has been a controversial tool in police hands.
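At its simplest, the comparison step can be thought of as measuring how similar two numeric "fingerprints" of faces are. The short sketch below is purely illustrative, assuming a trained model has already converted each face image into an embedding vector; the vectors, names and 0.8 threshold are invented for the example and do not describe any vendor's actual system.

# Illustrative only: real systems use trained neural networks to turn each
# face image into an "embedding" vector, then compare vectors. All values
# below, including the 0.8 threshold, are made up for demonstration.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # How closely two face embeddings point in the same direction (1.0 = identical direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: one from a security-camera still, two from a database.
probe = np.array([0.12, 0.87, 0.33, 0.51])
database = {
    "person_A": np.array([0.10, 0.90, 0.30, 0.50]),
    "person_B": np.array([0.85, 0.10, 0.60, 0.05]),
}

MATCH_THRESHOLD = 0.8  # arbitrary cut-off chosen for this example
for name, stored in database.items():
    score = cosine_similarity(probe, stored)
    verdict = "possible match" if score >= MATCH_THRESHOLD else "no match"
    print(f"{name}: similarity {score:.3f} -> {verdict}")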
In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP violated privacy laws when it used the technology without the public's knowledge. That same year, Toronto police admitted some of their officers had used facial recognition software without informing their chief. In both cases, the technology was supplied by U.S. company Clearview AI, whose database was composed of billions of images scraped from the internet without the consent of those whose images were used.
Last month, York and Peel police in Ontario said they had begun implementing facial recognition technology provided by multinational French company Idemia. In an interview, York police Const. Kevin Nebrija said the tools "help speed up investigations and to identify suspects sooner," adding that in terms of privacy, "nothing has changed because security cameras are all around."
WATCH | This news reporter is AI-generated. Should we be worried?

Caption: A 24-hour news channel startup based in southern California comes with a twist: all of the reporters and production are AI-generated. CBC’s Jean-François Bélanger explores what Channel 1 is promising and why some are concerned about what it could mean for the news industry.

Yet in neighbouring Quebec, Montreal Police Chief Fady Dagher says the force will not adopt such biometric identification tools without a debate on issues ranging from human rights to privacy.
"It's going to be something that is going to take a lot of discussion before we think about putting in place," Dagher said in a recent interview.
Nebrija stressed that the department consulted the Privacy Commissioner of Ontario for best practices, adding that the images police will acquire will be "obtained lawfully," either with the co-operation of security camera owners or by obtaining court orders for the images.

Court green light needed: expert

And although York police insist officers will seek judicial authority, Kate Robertson, a senior researcher at the University of Toronto's Citizen Lab, said Canadian police forces have a history of doing just the opposite.
Since the revelations about Toronto police using Clearview AI between 2019 and 2020, Robertson said she is "still not aware of any police service in Canada that is obtaining prior approval from a judge to use facial recognition technology in their investigations."
According to Robertson, getting the go-ahead from the court, usually in the form of a warrant, represents the "gold standard of privacy protection in criminal investigations." This ensures a facial recognition tool, when used, is appropriately balanced against the right to free expression, freedom of assembly and other rights enshrined in the Charter.
While the federal government doesn't have jurisdiction over provincial and municipal police forces, it can amend the Criminal Code to incorporate legal requirements for facial recognition software in the same way it updated the law to address voice recording technologies that could be used for surveillance.
In 2022, Canada's federal, provincial and territorial privacy commissioners called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, prohibiting mass surveillance and limiting how long images can be retained in databases.
Meanwhile, the federal Economic Development Department said Canadian law "could potentially" regulate corporate collection of personal information under the Personal Information Protection and Electronic Documents Act, or PIPEDA.
WATCH | Artificial intelligence could pose extinction-level threat to humans, expert warns:

Caption: A new report is warning the U.S. government that if artificial intelligence laboratories lose control of superhuman AI systems, it could pose an extinction-level threat to the human species. Gladstone AI CEO Jeremie Harris, who co-authored the report, joined Power & Politics to discuss the perils of rapidly advancing AI systems.

"If, for example, a police force, including the RCMP, were to contract out activities that use personal information to a private company conducting commercial activities, then these activities could potentially be regulated by PIPEDA, including services related to facial recognition technologies," the department said.
Quebec provincial police also have a contract with Idemia, but they wouldn't say exactly how they use the company's technology.
In an emailed statement, the force said its "automated face comparison system is not used to check the identity of individuals. This tool is used for criminal investigations and is limited to the data sheets of individuals who have been fingerprinted under the Identification of Criminals Act."
AI governance expert Ana Brandusescu said Ottawa and the country's police forces have not heeded the calls for better governance, transparency and accountability in the procurement of facial recognition technology.
"Law enforcement is not listening to academics, civil society experts, people with lived experience, people who are directly harmed," she said.