
Caution urged as Edmonton police explore facial recognition technology


'It fundamentally changes the nature of the relationship between citizens and law enforcement'

As Edmonton police explore the use of facial recognition, experts warn the technology could fundamentally change the relationship between law enforcement and citizens. (Thomas Peter/Reuters)

Alberta's privacy commissioner is urging the Edmonton Police Service to seek oversight as it explores the use of facial recognition technology. 

In a statement Monday, the commissioner's office strongly encouraged EPS to submit a privacy review to ensure any future facial recognition program complies with privacy law.

"Analytic technologies, such as facial recognition, raise significant concerns regarding privacy and security of personal information," said spokesperson Scott Sibbald in an email to CBC News.

The statement comes after EPS said it is looking into facial recognition technology, following the lead of Calgary and Toronto police.

"The intention will be to use the technology in response to existing criminal investigations, using a database of pictures previously obtained for a lawful purpose," such as mug shots, EPS spokesperson Cheryl Sheppard said in a written statement. 

In 2014, the Calgary Police Service became the first police force in Canada to publicly roll out a facial recognition program. Police said the technology can take a picture of a suspect and, within a matter of seconds, look for a match among more than 300,000 mug shots in a police database. 

The use of facial recognition technology came under renewed scrutiny after a New York Times investigation found controversial tech start-up Clearview AI had compiled billions of publicly available images into a facial recognition database and sold it to law enforcement agencies across the United States.

While Edmonton police have yet to secure a licensing agreement with a software company, Sheppard says they are not looking into Clearview AI. In the statement, Sheppard stressed police have not yet implemented a facial recognition program.

Real concerns, not dystopian speculation

While police forces say facial recognition can help solve crimes and save resources, some privacy lawyers and experts are alarmed by the potential for abuse.

"It fundamentally changes the nature of the relationship between citizens and law enforcement because it takes away even the possibility of anonymity in public spaces," said Brenda McPhail, director of the privacy, surveillance and technology project at the Canadian Civil Liberties Association.

Experts say facial recognition technology could take away the possibility of anonymity in public spaces. (Photo illustration/CBC)

McPhail gave the example of police obtaining footage during an investigation of a retail store robbery.

While police may be looking for one suspect, all the people captured on the store's camera could be catalogued in a police database using facial recognition technology. 

"It's not wild speculation or dystopian theory that these kinds of images could end up being subject to a police database. They could be legally obtained and they could be used," McPhail said. 

AI moving faster than law

Lawmakers and the courts are also failing to keep pace with artificial intelligence, said Calgary-based technology lawyer James Swanson. And without specific regulation, police could push the limits of existing privacy law and how it applies to facial recognition technology.

"I can tell you the technology is evolving extremely rapidly. Far faster than the law can keep up with," Swanson said.  "We'll have a large scale social experiment in how it all comes to task." 

Edmonton police said they would only use facial recognition software to search legally obtained images, but Canadian courts continue to grapple with what that means. 

Studies repeatedly show facial recognition algorithms are far more likely to misidentify black women than white men, raising concerns about false accusations. (Center on Privacy & Technology at Georgetown Law)

In 2018, an Ontario court ruled police lawfully used facial recognition technology to search a database of drivers' licences without a warrant. 

Months later, a man successfully argued in B.C. court that police violated his privacy by getting a copy of his passport photo from border services without a warrant.

"It's definitely a grey area," Swanson said. "The possibility for significant abuse is there, whether it's actually abused is a different question." 

Technology could perpetuate racial bias 

Even if Edmonton police used facial recognition technology only against a mug shot database, those databases reflect systemic racial biases in policing, McPhail said. Black and Indigenous people are often over-policed compared to the rest of the population, leading to higher rates of arrest. 

"If a facial recognition is continuously searching that group of faces, that bias is going to be perpetuated," McPhail said. 

Studies have repeatedly found racial bias in facial recognition technology. 

A recent U.S. government analysis, the largest of its kind, reviewed nearly 200 facial recognition algorithms and found they were 10 to 100 times more likely to falsely identify a black face than a white one. 

When the findings are applied to law enforcement, the study noted, misidentification could lead to false accusations against people of colour. 

Another study from Microsoft and MIT researchers found bias also extended to gender, with facial recognition software misidentifying black women at a far higher rate than white men. Black women were falsely identified roughly 34 per cent of the time, compared to under one per cent for white men.

The concerns have prompted several U.S. jurisdictions, including San Francisco and Oakland in California, to pass outright bans on police use of facial recognition technology.

McPhail said there needs to be a discussion about whether the public wants facial recognition technology before the police consider it. 

"Do we want to set this technology loose in our cities, in our communities, before we've got the correct regulations in place, before we've set up the safeguards we think are necessary to ensure rights are protected?"