RCMP's use of Clearview AI facial recognition technology under investigation

One day after the RCMP admitted to using controversial facial recognition technology, the federal Office of the Privacy Commissioner is opening an investigation into whether its use violates federal privacy law.

The RCMP has acknowledged using controversial facial recognition technology that has raised privacy concerns. (Photo illustration/CBC)

"In light of the RCMP's acknowledgement of its use of Clearview's technology, the OPC is launching an investigation under the Privacy Act," said a statement issued Friday by the Office of the Privacy Commissioner.

The RCMP put out a statement Thursday confirming it has used the technology in 15 child exploitation investigations over the past four months, resulting in the identification and rescue of two children.

The statement also said that "a few units in the RCMP" are using the controversial tech to "enhance criminal investigations," but offered no details about how widely, and where, it's being employed.

CBC News has asked for further information about the RCMP's use of Clearview AI, including where and how it was employed, but has yet to receive a response. 

"While the RCMP generally does not disclose specific tools and technologies used in the course of its investigations, in the interest of transparency, we can confirm that we recently started to use and explore Clearview AI's facial recognition technology in a limited capacity," Thursday's statement said.

"We are also aware of limited use of Clearview AI on a trial basis by a few units in the RCMP to determine its utility to enhance criminal investigations."

Other police forces using facial recognition too

Clearview AI's powerful technology can unearth items of personal information — including a person's name, phone number, address or occupation — based on nothing more than a photo.

Concerns about the software erupted after a New York Times investigation revealed the company had scraped more than three billion photos from public websites like Facebook and Instagram to build a database used by more than 600 law enforcement agencies in the U.S., Canada and elsewhere.

On Friday, the Chronicle Herald reported Halifax police have deployed the controversial software but are no longer using it.

Earlier this month, Toronto police said some of their officers have used Clearview AI — one month after denying it.

Calgary police say they regularly use facial recognition technology. Hamilton's police service said it has tested Clearview AI's system. The Ottawa Police Service tested a different system, NeoFace Reveal, last year, but said it no longer uses it. Edmonton police say they're considering using facial recognition technology.

The Ontario Provincial Police said it has used facial recognition technology, but hasn't specified the product. 

David Fraser, a privacy lawyer with McInnes Cooper, said police forces shouldn't be the ones deciding how to balance expediency in investigations with citizens' right to privacy.

"It seems that decision making about the use of intrusive technologies is happening entirely in the shadows, without any public oversight ..." he said.

"We have lines that need to be drawn and we really need to have a public and societal conversation about where those lines should be drawn."

Privacy Commissioner Daniel Therrien's office had already teamed up with several provincial privacy commissioners for a separate review of Clearview AI's practices.

House of Commons committee looking into AI

Fraser said he hopes Therrien's findings prompt a larger political discussion.

"I'm hopeful that not only will there be a verbal smackdown [from Therrien], and I certainly expect that there will be. I hope that that actually prompts policymakers and lawmakers to engage in an actual conversation about the processes that lead to the adoption of technology like this," he said.

NDP MP Charlie Angus has pushed for members of the House of Commons committee on access to information, privacy and ethics to scrutinize the effects facial recognition tools could have on society.

Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa, said strict parameters and a chain of command should be in place for facial recognition tools to prevent their abuse — and it's not clear if that's happening in Canada.

"You can imagine, for example, somebody who sees an attractive woman in a bar at a party. doesn't know who she is, snaps a photo and runs it through the system at work where there are no checks and balances, to find out who she is," she said.

"If you're going to adopt these tools, you need to have policies in place. You need checks and balances, you need to have transparency and you need to have a debate ... about when it is acceptable to use these technologies versus when it's not."

The RCMP said it will work with Therrien on guidelines for using facial recognition technology under Canadian law.

With files from the CBC's Peter Zimonjic, David Burke and The Canadian Press