Waterloo police used Clearview AI more than previously revealed

Waterloo regional police said the service used the controversial facial recognition technology Clearview AI more than the service has previously acknowledged.

Investigators from multiple units used the technology between November 2019 and February 2020

Waterloo Regional Police Service Chief Bryan Larkin apologized to the police services board Wednesday for not previously communicating the extent to which police had used Clearview AI. (Matthew Pierce/CBC)

Waterloo regional police Chief Bryan Larkin said Wednesday that he had initially been told the service either wasn't using the technology or that "we may have used it via a third party."

"Through further audit and further review I learned that that was not accurate and not correct," said Larkin.

In a statement released Wednesday, police said the service's cybercrime unit had acquired a free licence for Clearview AI at an international conference on child exploitation in November 2019.

Police members from multiple investigative units went on to use the software between Nov. 17, 2019, and Feb. 14, 2020, according to the statement.

On February 13, 2020, a police spokesperson had told CBC that the WRPS "does not currently have facial recognition software."

A few days later, Insp. Mark Crowell told CBC News the police service had used the technology through "provincial investigative networks," specifically the Toronto Police Service.

Larkin apologized to the police services board Wednesday for not previously communicating the extent to which police had used Clearview AI. 

"The information we shared was inaccurate, it was misinformed and for that I accept responsibility," he said.  

Charges laid

Clearview AI can reveal personal information — including a person's name, phone number, address or occupation — based on nothing more than a photo.

Concerns were raised about Clearview AI after a New York Times investigation earlier this year revealed the software had scraped more than three billion photos from public websites to create a database that was then used by hundreds of law enforcement agencies.

According to Waterloo regional police, the technology has been used by investigators with the following units:

  • Cybercrime and child exploitation.
  • Missing persons.
  • Criminal intelligence.

Police said investigators used the technology to identify both victims and suspects. According to Larkin, at least one such investigation has led to charges.

"We're still doing ongoing work to determine what the impact is and we will work with the Crown attorney to advise them of such," said Larkin, who added an audit of the service's use of Clearview AI is ongoing.

Facial recognition 'evolution of science': chief

Larkin said police need to develop a balanced approach to new technology, one that weighs privacy concerns against the power such tools may have to fight online crimes such as internet child exploitation.

"I believe that facial recognition is the evolution of science," Larkin said.

For now, all police members have been told to "cease all use" of facial recognition software until the service develops a policy around it, Larkin said.

An update on the policy is expected at the April police services board meeting.  

Board chair Karen Redman said Wednesday she appreciated that Larkin took responsibility for the situation.