RCMP admits to using Clearview AI in online child sex abuse cases

By News Staff

The RCMP has confirmed that it is using a controversial facial recognition technology to assist in online child sexual exploitation investigations.

In a statement released Thursday, the national police force says its National Child Exploitation Crime Centre (NCECC) has been using and evaluating Clearview AI’s software for the last four months.

“The Internet has changed the way child sexual exploitation offences are committed, investigated and prosecuted and Clearview AI is only one of many tools/techniques that are used in the identification of victims of online child sexual abuse,” the RCMP said.

The NCECC says the software is primarily used to help identify, locate and rescue children who have been or are victims of online sexual abuse. It added that the application has been used in 15 cases, resulting in the successful identification and rescue of two children.

The RCMP says it is also aware that Clearview AI is being used on a limited basis by a few units within the force to determine its effectiveness in other criminal investigations.

The RCMP says it plans to speak with Canada’s Privacy Commissioner to help develop guidelines and policies that conform to legislation and regulations.

“While we recognize that privacy is paramount and a reasonable expectation for Canadians, this must be balanced with the ability of law enforcement to conduct investigations and protect the safety and security of Canadians, including our most vulnerable,” said the RCMP.

Recently, Toronto's Chief of Police admitted that some of his officers had used Clearview AI, but he did not disclose how many times it had been used or for what purposes. Mark Saunders said that even though the application had been used within the force since October 2019, he did not become aware of its use until February 5, at which time he directed officers to stop using the controversial facial recognition technology.

Police forces in the Peel, Durham and Halton regions have also admitted having access to the software.

The Ontario Provincial Police say they have used facial recognition technology but would not specify whether Clearview AI was among the tools used.

Clearview AI recently revealed it was the victim of a data breach. According to a report in The Daily Beast, the company said an unauthorized intruder gained access to its list of customers and user accounts but that the intruder did not obtain any law-enforcement agencies’ search histories.

The New York Times reported back in January that Clearview AI had scraped 3 billion images from the internet, including from Facebook, in violation of the social media giant's terms of service.
