
Clearview AI's data collection "clear violation" of Canadians' privacy: commissioner

Shruti Shekar
·Telecom & Tech Reporter

Clearview AI’s collection of billions of images of people “represented mass surveillance,” Canada’s Office of the Privacy Commissioner says following an investigation, calling it a “clear violation of the privacy rights of Canadians.”

The commissioner’s office released the results of its investigation into Clearview AI’s technology Wednesday, which found the company had collected highly sensitive biometric information without the knowledge or consent of individuals.

“It also collected, used, and disclosed Canadians’ personal information for inappropriate purposes, which cannot be rendered appropriate via consent,” a press release said.

In January 2020, The New York Times reported that Clearview AI was working with hundreds of law enforcement agencies in the United States, including the FBI.

The company allows users to take a picture of a person, and if the photo matches a face in its three billion image database, it can potentially provide information like names, addresses and other details.

The three billion photos were harvested from Facebook, Venmo, YouTube, and other sites.

The RCMP initially would neither confirm nor deny it was using the technology, but later acknowledged that it had. Clearview AI announced in July 2020 that it would no longer provide facial recognition services in Canada.

When investigation findings were presented to the company, Clearview argued that the company does not have a “real and substantial connection” to Canada, that consent was not required because the information was made public, and that its technology is more beneficial than harmful.

The report noted concern that the company “did not recognize the mass collection of biometric information from billions of people, without express consent, violated the reasonable expectation of privacy of individuals and that the company was of the view that its business interests outweighed privacy rights.”

The investigation in Canada was conducted jointly by the federal privacy commissioner, the Commission d'accès à l'information du Québec, the Office of the Information and Privacy Commissioner for British Columbia and the Office of the Information and Privacy Commissioner of Alberta.

During a press conference, Daniel Therrien, the privacy commissioner, said this case is a clear example of why his office should have broad order-making powers, like recommending financial penalties.

That recommendation is part of the newly introduced privacy legislation, which was announced in November but is yet to be implemented.

However, Therrien said that even under the introduced bill he would not be able to impose a penalty in this investigation, because the violation is not subject to administrative penalties.

“We think that provision should be amended,” Therrien said.

Ann Cavoukian, former information and privacy commissioner of Ontario, agreed in an interview. She noted that there are many issues with Bill C-11, but said she hopes amendments will change how it is currently structured so that companies like Clearview AI face appropriate penalties.

“Clearview AI clearly violated people’s privacy by slurping something like three billion facial images off of social media. They did this without any notice, without any consent of the individuals involved. It’s appalling. Your facial image is the most sensitive image out there, and if it’s compromised, it can get you in such deep trouble,” she said.

Wednesday’s release indicated that Clearview’s technology “allowed law enforcement and commercial organizations to match photographs of unknown people against the company’s databank” of images. Therrien did not explain why those commercial entities were not named.

“We will think about this further. The use by commercial entities was extremely infrequent and stopped very shortly after the story broke in the media and we investigated. So currently we have no reason to think that commercial entities are using Clearview,” he said.

Cavoukian, however, strongly urged the commissioner to release the names of those companies that did use the technology.

“You need transparency associated with this case and you need to know what companies are buying people’s images and what do they intend to do with that,” she said. “Transparency is a valuable asset.”