The Office of the Australian Information Commissioner (OAIC) and the UK’s Information Commissioner’s Office (ICO) have launched an investigation into facial recognition software company Clearview AI.
The investigation will focus on the firm’s data scraping practices and use of biometric data under the UK’s 2018 Data Protection Act and the Australian Privacy Act of 1988. Data scraping is the automated extraction of data from websites by a computer program.
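To illustrate the idea, the sketch below shows data scraping in its simplest form: a program parsing a web page and harvesting every image URL it finds. It is a minimal, hypothetical example using Python's standard library, not a representation of Clearview AI's actual software; the class name and sample page are invented for illustration.

```python
# Minimal data-scraping sketch: collect image URLs from an HTML page.
# Illustrative only; the sample page and names are assumptions.
from html.parser import HTMLParser


class ImageScraper(HTMLParser):
    """Records the src attribute of every <img> tag encountered."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value:
                    self.image_urls.append(value)


# In practice the HTML would be fetched over HTTP and the scraper run across
# many sites; a static sample keeps this example self-contained.
sample_page = """
<html><body>
  <img src="https://example.com/photos/alice.jpg" alt="profile">
  <p>No image here.</p>
  <img src="https://example.com/photos/bob.png">
</body></html>
"""

scraper = ImageScraper()
scraper.feed(sample_page)
print(scraper.image_urls)
```

Run at scale across social media platforms, the same basic pattern can accumulate billions of images without any licensing step, which is the practice now under scrutiny.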
This comes just days after the company suspended operations in Canada following investigations by the privacy authorities of Canada, Alberta, British Columbia and Quebec into Clearview AI’s use of facial recognition technology.
Clearview AI describes itself as “a research tool used by law enforcement agencies to identify perpetrators and victims of crimes”. Although this can have benefits for law enforcement, the firm has faced criticism for the privacy concerns this raises and whether individuals have consented to their images being used in this way.
In Clearview’s facial recognition app, which is not available to the public, users can upload a photo of a person. The app then matches it against publicly available photos of the same person, scraped from across the internet. The company reportedly has a database of over 3 billion photos collected from websites and social media platforms.
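In broad terms, systems like this convert each face into a numeric embedding vector and return the database entries most similar to the embedding of the uploaded photo. The toy sketch below shows that matching step with made-up three-dimensional vectors and cosine similarity; real systems use learned embeddings with hundreds of dimensions, and nothing here reflects Clearview AI's actual implementation.

```python
# Toy face-matching sketch: find the database photo whose (made-up) embedding
# is most similar to the query embedding, using cosine similarity.
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Hypothetical database mapping source URLs to precomputed face embeddings.
database = {
    "https://example.com/photo1.jpg": [0.9, 0.1, 0.3],
    "https://example.com/photo2.jpg": [0.2, 0.8, 0.5],
    "https://example.com/photo3.jpg": [0.1, 0.9, 0.4],
}


def best_match(query_embedding):
    """Return the URL whose stored embedding is most similar to the query."""
    return max(database, key=lambda url: cosine_similarity(query_embedding, database[url]))


# Embedding of the uploaded photo (made up to match photo3 exactly).
query = [0.1, 0.9, 0.4]
print(best_match(query))
```

The privacy concern follows directly from this design: once a face is in the database, any new photo of that person can be linked back to the scraped sources.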
Tim Mackey, principal security strategist at the Synopsys Cybersecurity Research Centre, said that the Clearview AI investigation reflects a wider backlash against facial recognition.
“It really hasn’t been a good few months for facial recognition companies. Starting with the revelation of a data breach at Clearview AI, jurisdictions around the world have put in place moratoriums on the use of facial recognition technologies by law enforcement,” he said.
Facial recognition requires large datasets of images to train its algorithms. The greater the volume and variation of images, the more accurate the software will be.
Clearview AI has got round this by scraping billions of pictures from websites such as Facebook and Twitter, which means it can avoid obtaining image licences in the same way a photographer would need to obtain permission from the subject of their picture, said Mackey.
“Obtaining the legal rights for such a large dataset would be expensive, and it’s asserted that Clearview AI bypassed image licenses and simply scraped the data from websites,” he explained. “This process would reduce the cost of image acquisition, but could also have allowed the Clearview AI team to identify weaknesses in social media applications.”
Clearview AI faces mounting pressure
YouTube, Twitter and Google have sent cease and desist letters to Clearview AI, with Facebook and Venmo objecting to its data scraping practices. The American Civil Liberties Union is also suing the company, alleging that it violated the Illinois Biometric Information Privacy Act.
In February the company suffered a data breach in which its client list was exposed, revealing which organisations used its facial recognition service.
In May, Buzzfeed reported that Clearview AI was ending its contracts with non-law enforcement organisations and private companies.
Last month, several companies in the facial recognition market, most notably Amazon, IBM and Microsoft, announced that they would no longer provide facial recognition technology to law enforcement in the wake of the Black Lives Matter protests.
The OAIC and ICO said they will engage with other data protection authorities who have raised similar concerns, where appropriate.
Mackey predicts that the firm could exit certain markets as a result of the increased scrutiny:
“It will be interesting to see how Clearview AI responds to the ICO’s investigation and what is discovered. Eventually, I predict it will exit the UK and Australian markets as it’s done in Canada.
“Of note to consumers, it’s important to understand what your rights are when you share photos on the internet or within apps. For example, that cool image aging software could in reality be a simple way to obtain your image and profile data for use in a training data set for facial recognition software.”
Clearview AI is yet to comment on the investigation.