ICO “deeply concerned” by King’s Cross facial recognition

By Robert Scammell

The head of the UK’s data regulator, Elizabeth Denham, has said she is “deeply concerned” by the increasing use of facial recognition in public spaces.

The information commissioner’s statement follows controversy around the use of facial recognition software at the King’s Cross Central development area in London.

Thousands of people pass through the 67-acre space per day. Argent, the property developer for the area, told the Financial Times – which first broke the news – that its cameras “use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public”.

The Information Commissioner’s Office (ICO) said it has launched an investigation into the use of facial recognition software at King’s Cross.

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” said Denham. “That is especially the case if it is done without people’s knowledge or understanding.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector.

“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

ICO will inspect King’s Cross facial recognition system

She added that facial recognition technology is a “priority area for the ICO and when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights”.

Denham added that the ICO will require “detailed information” from those employing the technology about how they are using it.

The ICO will also go to King’s Cross to inspect the system and how it is being used “to assess whether or not it complies with data protection law”.

Biometric data is classed as a special category of personal data under GDPR, which affords citizens greater protection.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way,” said Denham. “They must have documented how and why they believe their use of the technology is legal, proportionate and justified.

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”
