The Information Commissioner’s Office (ICO) has warned that police forces need to “slow down” on the deployment of facial recognition technology due to its potential for “widespread invasiveness”.
Facial recognition, in which algorithms are taught to recognise and differentiate between faces based on a database of images, has become a contentious issue, especially when it comes to law enforcement.
The technology has been trialled by Leicestershire Police, South Wales Police, and the Metropolitan Police since 2015, and has been used at events including Notting Hill Carnival and the 2017 Champions League Final in Cardiff.
Earlier this year, the developers behind the King’s Cross site came under scrutiny after the Financial Times reported that facial recognition was being used around the station. The project was scrapped in September.
Campaign group Big Brother Watch has been strongly critical of the use of live facial recognition (LFR) by police due to concerns about “the impact of automated facial recognition on individuals’ rights to a private life and freedom of expression, and the risk of discriminatory impact”, calling on UK authorities to stop using automated facial recognition software.
The ICO’s facial recognition stance
In a blog post, UK Information Commissioner Elizabeth Denham said that the current laws and practices related to LFR technology “will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents”.
The non-departmental public body points out that there is currently a lack of regulation for LFR, and is therefore calling for the introduction of a “statutory and binding code of practice” so that it can be used in public spaces by police forces and other organisations in a way that does not undermine public confidence.
It also recommends that more is done to investigate and eliminate bias from the algorithms that underpin LFR, “particularly associated with ethnicity”.
According to research conducted by MIT Media Lab and Google’s Ethical Artificial Intelligence Team, facial recognition has trouble differentiating between people with darker skin tones, particularly women, because the algorithms are trained on image datasets containing a disproportionate number of white men.
Earlier this year, Ed Bridges took South Wales Police to court after he was captured by automated facial recognition on two occasions in Cardiff, arguing that his right to privacy had been breached. However, the High Court ruled that the use of facial recognition by South Wales Police was lawful.
In light of this judgement, the ICO believes that the ruling should not be seen as “blanket authorisation” for the use of LFR by police forces in all circumstances; it should be deployed only where it can be shown to be strictly “necessary, balanced and effective”.
Although it recognises the need for police forces to explore new technologies, the ICO calls for this to be done in a way that upholds data protection law and does not infringe on the public’s privacy, warning that “moving too quickly to deploy technologies” can be invasive and damaging to trust.
“We must absolutely ensure innovation is not being stifled or stopped”
However, while many agree that the technology must be thoroughly assessed and deployed with caution, some in the technology industry warn that overly restrictive regulation could stifle innovation.
Jason Tooley, chief revenue officer at Veridium, believes that the benefits of the technology could be harnessed to improve policing:
“There is increasing concern in the community that regulators such as the ICO will take too much of a heavy-handed approach to regulating the technology, and we must absolutely ensure innovation is not being stifled or stopped. It’s in the public interest for police forces to have access to innovative technology such as biometrics in order to deliver better services and safeguard our streets.”
However, he explains that greater transparency over the use of biometric technology is essential:
“It is imperative police forces take a strategic approach as they trial biometric technologies, without giving precedence to a single biometric approach. A strategic approach, using other biometric techniques that have greater levels of acceptance such as digital fingerprinting, will ensure a higher level of public consent due to its maturity as an identity verification technique. Considering the rapid rate of innovation in the field, adopting an open biometric approach that enables the police to use the right biometric technique for the right scenario, taking into account varying levels of maturity, will see the benefits associated with digital policing accelerated.
“If the police adopt a transparent policy on how biometric data is interpreted, stored and used, the public’s data privacy concerns can be greatly alleviated, which will in turn trigger consent and wider acceptance. Managing expectations around biometrics and how the technology will be used is crucial, especially in surveillance use cases. Concerns over data privacy can also be eliminated if sensitive biometric data is stored in the correct way, using sophisticated encryption methods such as sharding or visual cryptography, which renders the sensitive data unusable to a hacker.”
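The sharding approach Tooley mentions splits sensitive data into multiple fragments so that no single fragment is useful to an attacker on its own. As a minimal sketch of the general idea (not Veridium’s actual implementation), XOR-based secret sharing splits a biometric template into n shards, all of which are required to reconstruct the original; the function names here are illustrative:

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def shard(template: bytes, n: int = 3) -> list[bytes]:
    """Split a template into n shards via XOR secret sharing.
    Any n-1 shards reveal nothing about the original data."""
    # n-1 shards are pure random noise...
    shards = [secrets.token_bytes(len(template)) for _ in range(n - 1)]
    # ...and the final shard is the template XORed with all of them,
    # so XORing every shard together recovers the template.
    shards.append(reduce(xor_bytes, shards, template))
    return shards

def reconstruct(shards: list[bytes]) -> bytes:
    """Recombine all shards to recover the original template."""
    return reduce(xor_bytes, shards)

template = b"example-biometric-template"
shards = shard(template, n=3)
assert reconstruct(shards) == template
assert all(s != template for s in shards)  # no shard leaks the original
```

Because each shard is indistinguishable from random noise without the others, the shards can be stored in separate locations, so a breach of any single store yields nothing usable.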