The High Court has ruled that the use of facial recognition by the South Wales Police was “consistent with the requirements of the Human Rights Act and the data protection legislation” in a landmark ruling.
This comes after former Liberal Democrat councillor Ed Bridges took legal action against South Wales Police after his face was captured by automated facial recognition (AFR) on two separate occasions.
In a judicial review in May, Bridges argued that the technology breaches his right to privacy, as well as equality laws and data protection laws, after his image was captured while shopping in Cardiff, and again while attending a peaceful protest. Bridges said that the incidents caused him “distress”.
First global legal challenge of facial recognition sees practice backed
This is the first legal challenge anywhere in the world to facial recognition technology, in which a biometric scan of a person’s face is compared against a database of suspects or vulnerable people.
Lord Justice Haddon-Cave and Mr Justice Swift ruled that facial recognition had been “deployed for a limited time, and for specific and limited purposes” and that the “processing of personal data was lawful and met the conditions set out in the legislation”.
The court also noted that “unless the image of a member of the public matched a person on the watchlist, all data and personal data relating to it was deleted immediately after it had been processed.”
In response to the ruling, Bridges said:
“South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
South Wales Police is one of the forces exploring the use of the technology, and has been conducting trials since 2017. Last month, the force announced the start of a three-month trial in which 50 officers have access to a facial recognition app.
According to The Guardian, the Metropolitan Police and Leicestershire Police have also deployed facial recognition technology since 2015.
Facial recognition: Taking data without consent?
Human rights organisation Liberty, which represented Bridges, argues that AFR technology captures images of individuals without their consent, which is “akin to taking their DNA or fingerprints without their knowledge or consent”.
The organisation has publicly criticised the use of facial recognition by UK police forces as infringing the public’s right to privacy. Liberty has said that Bridges will appeal the court’s decision.
The debate surrounding the ethics of facial recognition technology has been recently reignited after the Financial Times reported that facial recognition was used in King’s Cross Station between May 2016 and March 2018.
Opponents argue that the technology disproportionately misidentifies women and people of colour, and that there is a lack of legal guidelines for its deployment. However, today’s verdict ruled that “the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate”.
The law, facial recognition and the future
The case highlights the importance of ensuring laws keep up with technological developments. According to The Telegraph, at the start of the ruling Lord Justice Haddon-Cave said that “The algorithms of the law must keep pace with new and emerging technologies.”
Commenting on the ruling, Jon Baines, data protection advisor at Mishcon de Reya, said that there will likely be more legal challenges in the future:
“Although the High Court has resoundingly dismissed this claim against South Wales Police, it will not be the end to legal challenges to the developing use of Live Facial Recognition systems. As the technology develops the potential for intrusive misuse increases – and it is important to note that police powers to use this sort of surveillance are much wider than the powers available to other organisations.
“Additionally, the Information Commissioner and her team continue to investigate the use of the technology and have enforcement powers which enable them to audit and fine those who misuse it.”
In a statement, the Information Commissioner’s Office said that it will take today’s ruling into account when finalising its guidance on the use of facial recognition by police forces:
“We will now consider the court’s findings in finalising our recommendations and guidance to police forces about how to plan, authorise and deploy any future LFR systems. In the meantime, any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”