October 25, 2021

Call me irresponsible: London’s surveillance systems

By GlobalData Thematic Research

Surveillance systems are crucial features of smart cities. However, they are potentially controversial if not adequately justified and responsibly deployed. Recent announcements from the Mayor of London’s office suggest that London has not yet found this balance.

In August 2021, the London Metropolitan Police announced that it would integrate retrospective facial recognition technology (RFR) into its surveillance systems as part of a GBP 3 million contract with Northgate Public Services. RFR compares images of individuals captured on recorded footage against a database of existing images.

One month later, London Mayor Sadiq Khan published an Emerging Technology Charter for London. The charter set out the standards that companies are encouraged (though not legally obligated) to adopt when deploying data-enabled smart city technologies in the public realm in London. The charter’s four key principles are: be open, be responsible with people’s data, respect diversity, and be sustainable.

Surveillance systems fail to meet key principles

The expansion of RFR surveillance systems in London arguably fails to meet three of these four principles and raises questions about how responsible a smart city London really is.

The UK does not yet have an adequate regulatory framework for responsibly integrating facial recognition into surveillance systems. In August 2020, the UK Court of Appeal ruled that South Wales Police’s use of live facial recognition (LFR) had breached an individual’s right to privacy under the Human Rights Act 1998. Furthermore, a March 2021 report by Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) identified the police’s use of LFR as an “area for improvement” and explicitly recommended a “robust framework” for the technology’s use that would alleviate public concerns. However, such a framework has yet to appear.

The same HMICFRS report claimed that RFR was “less controversial” than LFR. However, this does not mean RFR is uncontroversial. Both the European Data Protection Board and the European Data Protection Supervisor have called for a ban on the use of any facial recognition-based surveillance systems. Moreover, the Metropolitan Police’s purchase of RFR technology was approved before the London Policing Ethics Panel could scrutinise its proposed use and offer advice. Rushing through the purchase of controversial surveillance technology cannot be considered open or responsible.

Facial recognition problems

There are notorious examples of facial recognition systems discriminating against minorities. A facial recognition system is only as accurate as the images it is trained on: training datasets that under-represent certain groups produce biased systems. For instance, in 2018, an MIT researcher tested the facial recognition systems of Microsoft, IBM, and Megvii and found an error rate of less than 1% for light-skinned men but just under 35% for dark-skinned women. In 2020, a Black man in the US was wrongfully arrested when the Detroit Police’s facial recognition system incorrectly matched his driver’s license photo with surveillance footage of a shoplifter.

Facial recognition technology is seeping into city surveillance systems around the world. According to GlobalData’s thematic report on smart cities, at least 75 nations use AI tools such as computer vision to monitor citizens’ activities. These technologies can only be used responsibly when appropriately regulated. Law enforcement bodies like the Met must not be permitted to circumvent regulators.