A series of freedom of information (FOI) requests has found that UK police forces are adopting artificial intelligence (AI) technologies, in particular facial recognition and predictive policing, largely without public consultation, despite previous assurances that such consultation would take place.
A series of FOI requests by the RSA, the details of which are published in a report today, has found that while two police forces are now using facial recognition and four are using predictive policing, only one force has confirmed that it has conducted public consultations on the matter.
Use of facial recognition technologies by UK police
Facial recognition, which is being used by the Met Police and the South Wales Police, involves capturing images of people in public spaces and running them against the Police National Database, which contains millions of images of suspects.
However, only South Wales Police has so far conducted a public consultation on its use; the Met Police, which rolled out the technology in early 2020 after trials last year, has not yet conducted any public engagement on its use, despite planning to do so.
The technology has attracted criticism due to its potentially high error rate and risk of bias, and has only limited support from the public. A report by the Ada Lovelace Institute published in September 2019 found that only 49% of UK citizens support its use in everyday policing, and 46% want the right to opt-out of the technology.
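Conceptually, the matching step at the heart of such systems compares a numerical "embedding" of a captured face against precomputed embeddings of database images, accepting the closest entry only if it falls under a distance threshold. The sketch below is purely illustrative, using made-up three-number vectors and hypothetical names rather than real face data; it shows how the choice of threshold drives the false-positive rate that critics highlight.

```python
import math

def euclidean(a, b):
    """Distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical database: identifier -> precomputed face embedding.
database = {
    "suspect_a": [0.1, 0.9, 0.3],
    "suspect_b": [0.8, 0.2, 0.5],
}

def best_match(probe, db, threshold=0.5):
    """Return the closest database entry, or None if nothing is close enough.

    The threshold matters: set it too loosely and the system produces
    false positives, which is the error-rate concern raised above.
    """
    name, dist = min(((n, euclidean(probe, e)) for n, e in db.items()),
                     key=lambda pair: pair[1])
    return name if dist <= threshold else None

print(best_match([0.15, 0.85, 0.35], database))  # close to suspect_a
print(best_match([0.5, 0.5, 0.9], database))     # no confident match: None
```

Real deployments derive embeddings from deep neural networks rather than toy vectors, but the accept/reject decision still reduces to a comparison like this one.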
Predictive policing in UK police forces
Predictive policing, meanwhile, typically applies analytical techniques to data police forces already hold to identify individuals or locations that are at increased risk of criminal activity, the details of which are then used to allocate resources.
Durham Constabulary, Surrey Police and West Yorkshire Police are all using the technology in this way, while West Midlands Police uses the crime mapping software MapInfo, which the RSA says “bears strong similarities to predictive policing systems”.
Kent Police is using a similar technology, but for case assessment, determining which cases are most likely to result in a successful conviction.
None of the forces were able to confirm that any public engagement had been conducted, although the RSA did note that previous research suggested some form of engagement may have been conducted by Durham Constabulary.
While predictive policing has not been quite as divisive as facial recognition, there are considerable concerns around its use, particularly relating to its potential for racial and gender biases.
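At its simplest, the kind of analysis described above ranks locations by their share of past recorded incidents and allocates resources accordingly. The sketch below uses hypothetical incident records, not any real force's data, but it makes the bias concern concrete: the scores are nothing more than historic data fed forward, so skew in past recording becomes skew in future deployment.

```python
from collections import Counter

# Hypothetical historical incident records: (location, offence) pairs.
incidents = [
    ("high_street", "theft"), ("high_street", "theft"),
    ("station_road", "assault"), ("high_street", "burglary"),
    ("park_lane", "theft"),
]

def risk_scores(records):
    """Score each location by its share of past recorded incidents.

    This is the crux of the criticism: if some areas were historically
    over-policed, they generate more records, score higher, and attract
    yet more patrols, a feedback loop that can entrench bias.
    """
    counts = Counter(loc for loc, _ in records)
    total = sum(counts.values())
    return {loc: n / total for loc, n in counts.most_common()}

print(risk_scores(incidents))
# high_street accounts for 3 of 5 incidents, so it tops the ranking
```

Operational systems use far richer models than raw counts, but they share the same underlying dependence on historic records.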
Police AI use amid the coronavirus
There are concerns that the coronavirus outbreak has exacerbated the lack of public consultation, enabling police to deploy AI technologies under the radar, without adequate transparency about their shortcomings.
“Innovation is exciting and welcome, but there are causes for concern in the lack of public engagement that has come with these technologies. Racial and gender biases can be exacerbated by technologies as they are based on historic data: we need to talk about that,” said Asheem Singh, head of the RSA’s Tech and Society programme.
“Our findings indicate a lack of transparency and input from the public on how these new technologies are being used, which in turn undermines the principle of policing-by-consent. It’s fine to cut costs but not at the expense of the improvements forces have made in their relations with BME communities.”
The RSA is calling on police forces to step up consultation efforts to ensure such technologies are deployed transparently.
“Law enforcement should work with civil society groups to provide proper consultation around how AI and ADS is being used,” said Singh.
“This has implications beyond policing. As lockdown begins to ease from today, we need to be sure that new tech is being deployed with all the public’s best interests in mind.
“We have models for deliberating and discussing these complex technological challenges. We want to ensure government has the tools to do its job – but that means ensuring that those tools are beyond reproach and consented to and trusted by all.”