ICO Investigation Into Police Use of Facial Recognition Technology

ICO head Elizabeth Denham is reported to have launched a formal investigation into how police forces use facial recognition technology (FRT), following high failure rates, misidentifications, and concerns about legality, bias, and privacy.

Concerns Expressed In Blog Post In May

In a blog post on the ICO website back in May, Elizabeth Denham expressed several concerns about how FRT was being operated and managed. For example, although she acknowledged that there may be significant public safety benefits from using FRT, she highlighted concerns about:

  • A possible lack of transparency in the police’s use of FRT, and the real risk that the public safety benefits of FRT will not be realised if public trust is not addressed.
  • The absence of national-level co-ordination in assessing the privacy risks, and of a comprehensive governance framework to oversee FRT deployment. This has since been addressed to an extent by an oversight panel, and by the appointment of a National Police Chiefs’ Council (NPCC) lead for the governance of the use of FRT in public spaces.
  • The use and retention of images captured using FRT.
  • The need for clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem that it aims to address, and that it is no more intrusive than other methods.

Commissioner Denham said that legal action would be taken if the Home Office did not address her concerns.

Notting Hill Carnival & Football Events in South Wales

Back in May 2017, the South Wales and Gwent Police forces announced that they would be running a trial of ‘real-time’ facial recognition technology on Champions League final day in Cardiff. In June, the trial of FRT at the final was criticised for costing £177,000 while resulting in just one arrest, of a local man, and that arrest was unconnected to the event.

Also, after trials of FRT at the 2016 and 2017 Notting Hill Carnivals, the police faced criticism that the technology was ineffective, racially discriminatory, and confused men with women.

Research

Recent research by Cardiff University, which examined the use of the technology at a number of sporting and entertainment events in Cardiff over more than a year, including the UEFA Champions League final and the Autumn Rugby Internationals, found that for 68% of submissions made by police officers in the ‘Identify’ mode, the image quality was too low for the system to work. The research also found that the ‘Locate’ mode of the FRT system failed to correctly identify a person of interest 76% of the time.

What Does This Mean For Your Business?

Businesses use CCTV for monitoring and security purposes, and most businesses are aware of the privacy and legal compliance (GDPR) aspects of using such systems, and of how and where the images are managed and stored.

As a society, we are also used to being under surveillance by CCTV systems, which can have real value in helping to deter criminal activity, locate and catch perpetrators, and provide evidence for arrests and trials. It is also relatively common for CCTV systems to fail to provide good-quality images and/or to be ineffective at clearly identifying persons and events.

With the much more advanced facial recognition technology used by police, e.g. at public events, there does appear to be some evidence that it has not yet achieved the effectiveness that was hoped for, may not have justified the costs, and that concerns about public privacy may be valid, to the point that the ICO deems it necessary to launch a formal and ongoing investigation.