London Police Facial Recognition Trial

It has been reported that the Metropolitan Police are conducting a two-day trial of a facial recognition system in Soho, Piccadilly Circus and Leicester Square in the run-up to Christmas, in a bid to identify people among the Christmas shoppers who are wanted by the police or the courts.

Overt

Far from being deployed secretly, the system is reported to have been publicly announced by the Metropolitan Police using knee-height signs on pavements leading up to the surveillance areas, along with A4 posters on lamp posts and leaflets handed out to members of the public by uniformed officers.

The actual surveillance, using the facial recognition link-up to the police database of wanted offenders, is reported to have been carried out (on Monday and Tuesday) from a green van with cameras mounted on top. It has also been reported that for this London trial the Metropolitan Police will have been studying the crowds for 8 hours per day over the two-day period, specifically using a target list of 1,600 wanted people in the hope that crime and violence can be tackled more effectively.

Criticism

Criticism from privacy and freedom campaigners such as Big Brother Watch and Liberty has focused on mixed messages from the police about how those who turn away from the van because they don't want to be scanned will be treated. For example, it has been claimed that some officers have said that turning away will be treated as a trigger for suspicion, whereas a Metropolitan Police press release has stated that those who decline to be scanned (as is their right) during the deployment will not be viewed as suspicious by police officers.

Concern has also been expressed by Big Brother Watch that, although the police may believe that the deployment of the system is overt and well publicised, the signs and advertisements already prevalent in the busy central London areas where it is being deployed could mean that people simply do not notice it, thereby allowing the police to blur the line between overt and covert policing. It has also been pointed out by privacy groups that the deployment involves an unmarked van and plainclothes officers, which are normally associated with covert activity.

Doesn’t Work?

Big Brother Watch and Liberty are currently taking legal action against the use of live facial recognition in South Wales (the site of previous trials) and London, and ICO head Elizabeth Denham is reported to have launched a formal investigation into how police forces use facial recognition technology (FRT), following high failure rates, misidentifications and concerns about legality, bias, and privacy.

Serious questions have been raised about how effective current facial recognition systems are. For example, research by Cardiff University, which examined the use of the technology across a number of sporting and entertainment events in Cardiff over more than a year, including the UEFA Champions League Final and the Autumn Rugby Internationals, found that for 68% of submissions made by police officers in the 'Identify' mode, the image quality was too low for the system to work. The research also found that the 'Locate' mode of the FRT system could not correctly identify a person of interest 76% of the time.

Google Not Convinced

Even Google Cloud has recently announced that it won't be selling general-purpose AI-driven facial recognition technology until it is sure that any concerns over data protection and privacy have been addressed in law, and that the software is accurate.

Fooled With A Printed 3D Head!

The vulnerability of facial recognition software to errors and inaccuracy has been further exposed by Thomas Brewster, a journalist from Forbes, who claimed that he was able to fool the facial recognition on four Android phones using a 3D-printed model of his own head!

What Does This Mean For Your Business?

For the retail businesses in the physical area of the trial, anything that may deter criminal activities such as theft and violence, and that may also help catch known criminals, is likely to be a good thing.

Most businesses and members of the public would probably agree that CCTV systems have real value in helping to deter criminal activity, locating and catching perpetrators, and providing evidence for arrests and trials. There are, however, several concerns, particularly among freedom and privacy groups, about just how facial recognition systems are being and will be used as part of policing e.g. overt or covert use, issues of consent, possible wrongful arrest due to system inaccuracies, and the widening of the scope of its purpose beyond the police's stated aims.

Issues of trust where our personal data is concerned are still a problem, as are worries about a 'big brother' situation for many people, although the police, in this case, have been clear that this is a limited trial, conducted as overtly as possible with the support of posters and leaflets to make sure the public is informed.