Live facial recognition technology should not be deployed by UK law enforcement under current circumstances, and the retention of millions of custody images would not hold up under another legal challenge, MPs have been told.
In a written submission to a parliamentary Science and Technology Committee inquiry about the use of automatic facial recognition (AFR) by police forces, deputy information commissioner Steve Wood said a “priority investigation” has been opened into the use of AFR by law enforcement in public spaces.
“This will include considering the legal basis, the necessity, proportionality and justification for this intrusive processing,” he said.
Wood highlighted the findings of a previous report from the committee in May 2018, which said: “Facial recognition technology should not be deployed, beyond the current pilots, until the current concerns over the technology’s effectiveness and potential bias have been fully resolved.
“The [Information] Commissioner is concerned that this has not been fully addressed, and it is not yet clear how the oversight board will address the issues.”
During a select committee meeting on 19 March, the commissioner for the retention and use of biometric material, Paul Wiles, noted the lack of ministerial oversight regarding the use of AFR.
“Biometric, by definition, is going to be extremely intrusive of individual privacy, and therefore liberty,” he said.
Wiles added that, given the potential of AFR systems to increase mass surveillance of the population, “I would have thought one might have seen a more important public debate and some ministerial leadership about the direction in which that ought to go.”
In June 2018, the Home Office released its Biometrics Strategy, which established the Biometrics & Forensics Ethics Group – the “oversight board” to which Wood referred – as a statutory non-departmental public body focused on providing ethical guidance on the use and retention of biometric material.
However, according to Wiles, the strategy itself was a “disappointing document”, adding that the final version published by the Home Office was very different to the original draft he saw.
“It starts off with what I would describe as a very good prologue for a strategy, and it then simply becomes a list of some of the things – and by no means all of the things – the Home Office is doing in terms of biometrics,” he said. “I thought that was disappointing and a missed opportunity to lay out a strategy.”
Unconvicted custody images concerns
Wiles confirmed that the Police National Database (PND) currently holds 23 million images taken while people were in custody, whether or not they were subsequently convicted. The “watch lists” against which police AFR systems operate are drawn from these custody images, which are used to identify people in real time.
“At present, there are 10 million searchable facial images, but – just to be clear – that doesn’t necessarily mean 10 million people because some will be duplicates,” said Wiles.
However, Wood said that the Information Commissioner is concerned by the ongoing retention of custody images of individuals who have never been convicted of a crime.
“There are potentially thousands of custody images being held with no clear basis in law or justification for the ongoing retention,” he said.
A 2012 High Court ruling found the retention of custody images by the Metropolitan Police to be unlawful, because unconvicted people’s information was being kept in the same way as that of people who were ultimately convicted. It also deemed the minimum six-year retention period to be disproportionate.
Wiles suggested that linking the PND to the Police National Computer (PNC), which holds the UK’s conviction records, could help the situation.
Until that happens, which Wiles said could take at least three years, custody images of unconvicted people cannot be separated from those of the convicted.
Wiles added that during the six-year retention period, both convicted and unconvicted people could apply to have their images removed, with the presumption being that the police would do this if there was no good reason not to.
“We have visited this year just over half the police forces in England and Wales. There was very poor understanding of that six-year review period and little evidence it was being carried out, and there were very few applications being made to chief officers, and not all of those have been accepted,” he said.
“I’m not sure that the legal case [for retention] is strong enough, and I’m not sure that it would withstand a further court challenge.”
Privacy worries over Met Police’s AFR deployment
While a number of police forces have been trialling AFR, the Metropolitan Police recently concluded its tenth and final deployment of the technology on 14 February.
Computer Weekly was present for the penultimate deployment in Romford on 31 January, where eight people were arrested – three as a direct result of the AFR identifying individuals wanted for violent offences, and another five as part of the Met’s wider operation in the area.
Two men were also stopped for covering their faces near the deployment van, one of whom received a £90 fine for a public order offence.
The deployments have received criticism from privacy activists and campaign groups, with both Liberty and Big Brother Watch present on the day.
Hannah Couchman, policy and campaigns officer at Liberty, said at the time: “Our key concern is that this is an enormous threat to your privacy. Every single person that walks past that facial recognition van will have a scan taken of their face, which is deeply sensitive biometric information.
“It will happen without their consent and probably without their knowledge, because there is such a lack of information available to them.”
Is it right?