Police use of automated facial recognition technology to search for people in crowds is lawful, the high court in Cardiff has ruled.

Two judges concluded that although the mass surveillance system interferes with the privacy rights of those scanned by security cameras, it is not unlawful.

The legal challenge was brought by Ed Bridges, a former Liberal Democrat councillor from Cardiff, who noticed the cameras when he went out to buy a lunchtime sandwich. He was supported by the human rights organisation Liberty. He plans to appeal against the judgment.

Bridges said he was distressed by police use of the technology, which he believes captured his image while he was out shopping and later at a peaceful protest against the arms trade.

During the three-day hearing in May, his lawyers alleged the surveillance operation breached data protection and equality laws.

The judges found that although automated facial recognition (AFR) amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.

Dismissing the challenge, Lord Justice Haddon-Cave, sitting with Mr Justice Swift, said: “We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.”

Responding to the judgment, Megan Goulding, a Liberty lawyer, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms. Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

Bridges said: “South Wales police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”

Facial recognition technology maps faces in a crowd and compares them to a watch list of images, which can include suspects, missing people and persons of interest to the police.
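
At its core, that comparison is a nearest-neighbour search over face embeddings. The sketch below is a minimal illustration, not South Wales police's actual pipeline: it assumes a hypothetical embedding model has already reduced each face to a fixed-length vector (random vectors stand in here), and it flags a match only when similarity to a watchlist entry clears a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in watchlist: in a real system each entry would be an embedding
# computed from a reference photograph of a suspect or missing person.
watchlist = {
    "suspect_a": rng.standard_normal(128),
    "missing_person_b": rng.standard_normal(128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face: np.ndarray, threshold: float = 0.6):
    """Return the best watchlist match above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```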

The cameras scan faces in large crowds at public places such as streets, shopping centres, football matches and music events, including the Notting Hill carnival.

Three UK forces have used facial recognition in public spaces since June 2015: the Met, Leicestershire and South Wales police.

Lawyers for South Wales police told the hearing facial recognition cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.

The technology was likened to police use of DNA. Those not on a watch list would not have their data stored after being scanned by AFR cameras, the court was told.
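
On that account, a face that matches no watchlist entry yields nothing to keep. Continuing the hypothetical sketch above, the discard path might look like this:

```python
# A scanned face that matches no watchlist entry returns None, and,
# on the police account given in court, its biometric data is not retained.
scanned_face = rng.standard_normal(128)  # stand-in for a face from a camera frame
if match_against_watchlist(scanned_face) is None:
    del scanned_face  # discard immediately; nothing is stored
```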

Fiona Barton QC, a barrister at 5 Essex Court chambers who specialises in police use of technology, said: “This case should not be taken as a green light to go ahead with the use of AFR in all and any circumstances: it was decided on specific facts, within a specific legal framework applicable to certain public authorities, and by reference to South Wales police’s policy and other documents.”

The chief constable of South Wales police, Matt Jukes, said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern. So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.

“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”

A spokeswoman for the Information Commissioner’s Office, which intervened in the case, said: “We welcome the court’s finding that the police use of live facial recognition systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”

A survey of more than 4,000 adults released on Wednesday by the Ada Lovelace Institute found that a majority (55%) want the government to impose restrictions on police use of facial recognition technology, but that nearly half (49%) support its use in day-to-day policing, assuming appropriate safeguards are in place.

Carly Kind, the director of the Ada Lovelace Institute, said: “The fact that the deployment of a new technology, with which a significant proportion of the public are not comfortable, can be deemed technically compliant underlines the need for an informed public discourse on what new restrictions and safeguards are needed. Facial recognition technology may be lawful, but that does not mean its use is ethical, especially outside of the very limited circumstances examined by the court in this case.”

