TechReaderDaily.com

Britain's Live Facial Recognition: 173 Arrests, Policy Vapour

A Met Police pilot in Croydon achieved 173 arrests and a 10.5 percent crime drop, yet despite a High Court blessing, the UK's live facial recognition expansion still hinges on force-by-force policy, not law.

A Metropolitan Police live facial recognition van with multiple cameras mounted on its roof, deployed on a public street in south London. Image: www.independent.co.uk
In this article
  1. Whose body, whose business
  2. What to watch for

Between October 2025 and March 2026, the Metropolitan Police ran a live facial recognition pilot in the south London borough of Croydon. Cameras mounted on vans and fixed to street furniture scanned the faces of every person who walked through their field of view, extracting biometric templates and comparing them in real time against a watchlist of wanted individuals. By the time the pilot concluded, officers had made 173 arrests and recorded a 10.5 percent reduction in overall crime across the trial area, according to Met Police figures reported by the BBC on 13 May. Among those apprehended was a woman who had evaded capture for more than two decades.

The Croydon numbers arrived in the same news cycle as two other data points that, taken together, sketch the contours of a policy vacuum. On 21 April, Reuters reported that the High Court in London had dismissed a legal challenge brought against the Met by youth worker Shaun Thompson and Big Brother Watch director Silkie Carlo. The claimants had argued that the force's use of live facial recognition (LFR) violated privacy rights and equality law. The court disagreed. One day later, the government welcomed the ruling and signalled that facial recognition would be rolled out across the country. The technology is spreading before the law has caught up with it.

The data flow is not especially complicated, and that is part of what makes it hard to regulate. A camera captures video of a public space. Software extracts faces from the frame and converts each one into a biometric template, a mathematical representation of the distances between facial features. That template is compared against a database of images drawn from police custody records and wanted-person lists. If the system registers a match above a configured confidence threshold, an alert is sent to an officer on the ground, who decides whether to intervene. If there is no match, the template is deleted, according to the Met's stated policy. The watchlist is the variable: who is on it, how it is compiled, and on what legal basis. The Guardian's explainer, published on 3 May, notes that the technology has been deployed in London since 2020 and has drawn persistent concerns over data privacy and racial bias.
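The pipeline described above can be sketched in a few dozen lines. This is a toy illustration of the match-or-delete logic, not the Met's actual system: the similarity function, the 0.64 threshold, and the watchlist entries are all invented for the example, and real deployments use learned face embeddings rather than simple feature distances.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Template:
    """A biometric template: a fixed-length vector derived from facial geometry."""
    vector: tuple

def similarity(a: Template, b: Template) -> float:
    """Toy similarity score in [0, 1]; real systems compare learned embeddings."""
    diffs = [abs(x - y) for x, y in zip(a.vector, b.vector)]
    return 1.0 - min(1.0, sum(diffs) / len(diffs))

def process_face(face: Template, watchlist: dict, threshold: float = 0.64):
    """Compare one extracted template against the watchlist.

    Returns (person_id, score) for the best match at or above the threshold,
    in which case an alert goes to an officer who decides whether to intervene.
    Returns None otherwise, and the template is discarded, per the Met's
    stated no-match policy.
    """
    best_id, best_score = None, 0.0
    for person_id, enrolled in watchlist.items():
        score = similarity(face, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id, best_score  # alert raised for a human decision
    return None                     # no match: template deleted

# Illustrative run: two enrolled templates, one passer-by.
watchlist = {
    "warrant-0042": Template((0.10, 0.52, 0.33)),
    "missing-0007": Template((0.90, 0.11, 0.75)),
}
passerby = Template((0.12, 0.50, 0.30))
print(process_face(passerby, watchlist))  # matches "warrant-0042"
```

Note that everything hinges on the two inputs the sketch takes for granted: the contents of `watchlist` and the value of `threshold`. Neither is fixed by statute, which is precisely the regulatory gap the rest of this article describes.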

The Croydon pilot was not the Met's first use of LFR, but it was its most sustained. Cameras were deployed in shopping areas, outside transport hubs, and along high-footfall streets. The watchlist included individuals wanted on warrant, those suspected of serious violent offences, and missing persons considered vulnerable. LBC reported on 12 May that a separate pilot of fixed LFR cameras, as opposed to van-mounted units, had led to the arrest of a woman wanted for 20 years in connection with an assault. "Incredible," a Met spokesperson told the broadcaster, describing the technology's capacity to identify individuals who had long since disappeared from traditional policing radars.

That word, "incredible," captures the rhetorical frame that has come to dominate official discourse around LFR. Every deployment is narrated through arrest totals and crime reduction percentages. The Met's 173 arrests in Croydon are now being cited by forces across England as proof of concept. Cambridgeshire Police announced on 13 May that it would deploy LFR for the first time in Peterborough city centre, according to AOL's syndication of a BBC report, with cameras targeting individuals wanted by the courts and those deemed to pose the greatest risk to public safety. Chief Inspector John Massey told the Peterborough Telegraph he was "confident" the technology would bring more criminals to justice. The roll-out logic is linear: if it worked in Croydon, it will work in Peterborough.

The problem is that "worked" is doing a lot of unexamined work. Arrest numbers measure the volume of matches, not the quality of those matches. A match is not a conviction. A deployment that scans 100,000 faces in an afternoon and produces three matches, two of which turn out to be false positives, is operating at a level of imprecision that would be unacceptable in any other forensic context. And yet no UK police force routinely publishes its false-positive rate, its match confidence thresholds, or the demographic breakdown of who is being flagged. The public is asked to trust the machine, but the machine's audit trail is sealed.

Essex Police became, briefly, the exception that proved the rule. In March 2026, the force suspended its use of LFR, Computer Weekly reported, after identifying potential accuracy and bias risks in its system, which was supplied by an Israeli vendor. The force had found that the cameras identified more Black people than other ethnic groups relative to the population passing through the camera's field of view. The suspension was notable precisely because it was so unusual: a police force acknowledging that an algorithmic system might be producing racially skewed outputs and acting on that acknowledgment before being compelled to do so by a court or a regulator.

The Essex episode exposes a structural asymmetry in how LFR is governed in the United Kingdom. There is no primary legislation that sets out when, where, and under what conditions police forces may deploy real-time biometric surveillance. Instead, the legal basis rests on a combination of common law policing powers, the Data Protection Act 2018, the Surveillance Camera Code of Practice, and force-by-force policy documents that are not subject to parliamentary scrutiny. The High Court ruling in April affirmed that this patchwork was lawful, but lawfulness is not the same as adequacy. The Biometrics and Surveillance Camera Commissioner, Professor Fraser Sampson, has repeatedly warned that the oversight framework is not fit for purpose. In May, The Guardian reported that Sampson and other watchdogs had concluded that face-scanning technology was "not as effective as claimed" and that new laws were urgently needed.

Scotland may yet take a different path. Biometric Update reported on 7 May that the Scottish privacy commissioner continues to express "consternation" over the potential use of live facial recognition by Police Scotland. The Scottish government has been considering primary legislation that would specifically govern the technology, which would make Scotland the first nation within the United Kingdom to regulate it in statute. Writing in The Scotsman, Martyn McLaughlin welcomed that prospect but cautioned that legal safeguards would be vital. The devolution of policing and justice to Holyrood creates an unusual constitutional experiment: two jurisdictions on the same island taking fundamentally different approaches to the same surveillance technology.

Whose body, whose business

What makes live facial recognition distinct from retrospective facial recognition, the kind used to identify suspects after a crime has occurred, is that it operates on entire populations, not on individuals already connected to an investigation. Every person who walks past an LFR camera is subjected to a biometric search. The question of consent is rendered almost meaningless: there is no practical way to avoid the camera, no mechanism to opt out, and in most deployments no meaningful signage that would allow a person to make an informed choice about whether to enter the surveillance zone. The Met's policy states that officers are trained to engage with members of the public who ask questions, but engagement is not consent, and a conversation after scanning has already occurred is not prior authorisation.

The vendors of this technology operate in a market that is expanding faster than the regulatory architecture can keep pace with. The Met has used systems from multiple suppliers over the years, including NEC's NeoFace platform. Essex Police's suspended deployment came from an Israeli firm. Cambridgeshire's procurement details have not been made public. The vendor landscape matters because different algorithms have different error profiles across demographic groups, and without transparency about which algorithm is in use, where, and with what parameters, there is no way for an independent researcher to audit the system. The National Physical Laboratory has conducted some testing, but the results are not published in a form that allows for replication or public scrutiny.

Academic researchers have been trying to fill the audit gap. The Ada Lovelace Institute published a detailed examination of the Met's LFR deployments in 2023, finding that the legal basis was contestable and the human rights implications under Article 8 of the European Convention on Human Rights were significant. The Institute's researchers noted that the Met had declined to publish key performance metrics, including the ratio of correct to incorrect matches. Three years later, that data is still not publicly available. The Croydon pilot's 173 arrests and 10.5 percent crime reduction have been announced without the denominator data that would allow an independent assessment of the system's precision and its demographic impacts.

The racial bias concern is not hypothetical. Studies of facial recognition algorithms have consistently found higher false-positive rates for Black and Asian faces, particularly for women. The National Institute of Standards and Technology (NIST) in the United States has documented these disparities across dozens of commercial algorithms. In the UK, the Equality and Human Rights Commission has warned that LFR deployments risk violating the Public Sector Equality Duty, which requires public bodies to consider how their policies affect people with protected characteristics. The High Court ruling in April acknowledged these concerns but found that the Met's Equality Impact Assessment was sufficient. Civil liberties groups disagree. Silkie Carlo of Big Brother Watch called the ruling "a dark day for democratic freedoms" after the judgment was handed down, Courthouse News Service reported.
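The audit that no UK force publishes would not be complicated to produce. A minimal sketch, using invented numbers purely to show the shape of the calculation, is a per-group false-positive table of the kind NIST compiles for commercial algorithms:

```python
# Hypothetical audit table: scans and false alerts broken down by demographic
# group. All numbers are invented for illustration; no UK police force
# currently publishes this breakdown.
audit = {
    "group_a": {"scans": 60_000, "false_alerts": 2},
    "group_b": {"scans": 30_000, "false_alerts": 4},
}

# False alerts per face scanned, for each group.
rates = {g: d["false_alerts"] / d["scans"] for g, d in audit.items()}

# How many times likelier a member of group B is to be wrongly flagged.
disparity = rates["group_b"] / rates["group_a"]

for group, rate in rates.items():
    print(f"{group}: {rate * 100_000:.1f} false alerts per 100k scans")
print(f"disparity ratio: {disparity:.1f}x")
```

Two counts per group is all it takes; the disparity ratio is exactly what the Public Sector Equality Duty asks public bodies to consider, and exactly what the sealed audit trail keeps out of view.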

What to watch for

The trajectory is clear, and it is accelerating. The Met's success in defending its LFR programme at the High Court has emboldened forces that were previously cautious. Cambridgeshire's Peterborough deployment, announced within weeks of the Croydon pilot results, is a template for what will follow: a county force with no prior LFR experience adopting the technology on the strength of another force's arrest figures. The Home Office has indicated support for a national rollout, and the College of Policing is developing guidance, but guidance is not legislation, and guidance can be updated without parliamentary debate.

Scotland is the jurisdiction to watch. If the Scottish Parliament passes primary legislation, it will create a natural experiment: two policing systems operating under different legal standards on the same island. If it does not, live facial recognition will continue to spread under the same thin legal patchwork that governs it in England and Wales, a patchwork that the High Court has now stamped as lawful but that every independent watchdog has described as inadequate. The Croydon numbers are arresting, in every sense, but they are not the whole story. The whole story is this: a surveillance technology that scans the faces of everyone in a public space is being deployed across Britain without a single piece of primary legislation to govern it, and the 173 arrests are being used to ensure nobody asks why.

For readers who want to verify what their own police force is doing, every force in England and Wales is required to publish a Surveillance Camera Code of Practice compliance statement. These are available through the Surveillance Camera Commissioner's website. The Met publishes its LFR deployment data, including locations and watchlist criteria, on its public-facing transparency pages, though key performance metrics are not included. Big Brother Watch maintains a tracker of UK police forces using or considering LFR. The Scottish Biometrics Commissioner's office publishes updates on the legislative process in Holyrood. The questions to ask are these: who is on the watchlist, what is the algorithm's false-positive rate by ethnicity and gender, and what independent oversight exists before the camera is switched on. In most of the United Kingdom, in May 2026, the honest answer to that last question is: very little.
