UK police force presses pause on live facial recognition after study finds racial bias
- Reference: 1774013725
- News link: https://www.theregister.co.uk/2026/03/20/uk_police_force_suspend_live_faical_recog_racial_bias/
- Source link:
Essex Police said it had paused use of the technology to update the system with the help of the algorithm software provider. A second, similar study identified no bias, it said.
The report from Cambridge University researchers found the Essex police system was more likely to correctly identify men than women and was statistically significantly more likely to correctly identify Black participants than participants from other ethnic groups.
Microsoft doesn't want cops using Azure AI for facial recognition [1]
Police forces can use live facial recognition (LFR) to identify people on a pre-configured watchlist, usually made up of criminals, people of interest, or missing vulnerable individuals.
[2]The study [PDF] used 188 volunteers to act as members of the public in a controlled field experiment during a real police deployment. Because the researchers knew exactly who was present, it was possible to measure both correct and missed identifications.
It found that at the "current operational setting" used by Essex Police, the system correctly identified around half of the people on the watchlist who passed the cameras and that incorrect identifications were "extremely rare."
"Of the six false positive identifications observed in this test, four involved Black individuals. Given that observations of Black subjects constituted 536/2,251 (23.8 per cent) of the sample, the observed imbalance is unlikely to be due to chance alone but this could reflect the limited number of false positive events rather than a true systematic effect," it said.
The finding should be treated as suggestive rather than conclusive, it added.
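That hedging is easy to sanity-check with a rough back-of-envelope test (the report may well have used a different method): treat each false positive as an independent draw and ask how often four or more of six would involve Black subjects if false positives simply tracked their 23.8 per cent share of observations.

from math import comb

# P(X >= 4) for a binomial with n = 6 false positives and
# p = 536/2251, the Black share of observations in the trial.
n, k = 6, 4
p = 536 / 2251
tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
print(f"P(>= {k} of {n}) = {tail:.3f}")  # ~0.032: unlikely by chance alone, but six events is a thin sample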
A spokesperson for Essex Police said that as part of a commitment to its Public Sector Equality Duty, it had commissioned two independent academic studies. "The first of these indicated there was a potential bias in the positive identification rate, while the second suggested there was no statistically relevant bias in the results.
"Based on the fact there was potential bias, the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software."
[7]Smile! Uncle Sam wants to scan your face on the way in – and out
[8]UK's first permanent facial recognition cameras installed in South London
[9]Cops love facial recognition, and withholding info on its use from the courts
[10]Meta algorithms push Black people more toward expensive universities, study finds
The force added: "We then sought further academic assessment. As a result of this work, we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community."
Earlier this year, the British government decided that police in England and Wales should [11]increase their use of LFR and artificial intelligence (AI) under wide-ranging plans to reform law enforcement.
In a white paper [PDF], the Home Office launched plans to fund 40 more LFR-equipped vans in addition to ten already in use. It said they would be used in "town centres and high-crime hotspots", with the government planning to spend more than £26 million on a national facial recognition system and £11.6 million on LFR capabilities. ®
[1] https://www.theregister.com/2024/05/04/microsoft_cloud_facial_recog/
[2] https://www.essex.police.uk/SysSiteAssets/media/downloads/essex/about-us/live-facial-recognition/2026-03-12-lfr-accuracy-watchlists-deterrence-cambs-uni.pdf
[7] https://www.theregister.com/2025/10/29/us_foreigner_facial_scans/
[8] https://www.theregister.com/2025/03/27/uk_facial_recognition/
[9] https://www.theregister.com/2024/10/07/cops_love_facial_recognition_and/
[10] https://www.theregister.com/2024/06/04/meta_ad_algorithm_discrimination/
[11] https://www.theregister.com/2026/01/28/tech_in_policing_white_paper/
Re: This isn't the real problem
That's another bias: why don't the police do stop and search in the City and similar locations? If you are white and suited up, you are invisible.
Re: This isn't the real problem
If you steal from 10 people, you're a criminal. If you steal from thousands, you're a business person.
Only the little people are criminals.
Re: This isn't the real problem
If you are white and suited up, you are invisible.
Good thinking! Let's have quotas for stop and search, that'll improve everything.
/s
Re: This isn't the real problem
Don't be daft. It's nothing to do with quotas, but rather with being afraid of upsetting someone powerful.
It's just much easier to stop and search some no-names from an impoverished community than someone wearing a watch costing more than your lifetime earnings, with suspicious white powder visible in their nostrils.
Re: This isn't the real problem
Even easier: don't arrest anyone, poor or rich, for white powder in their nostrils, and focus on the thieves instead -- regardless of whether they steal with a knife or a spreadsheet. Focus on the violent scumbags instead, whether they beat a stranger senseless on the street, or beat their wife senseless in their country estate. Focus on the sickos who hurt kids, whether they're some creepy drifter or the Andrew formerly known as Prince.
Re: This isn't the real problem
If the Met start stopping and searching in the City, the CoL police will want to have a word with them about jurisdiction!
You are right about being invisible though, over the years I have reported a physical assault and two bike thefts to the Met and they genuinely could not have done less to help. Zero interest, refusal to even look at CCTV footage, just handed out a crime reference for the insurance and a mumbled warning about having nice things in public.
Re: This isn't the real problem
My comment was about police farce in general.
Re: This isn't the real problem
Crime isn't uniformly distributed throughout society.
Poverty and discrimination (which causes poverty) prevent people from achieving their full potential. We shouldn't be surprised when crippled economic opportunities spread poverty and hopelessness, thus spreading crime. Wherever you find concentrated intergenerational poverty, you'll find a huge statistical excess of street crime. Most people will commit crime if desperate and/or destabilized enough. Conversely, most non-sociopaths won't commit crime once they reach a dignified level of prosperity and stability.
Not sure the average cop is much use doing stop and searches in the City. If he found cooked books, would he even know what he's looking at? Need better auditors for that.
May not be a racial bias at all
If you consider the brightness range of an image of, say, a Caucasian face under good lighting, you will get a significant spread of brightness levels (e.g. in the HSB colour model) over different parts of the face. If you do the same with, e.g., an African/Caribbean face there will also be a range, but both the low and high ends will obviously be at lower brightness, and the range may well be narrower too. E.g. if the darkest pixel is half the brightness of the brightest pixel in each case, the darker image spans a smaller absolute range, hence lower contrast. This gives the pattern-matching algorithm a lower signal strength and a worse signal-to-noise ratio in the latter case.
If a trial was run with the same group of purely 'white' people, once in good lighting and again in dim lighting with no exposure compensation, leading to a similar brightness drop, would you not expect the false positive rate to be higher in the second case?
It is no more than "Who is that?", "I'm not sure as I cannot see them clearly enough."
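The effect is easy to demonstrate in a few lines of Python (a sketch only; Pillow/NumPy and the placeholder file name are my choices, and Pillow calls the colour model HSV rather than HSB):

from PIL import Image
import numpy as np

# Value (brightness) channel of an HSV conversion, scaled to 0-1.
# "face.jpg" is a placeholder path.
v = np.asarray(Image.open("face.jpg").convert("HSV"))[..., 2] / 255.0
dim = v * 0.5  # crude stand-in for dim lighting with no exposure compensation

for name, ch in (("good light", v), ("dim light", dim)):
    print(f"{name}: min={ch.min():.2f} max={ch.max():.2f} range={ch.max() - ch.min():.2f}")
# Halving the brightness halves the absolute range: lower contrast, and a
# weaker signal for the pattern matcher, as described above.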
Re: May not be a racial bias at all
Unfortunately pointing out the technology's inherent racial bias isn't really proving 'may not be a racial bias at all'...
Re: May not be a racial bias at all
Unfortunately pointing out the technology's inherent racial bias isn't really proving 'may not be a racial bias at all'..
I don't think that's the issue, i.e. from the article:
...was statistically significantly more likely to correctly identify Black participants than participants from other ethnic groups.
If the system is correctly identifying wanted criminals, this is a good thing. The issue seems to be that it's less good at correctly identifying non-black people. Then there are the usual automation challenges of FMR (false match rate) and FRR (false rejection rate): not recognising wanted people, or incorrectly matching innocent ones.
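For reference, both rates are simple ratios over trial counts; a minimal sketch (the function names and illustrative numbers are mine, loosely echoing the Essex figures):

# FMR: share of non-watchlist faces wrongly flagged as a match.
# FRR: share of genuine watchlist members the system fails to flag.
def fmr(false_matches: int, impostor_attempts: int) -> float:
    return false_matches / impostor_attempts

def frr(missed: int, genuine_attempts: int) -> float:
    return missed / genuine_attempts

# Roughly the trial's headline shape: 6 false alerts across ~2,251
# observations, and about half of watchlist members spotted.
print(f"FMR ~= {fmr(6, 2251):.4f}")  # ~0.0027, "extremely rare"
print(f"FRR ~= {frr(5, 10):.2f}")    # 0.50, "around half"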
Re: May not be a racial bias at all
So, let the computer find the black criminals to arrest and send out all of the officers to search for the tan and white criminals.
Why stop using an application that works, even if it only works well on a subset of cases? If the situation were reversed, where it could ID whites at 100% and non-whites at 25%, do you think they'd stop using it?
Re: May not be a racial bias at all
That is not a racial bias; it is a lighting-level bias. Read the second half of my post.
Good practice
Rather than get too hung up on minutiae, perhaps it is worth taking a moment to congratulate the Police in this instance for actually conducting a trial, involving independent researchers, etc.? At a time when confidence in the police is very low, taking the requisite time to properly evaluate a technology that could well be hijacked as a pretext for unrest seems like a sensible thing to have done.
Re: Good practice
Ah, but they came to the wrong conclusion. They're throwing away a tool that is perfectly useful when operated within manageable constraints.
Absurd in the US
Here ICE is using facial recognition to identify people they're targeting for deportation. This package -- "Mobile Fortify" -- is similar to other packages available to state and local law enforcement and is able to draw on a wide range of image sources. It's also used to identify anyone who protests their actions. The application isn't that accurate, which would only explain ICE picking up the wrong people from time to time; but the agency also systematically ignores their documentation -- if the data says "Tuttle" then it doesn't matter that the person is actually named "Buttle"; he's captured for processing etc.**
Facial recognition has its uses. The problem is that our society's tendency towards an electronic Panopticon leads to an over-reliance on a "the system says yes" mindset. Instead of being masters of our toolsets, those toolsets become our masters.
(**Obscure "Brazil" reference.)
This isn't the real problem
The trial included a set of records that matched the profile of the volunteers.
The real problem is that the police train the system on mug shots of people arrested, proportionally far more black than the general public.
They then set it loose in public where 10% of passers-by are black rather than the 50% it was trained on. Even a perfectly "fair" system is going to misidentify black people at something like five times the rate.
It's like training a military system only on images of tanks and then having it look for tanks on the M25: it's going to flag a lot of bulldozers as tanks.
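A back-of-envelope version of that claim (the model and numbers are my own reading of the comment, not from the study): assume false matches happen almost entirely within the same ethnic group at a fixed per-pair rate, so a passer-by's false-alert exposure scales with how many watchlist entries share their ethnicity. A watchlist that is 50% black, against a public that is 10% black, then gives each black passer-by five times the exposure a population-mirroring watchlist would:

f = 1e-4  # assumed per-pair false-match rate (arbitrary; it cancels out)
N = 1000  # assumed watchlist size (also cancels out)
watchlist_black, public_black = 0.50, 0.10

actual = f * N * watchlist_black   # exposure per black passer-by, skewed list
mirrored = f * N * public_black    # exposure if the list mirrored the public
print(actual / mirrored)           # 5.0 -- the "5x" in the comment above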