Facial Recognition Error Sees Woman Wrongly Accused of Theft (bbc.com)
- Reference: 0178057087
- News link: https://slashdot.org/story/25/06/15/1817236/facial-recognition-error-sees-woman-wrongly-accused-of-theft
- Source link: https://www.bbc.com/news/articles/cdr510p7kymo
"We acknowledge and understand how distressing this experience must have been," an anonymous Facewatch spokesperson tells the BBC, adding that the store using their technology "has since undertaken additional staff training."
A woman was accused by a store manager of stealing about £10 (about $13) worth of items ("Everyone was looking at me"). And then it happened again at another store when she was shopping with her 81-year-old mother on June 4th:
> "As soon as I stepped my foot over the threshold of the door, they were radioing each other and they all surrounded me and were like 'you need to leave the store'," she said. "My heart sunk and I was anxious and bothered for my mum as well because she was stressed...."
>
> It was only after repeated emails to both Facewatch and Home Bargains that she eventually found there had been an allegation of theft of about £10 worth of toilet rolls on 8 May. Her picture had somehow been circulated to local stores alerting them that they should not allow her entry. Ms. Horan said she checked her bank account to confirm she had indeed paid for the items before Facewatch eventually responded to say a review of the incident showed she had not stolen anything. "Because I was persistent I finally got somewhere but it wasn't easy, it was really stressful," she said. "My anxiety was really bad — it really played with my mind, questioning what I've done for days. I felt anxious and sick. My stomach was turning for a week."
>
> In one email from Facewatch seen by the BBC, the firm told Ms Horan it "relies on information submitted by stores" and the Home Bargains branches involved had since been "suspended from using the Facewatch system". Madeleine Stone, senior advocacy officer at the civil liberties campaign group Big Brother Watch, said they had been contacted by more than 35 people who have complained of being wrongly placed on facial recognition watchlists.
>
> "They're being wrongly flagged as criminals," Ms Stone said.
"They've given no due process, kicked out of stores," adds the senior advocacy officer. "This is having a really serious impact." The group is now calling for the technology to be banned. "Historically in Britain, we have a history that you are innocent until proven guilty, but when an algorithm, a camera and a facial recognition system gets involved, you are guilty."
> The Department for Science, Innovation and Technology said: "While commercial facial recognition technology is legal in the UK, its use must comply with strict data protection laws. Organisations must process biometric data fairly, lawfully and transparently, ensuring usage is necessary and proportionate.
>
> "No one should find themselves in this situation."
Thanks to [2]alanw (Slashdot reader #1,822) for sharing the article.
[1] https://www.bbc.com/news/articles/cdr510p7kymo
[2] https://www.slashdot.org/~alanw
Finger of blame pointing in the wrong direction? (Score:4, Informative)
TFS doesn't quite add up to what the headline implies. AFAICT, the actual sequence of events is that Ms. Horan bought and paid for some toilet rolls on May 8th, after which *human error* at the store resulted in her being added to the Facewatch programme. Because she had been added to the Facewatch DB, the programme then did exactly as it was supposed to and flagged her entry into the stores on May 24th & June 4th, prompting the store staff to react pretty much as you'd expect under the circumstances and ask her to leave, albeit perhaps without sufficient discretion.
There's really only one screw-up here, and that was by the staff at the May 8th store who added her to the Facewatch DB; everyone and everything else seems to have done as they/it should have done under the circumstances. Still, on the "lessons learnt" front, users of systems like this *really* need to allow for the possibility of human error in the submission, or a mistaken ID by the system (not that this seems to have happened here), when challenging someone like this, and have a clear-cut audit trail and process of appeal. If Home Bargains had been able to say, right off the bat, that it was down to a presumed theft of toilet rolls on May 8th and undertake an on-the-spot review on May 24th, this could easily have been avoided.
Re: (Score:3)
In other words, the problem isn't that the facial recognition system didn't work; it's that it did, with zero errors.
Re: (Score:3)
Exactly. The headline is wrong. It says "Facial recognition error" and there was no facial recognition error. It correctly identified her. She was wrongly added to the database.
Re: (Score:2)
So an error?
Re: (Score:2)
An apparently human error, not a facial recognition one, unless you would call any other case where a chain sends out a picture of an innocent person and says "do not admit this person" a "facial recognition error".
The criticism is over the qualifier for the word "error", not the label as an error.
Re: (Score:2)
Sure, in the lessons-learnt department that would all be great if it took place, but these are all private enterprises, so what's gonna make them do it? Maybe demands from their customers, but I have to imagine the selling point of these companies to their customers is not having to do all that human work, so there are some perverse incentives at play that lead to this happening.
I mean, let's be real, it wasn't simple oversight that precisely none of those things seem to be in the system now. This probably should b
Re: (Score:2)
> AFAICT, the actual sequence of events is that Ms. Horan bought and paid for some toilet rolls on May 8th, after which *human error* at the store resulted in her being added to the Facewatch programme.
Actually, the wording of that Facewatch email is weaselly enough - it implies this was the case, but does not actually state it.
Is all Facewatch doing just sharing info entered by the retailers?
Re: (Score:2)
The system doesn't detect whether you're a shoplifter, so she, or someone erroneously matched as her, must have been marked by a human.
I agree it's not clear which of those it is, and the distinction is very important.
> Is all Facewatch doing just sharing info entered by the retailers?
Yes. With some auto-detection of that person. Basically a really advanced wanted poster.
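The "really advanced wanted poster" description can be sketched in a few lines. This is a hypothetical illustration only: the class names, fields, threshold, and the unverified-submission step are my assumptions about how such a system plausibly works, not Facewatch's actual implementation.

```python
from dataclasses import dataclass, field
from math import sqrt


@dataclass
class WatchlistEntry:
    # Hypothetical record a retailer might submit. Note there is no
    # verification step: the retailer's word is taken at face value,
    # which is exactly the failure mode discussed in this thread.
    subject_id: str
    reason: str
    submitted_by: str
    embedding: list[float]  # face embedding produced by some model


@dataclass
class Watchlist:
    entries: list[WatchlistEntry] = field(default_factory=list)

    def submit(self, entry: WatchlistEntry) -> None:
        self.entries.append(entry)

    def match(self, probe: list[float], threshold: float = 0.9):
        # Return the best entry whose cosine similarity clears the
        # threshold, else None. The threshold value is illustrative.
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

        best = max(self.entries, key=lambda e: cos(probe, e.embedding), default=None)
        if best is not None and cos(probe, best.embedding) >= threshold:
            return best
        return None


wl = Watchlist()
wl.submit(WatchlistEntry("horan", "alleged theft, unverified", "store_0508",
                         [0.9, 0.1, 0.3]))
# The same person re-entering any participating store triggers a hit:
hit = wl.match([0.88, 0.12, 0.29])
print(hit.reason if hit else "no match")  # → alleged theft, unverified
```

The point of the sketch is that the recognition step can be flawless while the outcome is still wrong: garbage in (an unverified allegation) reliably produces garbage out at every store sharing the list.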
Re: (Score:2)
The issue is that they have a very powerful facial recognition system deployed at multiple stores - not just BM, other shops as well, so potentially huge consequences if they make a mistake. And apparently all it takes is one unverified report to get you on their list, at which point you can expect to be accosted at any number of venues, with no idea why.
It's lucky she paid with a card, or she might not have been able to prove she didn't steal anything, and wouldn't have been able to get the situation sorted out.
The problem is the human reaction not the tech (Score:2)
The problem isn't the facial recognition system itself; it's the excessive response of immediately barring her from the shop. They should have let her in normally and just paid extra attention through the cameras.
If the system displays "100% MATCH" in blinking red letters like we'd see in a movie, then that's the problem. The system should say it's merely a possible match and that the matter must be treated with caution. The picture should be made available on paper or a tablet if anyone is sent to talk to the person.
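A minimal sketch of the alternative this comment describes: surface a confidence band plus a required action, never a bare match flag. The function name, thresholds, and wording here are entirely made up for illustration, not taken from any real deployment.

```python
# Hypothetical UI logic: map a raw similarity score to a confidence band
# and an instruction for staff, forcing a human-review step before anyone
# is confronted. Thresholds are illustrative assumptions.
def describe_match(similarity: float) -> tuple[str, str]:
    """Return (confidence band, required staff action) for a score in [0, 1]."""
    if similarity >= 0.95:
        return ("strong possible match",
                "show stored photo to staff; verify identity before acting")
    if similarity >= 0.85:
        return ("weak possible match",
                "observe only; do not confront")
    return ("no actionable match", "no action")


band, action = describe_match(0.97)
print(f"{band}: {action}")
```

Even the "strong" band deliberately stops short of authorising a confrontation; the design choice is that the system advises and a human decides.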
Re: (Score:2)
Exactly. Even if you did steal something once, this is scary. You are essentially barred from buying food, ever.
Re: (Score:2)
What stores would typically do is bar you for a few years, but not for life, at least for petty theft. Violent behaviour or the like would probably bring a lifetime ban. And no, stores don't have to serve you if you don't play by the rules.
Either way, what seems to have happened here is a complete breakdown of process. A woman was wrongly accused of theft by incompetent staff and publicly embarrassed through no fault of her own.
Meh (Score:2)
> Historically in Britain, we have a history that you are innocent until proven guilty but when an algorithm, a camera and a facial recognition system gets involved, you are guilty.
Brits really love their surveillance cameras everywhere. Not sure what they expected to happen.
The tech is needed to prevent extremes (Score:2)
Just ban it from being used for petty stuff, under penalty of law. Anyone who uses the tech for catching petty thieves or something abusive like fucking with an ex should be charged with stalking or something like that. We do need facial and activity recognition to screen for actual stalkers, violent people, kidnappers, and psychos. Of course due process needs to be enforced in every criminal case, and during the case the tech must be open to scrutiny. The standard must be that a number of humans review the
Sounds like a good lawsuit (Score:5, Interesting)
They have no reason to keep her from their stores, and it is specifically tied to her.
I wonder if a defamation suit would be in order?
You're right - there's no "America" anymore (Score:2)
>> outside of America.
> Oh c'mon! Everybody knows there's no such thing!
You're right - it's now known as the "Shit Hole of America", because the rest of the world can rename things too!
Re:Sounds like a good lawsuit (Score:4, Interesting)
This seems like exactly the kind of situation lawsuits are intended for. A google search of "UK false accusation law" turns up multiple ambulance chasing lawyer web pages that inform me that false accusations, as well as milder defamation, can be prosecuted both criminally and civilly and I should definitely contact them as soon as possible to make sure I know my rights.
Re: (Score:2)
What we read in TFS makes me think she is preparing for that civil lawsuit: "Everyone was looking at me", "My heart sunk and I was anxious", "my mum as well was stressed". She's stating, to whoever will listen, that she and her mum suffered psychological damage, and she can later claim that in court.
Re: (Score:2)
On the other hand, at the very least, Facewatch actually followed up and looked into her claims of innocence and admitted their error -- though they could do better than an "anonymous spokesperson." Some (many? most?) companies might have simply blown her off or perpetually routed her through automated systems (like I've heard Google, Facebook, etc... do) until she gave up and perhaps actually sued. Not defending their tech or the store's proactive use of it, but their follow-up could have been worse.
Re: (Score:2)
"It was only after repeated emails to both Facewatch and Home Bargains that she eventually found there had been an allegation of theft of about £10 worth of toilet rolls on 8 May." The article doesn't say how many emails specifically, but clearly more than one. And I would also guess no phone number was available either. Probably not much different from Google. Society has really let this idea of large orgs being unreachable become way too common. I'd hope for a big slapdown. A slapdown so large Facewatch is no
Re: (Score:2)
She has a good case under GDPR rules. They processed her personal data (her biometrics), relying on legitimate interest to avoid having to get permission. But that brings a lot of responsibility too, and clearly they have failed here.
The difficulty will be that Facewatch blames Home Bargains, and Home Bargains blames Facewatch. I'd say the liability is mostly with Facewatch, since they flagged her as a thief, and they clearly didn't vet the information they were given, or provide proper training to Home Bargains.
Re: (Score:2)
There is obviously a personal data angle here. There might also be a defamation angle if the system works as implied by TFS, since it appears that someone's reputation has been affected because someone else lied about them and this has demonstrably caused harm? If there was more than one relevant incident then there might also be a harassment angle.
Please be careful with that advice about requesting compensation in a Letter Before Action, though. There are fairly specific rules for what you can and can't cl
Re: (Score:2)
The ICO used to have examples of compensation awards for various types of DPA issues, but I can't find it now. My numbers were based on that, but IANAL and this is not legal advice. You are right, get legal advice, the cost can be passed on to them anyway.
Re: (Score:2)
> You are right, get legal advice, the cost can be passed on to them anyway.
AIUI, your costs can't (or couldn't) generally be passed on when using the small claims system. Has that changed? It's been a while since I went through the process, so it's possible that my information here is out of date.
Re: (Score:2)
Exactly! The court slapping a company with a large judgement is what gets things fixed.