

US Police Seldom Disclose Use of AI-Powered Facial Recognition, Investigation Finds (msn.com)

(Sunday October 06, 2024 @10:11PM (EditorDavid) from the I'll-be-seeing-you dept.)


An anonymous reader shared [1]this report from the Washington Post:

> Hundreds of Americans have been arrested after being connected to a crime by facial recognition software, a Washington Post investigation has found, but many never know it because police seldom disclose their use of the controversial technology...

>

> In fact, the records show that officers often obscured their reliance on the software in public-facing reports, saying that they identified suspects "through investigative means" or that a human source such as a witness or police officer made the initial identification... The Coral Springs Police Department in South Florida instructs officers not to reveal the use of facial recognition in written reports, according to operations deputy chief Ryan Gallagher. He said investigative techniques are exempt from Florida's public disclosure laws... The department would disclose the source of the investigative lead if it were asked in a criminal proceeding, Gallagher added....

>

> Prosecutors are required to inform defendants about any information that would help prove their innocence, reduce their sentence or hurt the credibility of a witness testifying against them. When prosecutors fail to disclose such information — known as a "Brady violation" after the 1963 Supreme Court ruling that mandates it — the court can declare a mistrial, overturn a conviction or even sanction the prosecutor. No federal laws regulate facial recognition and courts do not agree whether AI identifications are subject to Brady rules. Some states and cities have begun mandating greater transparency around the technology, but even in these locations, the technology is either not being used that often or it's not being disclosed, according to interviews and public records requests...

>

> Over the past four years, the Miami Police Department ran 2,500 facial recognition searches in investigations that led to at least 186 arrests and more than 50 convictions. Among the arrestees, just 1 in 16 were told about the technology's use — less than 7 percent — according to a review by The Post of public reports and interviews with some arrestees and their lawyers. The police department said that in some of those cases the technology was used for purposes other than identification, such as finding a suspect's social media feeds, but did not indicate in how many of the cases that happened. Carlos J. Martinez, the county's chief public defender, said he had no idea how many of his Miami clients were identified with facial recognition until The Post presented him with a list. "One of the basic tenets of our justice system is due process, is knowing what evidence there is against you and being able to challenge the evidence that's against you," Martinez said. "When that's kept from you, that is an all-powerful government that can trample all over us."

>

> After reviewing The Post's findings, Miami police and local prosecutors announced plans to revise their policies to require clearer disclosure in every case involving facial recognition.

The article points out that Miami's Assistant Police Chief actually told a congressional panel on law enforcement AI use that his department is "the first to be completely transparent about" the use of facial recognition. (When confronted with the Washington Post's findings, he "acknowledged that officers may not have always informed local prosecutors [and] said the department would give prosecutors all information on the use of facial recognition, in past and future cases.")

He told the Post that the department would "begin training officers to always disclose the use of facial recognition in incident reports." But he also said they would "leave it up to prosecutors to decide what to disclose to defendants."



[1] https://www.msn.com/en-us/news/crime/police-seldom-disclose-use-of-facial-recognition-despite-false-arrests/ar-AA1rM3Nc



Re: (Score:3)

by XXongo ( 3986865 )

> It's not exactly legal, you know.

Unfortunately, it's not exactly illegal, either.

Re: (Score:1)

by gosso920 ( 6330142 )

"When the President does it, that means it's not illegal."

Re: (Score:2)

by Retired Chemist ( 5039029 )

Apparently, only for him though.

Re: (Score:1)

by mi ( 197448 )

> Unfortunately, it's not exactly illegal, either.

Could you elaborate on the "unfortunate" part? Why are you lamenting law enforcement's use of facial recognition?

As well as DNA and fingerprints, while you're at it...

colour me thurprithte (Score:1)

by invisiblefireball ( 10371234 )

yes, the police are liars, the sort of people who do things they wouldn't want to be caught doing, with absolute seriousness and not even a glint of irony.

Re: (Score:1)

by invisiblefireball ( 10371234 )

they'll even stand there with their bare faces hanging out, raving about how it'll be the end of the world if you take this criminal's tool out of their hands so they cannot also commit crime with it, because "you can't keep up with the bad guys without using their weapons," ignoring all actual data on successful police policy.

Huh? (Score:5, Insightful)

by laughingskeptic ( 1004414 )

Prosecutors do not get to decide what to disclose... that is the whole point of Brady.

> leave it up to prosecutors to decide what to disclose to defendants.

This disingenuous weaseling seriously undermines Armando Aguilar's credibility. He seems to be saying, "It's not the police's fault if the prosecutor fails to disclose" ... however the prosecutor can't disclose what the police do not disclose. Armando Aguilar knows very well that prosecutors are not supposed "to decide what to disclose to defendants", but it is clear that the culture in law enforcement, in both police departments and prosecutors' offices, has been to violate Brady on a daily basis.

Re: (Score:2)

by timeOday ( 582209 )

How is it a violation if the facial recognition is not used as evidence for the prosecution, and is also not exculpatory?

This is the key: the facial recognition is not used as evidence. They cannot claim "you're him!" because an algorithm thinks you are. That's not what this is about.

Re: (Score:3)

by Retired Chemist ( 5039029 )

No, but they can make you a suspect. Once they have decided you are a suspect, especially if you are a minority, they will find or sometimes create the needed evidence. Also, it has been shown that these programs have a high error rate with people of color.

Re: Huh? (Score:1)

by kenh ( 9056 )

As a reminder, the prosecution relies on one of two things in a criminal case to get a conviction - a unanimous jury decision OR a confession of guilt.

Can the wrong person be convicted? Sure, it happens, but not because Facial Recognition picked a name out of a hat and decided that that is the person of color to investigate/prosecute.

Re: Huh? (Score:1)

by angryman77 ( 6900384 )

This is the part where it doesn't matter if you're guilty or not, odds are if you say anything at all to the cops, they're going to fuck you with it.

Short answer, never talk to the cops. They're not trying to straighten things out or understand your situation. They're looking to get a conviction, regardless of whether you're the one who committed the crime or whether there was an actual crime committed at all. Nothing you tell them will ever serve your best interests, it will only ever serve theirs.

Re: Huh? (Score:1)

by angryman77 ( 6900384 )

This is relevant re "confession of guilt." The first time you mix up details or say something inconsistent with something an eyewitness saw, they've caught you lying to them, and they know how to twist that so hours or days later you'll confess your guilt and throw yourself on the mercy of the court.

The court has no mercy.

Re: (Score:2)

by timeOday ( 582209 )

I wouldn't say it "makes you a suspect." Say the police have a fuzzy picture of a suspect and they open a tip line. 100 people call and say they saw somebody like that. At this point, each of the 100 leads is a longshot, but they follow up on them. Most of the people are soon cleared. A few cannot be ruled out so they gather more evidence. One of them is so suspicious they get a warrant and find the murder weapon and a DNA match and he confesses.

At that point, the fact that the initial tip by which the

System of Injustice (Score:2)

by RossCWilliams ( 5513152 )

That is complete fantasy. A more realistic scenario is that facial identification is used to screen for people with prior arrest or other police contact. Maybe there are three or four people who fit that description. They check each one's alibi and all but one has enough of one that will make "reasonable doubt" likely. They now focus their efforts on gathering evidence against the remaining suspect. The alternative is to start over from scratch if they can't make a case against that last one standing. So th

Re: (Score:2)

by Pinky's Brain ( 1158667 )

The Brady decision did not concern itself with unconstitutional investigation methods, but withholding evidence which could prove innocence. If the defendant bragged about the crime on social media and they found the bragging through AI, how does knowing that help the defendant prove his innocence?

This is just a different flavor of parallel construction and there should be rulings against that, but Brady isn't it.

On the basis of facial recognition alone? (Score:2)

by kenh ( 9056 )

Can someone be wrongly convicted of a crime they didn't commit based solely on facial recognition?

I don't think so.

I imagine facial recognition gives them a starting point to build a case, but having the computer identify, say, a rapist by comparing video with the state drivers license photo database? I'm ok with that.

Defense can challenge the idea, present exonerating facts, etc - this isn't a reverse "Johnnie Cochran" ("if the face is recognized, you must convict")

Re: (Score:2)

by RossCWilliams ( 5513152 )

> Can someone be wrongly convicted of a crime they didn't commit based solely on facial recognition?

Of course they can. We just executed somebody who was found guilty based on a jailmate's testimony about a jailhouse confession and the testimony of an ex-girlfriend, with no physical evidence to connect him to the crime. You can't confess to a jailmate until you are arrested. The jailmate got "consideration" for his testimony in sentencing for his own crime. There was even evidence that someone else committed the crime.

Once someone is identified as a possible suspect there is always a chance they will be convicted unles

This is kinda fucked up (Score:4, Insightful)

by jacks smirking reven ( 909048 )

Two lines in the summary really stood out.

> He said investigative techniques are exempt from Florida's public disclosure laws...

Like "is that true?" and seems like, [1]yes, it is. [flsenate.gov] I get that, you don't want to disclose all your methods with the criminals but it kinda clashes with the second one

> acknowledged that officers may not have always informed local prosecutors

That seems somewhat malicious in intent. Prosecutors should definitely not be in the dark about these things. What kind of department lets officers use these tools without recording that it happened, or leaves disclosure to their discretion? Or is this guy just passing the buck onto diffuse "officers"? Prosecutors should probably have access to records of these types of searches being performed.

[1] https://www.flsenate.gov/laws/statutes/2021/119.071

If it isn't used as evidence... (Score:2)

by ahoffer0 ( 1372847 )

does it have to be disclosed to the defendant?
