AI Company Eightfold Sued For Helping Companies Secretly Score Job Seekers (reuters.com)
- News link: https://yro.slashdot.org/story/26/01/21/1841214/ai-company-eightfold-sued-for-helping-companies-secretly-score-job-seekers
- Source link: https://www.reuters.com/sustainability/boards-policy-regulation/ai-company-eightfold-sued-helping-companies-secretly-score-job-seekers-2026-01-21/
> The lawsuit, filed on Tuesday and accusing Eightfold of violating the Fair Credit Reporting Act, shows how consumer advocates are seeking to apply existing law to AI systems capable of drawing inferences about individuals based on vast amounts of data.
>
> Santa Clara, California-based Eightfold provides tools that promise to speed up the hiring process by assessing job applicants and predicting whether they would be a good fit for a job using massive amounts of data from online resumes and job listings. But candidates who apply for jobs at companies that use those tools are not given notice and a chance to dispute errors, job applicants Erin Kistler and Sruti Bhaumik allege in their proposed class action. Because of that, they claim Eightfold violated the FCRA and a California law that gives consumers the right to view and challenge credit reports used in lending and hiring.
Re: (Score:2)
It seems like a hard case to make for the plaintiffs. If I understand California law correctly, an employer can look up your social media history and use it to help decide whether to hire you. They can't ask for your usernames or passwords (obviously), but if they can figure out which social media accounts are yours, they can deny you employment without ever telling you that your social media informed the decision.
So how is it different if you have an algorithm do that work for you? If this
Re: (Score:2)
> So how is it different if you have an algorithm do that work for you?
It's not about using an algorithm; it's about using a third party whose involvement was not disclosed, and about disclosing applicants' information to that third party. This information is in the summary.
Re: (Score:2)
> It's not about using an algorithm; it's about using a third party whose involvement was not disclosed, and about disclosing applicants' information to that third party. This information is in the summary.
Again, either the article (the actual article, not the summary) is leaving out important information, or this is going to be a hard case for the plaintiffs to win.
> The lawsuit, filed on Tuesday and accusing Eightfold of violating the Fair Credit Reporting Act, shows how consumer advocates are seeking to apply existing law to AI systems capable of drawing inferences about individuals based on vast amounts of data.
So that turns on the definition of the FCRA:
> The Fair Credit Reporting Act (FCRA) is a U.S. federal law ensuring the accuracy, fairness, and privacy of personal information in consumer credit reports, regulating how Credit Reporting Agencies (CRAs) collect, use, and share data. It empowers consumers with rights to access their credit reports, dispute errors, get free annual reports, and place fraud alerts, while also placing obligations on businesses that furnish information (like lenders) and those that use credit reports (like employers or insurers).
My point was that the article mentions nothing about running a credit report. Either the article left that part out for some weird reason, or these two plaintiffs are trying to say that any third-party report gathering information without the applicant's consent qualifies as a credit report. I'm not
Re: (Score:2)
> All these reports are simply opinions. Having an opinion, even for sale, is not an actionable item.
You could say that about credit reports too. They're just someone's opinion of how creditworthy you are.
Practically speaking, that ship has sailed. We've already collectively decided that if someone is collecting data and making decisions that can have profound effects on a person, that person has a legal right to see what data went into the decision. However, the law as written may or may not apply to AI, because AI wasn't a thing at the time. If it doesn't, IMHO, that's a job for Congress or the California
Disclosure should be required. (Score:1)
Given how often AI gets its wires crossed with data, this should be mandatory.
This sort of thing happens anyway, on a regular basis. I don't know if it's AI-driven or not, but my father-in-law has a common name and has gotten email about things he has no connection to. My wife also has a semi-common name and, for a while, was getting emails and calls regarding non-existent employment on her LinkedIn.
So I would expect errors to be worse with AI, and it needs to be held accountable, seeing as people's
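To make the failure mode concrete, here is a small sketch of how matching records on name alone attributes someone else's history to the wrong person. All names and records below are invented for illustration; real entity resolution is far more involved, but the collision works the same way.

    # Two unrelated people who share a common name.
    records = [
        {"name": "John Smith", "state": "OH", "note": "collections account"},
        {"name": "John Smith", "state": "CA", "note": "clean history"},
    ]

    def naive_profile(name):
        # Matching on name alone merges both people into one "profile".
        return [r for r in records if r["name"] == name]

    def stricter_profile(name, state):
        # One extra field avoids this particular collision, though real
        # systems need much stronger identifiers than name plus state.
        return [r for r in records if r["name"] == name and r["state"] == state]

    print(naive_profile("John Smith"))           # both records, wrongly merged
    print(stricter_profile("John Smith", "CA"))  # only the California record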
Fuck Disclosure (Score:1)
Companies don't own this data. It rightfully belongs to the individual job seekers looking for work. Companies need to learn to stop using data they don't rightfully own, with or without disclosure.
If we OK this practice with disclosure, companies will simply require job candidates to give up all rights to their own personal data. Job seekers will do it in an attempt to find work. That is absolute bullshit. My data is mine.
Hell, we already see companies posting fake jobs in order to collect resumes w
Use the DROP act? (Score:2)
Since these individuals live in California, can they not request Eightfold remove all data about them under the recently enacted DROP law? They might still not be interviewed or hired, of course.
Re: (Score:2)
I would be all for this, but the sad reality is that it amounts to playing whack-a-mole. Having to opt out of lists just guarantees that another list pops up. It's infuriating and exhausting.
Forcing companies to only use data where the target has opted in would barely help, too, because the opt-in would be buried deep in a TOS, and it would include allowing them to sell it, essentially opting that person in to third parties... and then the whack-a-mole continues.
Contracts cannot be used to break laws. Agr
So many known issues (Score:2)
I have heard so many horror stories from job candidates. Often with no chance of ever finding out why, people can lose jobs because:
1) their name is similar to that of a pedophile in another state.
2) they have debt from identity theft - they did not know it was happening and were NOT told by the employer.
3) their index finger was too long.
4) they are the wrong age/race/gender/religion - yes, this is illegal in the US, but it still happens.
Companies get inundated with way too many applications, which leads them to be arbitrary. They treat job
It's Pauline's Evil AI, Mickey Luv (Score:2)
Attention job seekers. How do you know if you are right for the job--unless you know what your options are?
Many companies collect data and train AI with it (Score:5, Interesting)
I'm not saying that the AI company shouldn't readily seek permission to harvest and use personal data, but it's far from a special case. Take Palantir, for instance. Are they not doing the same thing?
It seems odd that the onus is on the third-party platform to be transparent with candidates about data collection and AI use, and not on the companies using the tool.
I am all for this lawsuit being successful if it sets a precedent for all AI companies using our data. But if the scope is specifically around the hiring process, I think the individual companies should be held accountable for using the tool without candidates' consent or knowledge.
Re: (Score:2)
I'm inclined to agree. While setting some jurisprudence around AI company data collection is good, the real problem I see here is the employers' use of this tool... they seem like the ones who should be litigated against in this instance.
After all, if they're not hit for doing this, they'll just find another way to do it if this one goes away.
Re: (Score:2)
The reason this is tied only to AI in hiring, instead of all AI and our personal data, is that the law specifies lending or hiring purposes.
We would need strong personal privacy laws, and that is never going to happen in the USA.