News: 0175819093

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Siri 'Unintentionally' Recorded Private Convos; Apple Agrees To Pay $95 Million (arstechnica.com)

(Thursday January 02, 2025 @05:20PM (BeauHD) from the always-listening dept.)


An anonymous reader quotes a report from Ars Technica:

> Apple has [1]agreed (PDF) to pay $95 million to settle a lawsuit alleging that its voice assistant Siri routinely [2]recorded private conversations that were then sold to third parties for targeted ads. In the proposed [3]class-action settlement (PDF) -- which comes after five years of litigation -- Apple admitted to no wrongdoing. Instead, the settlement refers to "unintentional" Siri activations that occurred after the "Hey, Siri" feature was introduced in 2014, where recordings were apparently prompted without users ever saying the trigger words, "Hey, Siri." Sometimes Siri would be inadvertently activated, a whistleblower [4]told The Guardian, when an Apple Watch was raised and speech was detected. The only clue that users seemingly had of Siri's alleged spying was eerily accurate targeted ads that appeared after they had just been talking about specific items like Air Jordans or brands like Olive Garden, Reuters noted. It's currently unknown how many customers were affected, but if the settlement is approved, the tech giant has offered up to $20 per Siri-enabled device for any customers who made purchases between September 17, 2014, and December 31, 2024. That includes iPhones, iPads, Apple Watches, MacBooks, HomePods, iPod touches, and Apple TVs, the settlement agreement noted. Each customer can submit claims for up to five devices.

>

> A hearing at which the settlement could be approved is currently scheduled for February 14. If the settlement is certified, Apple will send notices to all affected customers. Through the settlement, customers can not only get monetary relief but also ensure that their private phone calls are permanently deleted. While the settlement appears to be a victory for Apple users after months of mediation, it potentially lets Apple off the hook pretty cheaply. If the court had certified the class action and Apple users had won, Apple could've been fined more than $1.5 billion under the Wiretap Act alone, court filings showed. But lawyers representing Apple users decided to settle, partly because data privacy law is still a "developing area of law imposing inherent risks that a new decision could shift the legal landscape as to the certifiability of a class, liability, and damages," the motion to approve the settlement agreement said. It was also possible that the class size could be significantly narrowed through ongoing litigation if the court determined that Apple users had to prove their calls had been recorded through an incidental Siri activation -- potentially reducing recoverable damages for everyone.



[1] https://cdn.arstechnica.net/wp-content/uploads/2025/01/Lopez-v-Apple-Unopposed-Motion-for-Preliminary-Approval-of-Class-Action-Settlement-12-31-24.pdf

[2] https://arstechnica.com/tech-policy/2025/01/apple-agrees-to-pay-95m-delete-private-conversations-siri-recorded/

[3] https://cdn.arstechnica.net/wp-content/uploads/2025/01/Lopez-v-Apple-Proposed-Settlement-Agreement-12-31-2024.pdf

[4] https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings



Bwahahahahaha!! (Score:1)

by angryman77 ( 6900384 )

Yeah, that's entirely believable.

Sure, Apple knew nothing about it. (Score:2)

by wakeboarder ( 2695839 )

That's a good cover story, but they knew they were selling ads based on private conversations. Data is too valuable; companies can't resist the money they can make off of it.

The last class action I was in... (Score:3)

by Hey_Jude_Jesus ( 3442653 )

I received a gift card for $7.96 that could only be used at one online store. The attorneys received tens of millions.

Wow, $20? (Score:2)

by balaam's ass ( 678743 )

That's not even enough to buy a new (Apple-TM) dongle!

No wonder Apple's the #1 market cap company in the world.

Backups? (Score:4, Insightful)

by backslashdot ( 95548 )

Permanently deleted off backups too? Sounds very unlikely.

Peanuts (Score:3)

by devslash0 ( 4203435 )

The average payout will be $1.99, and affected people will get it as a credit in the Apple Store. Apple should be forced to clean up each and every instance of that leaked data. They should be forced to trace every 3rd party, and every 3rd party of a 3rd party, and ask for deletion all the way down. They should ensure every bit of that information is scrubbed and any AI models trained on it eradicated. They should be forced to do it and to fight any legal battles with 3rd parties to get it done. That would be the only real justice and the only real compensation for what they've done. Asking forgiveness rather than permission is just not good enough, and most companies will keep following this "permission model" until there are real consequences for the perpetrators, including jail time for directors.

Hacker's Guide To Cooking:
2 pkg. cream cheese (the mushy white stuff in silver wrappings that doesn't
really come from Philadelphia after all; anyway, about 16 oz.)
1 tsp. vanilla extract (which is more alcohol than vanilla and pretty
strong so this part you *GOTTA* measure)
1/4 cup sugar (but honey works fine too)
8 oz. Cool Whip (the fluffy stuff devoid of nutritional value that you
can squirt all over your friends and lick off...)
"Blend all together until creamy with no lumps." This is where you get to
join(1) all the raw data in a big buffer and then filter it through
merge(1m) with the -thick option, I mean, it starts out ultra lumpy
and icky looking and you have to work hard to mix it. Try an electric
beater if you have a cat(1) that can climb wall(1s) to lick it off
the ceiling(3m).
"Pour into a graham cracker crust..." Aha, the BUGS section at last. You
just happened to have a GCC sitting around under /etc/food, right?
If not, don't panic(8), merely crumble a rand(3m) handful of innocent
GCs into a suitable tempfile and mix in some melted butter.
"...and refrigerate for an hour." Leave the recipe's stdout in a fridge
	for 3.6E6 milliseconds while you work on cleaning up stderr, and
	by timeout your cheesecake will be ready for stdin.