Microsoft's OneDrive spots your mates, remembers their faces, and won't forget easily
- Reference: 1760377728
- News link: https://www.theregister.co.uk/2025/10/13/microsoft_face_grouping_ondrive/
- Source link:
This writer has been enrolled in a OneDrive feature on mobile that groups photos by the people in them. We're not alone – others have also [1]reported it turning up on their devices.
[2]According to Microsoft, the feature is coming soon but has yet to be released – and it's likely to send a shiver down the spines of privacy campaigners.
It relies on users telling OneDrive whose face appears in a given image, and it will then create a collection of photos based on the identified person. Obviously, user interaction is required, and asking a user to identify faces in an image is hardly innovative. However, OneDrive's grouping of images based on an identified face is different. According to Microsoft's documentation, a user can only change the setting that enables or disables the new People section three times a year.
Screenshot from Microsoft document showing privacy settings for People section
The Register asked Microsoft why only three times, but the company has yet to provide an explanation.
[4]Nextcloud withdraws European Commission OneDrive bundling complaint
[5]To digital natives, Microsoft's IT stack makes Google's look like a model of sanity
[6]Outlook outage over North America, Microsoft scrambles to respond
[7]Word to autosave new docs to the cloud before you can even hit Ctrl+S
Unsurprisingly, Microsoft noted: "Some regions require your consent before we can begin processing your photos" – we can imagine a number of regulators wanting to discuss this. It took until July 2025 before Microsoft was able to make Recall available in the European Economic Area (EEA), partly due to how data was processed.
However, it is that seemingly arbitrary three-times-a-year limit applied to the People section that is most concerning. Why not four? Why not as many times as a user wants?
Turning it off will result in all facial grouping data being permanently removed within 30 days. There is also no indication of what Microsoft means by three times a year. Does the year run from when the setting is first changed, from when facial grouping was first enabled, or from some other date?
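Microsoft's documentation doesn't say, and the reading matters. Purely as an illustration – a hypothetical sketch, not Microsoft's implementation – here is how a calendar-year limit and a rolling 365-day limit diverge for the same toggle history:

from datetime import datetime, timedelta

# Hypothetical sketch only -- not Microsoft's code. It shows how "three changes
# a year" plays out under two different readings of "a year".
MAX_TOGGLES = 3

def can_toggle_calendar_year(toggles: list[datetime], now: datetime) -> bool:
    # Limit resets every 1 January: only toggles made this calendar year count.
    return sum(1 for t in toggles if t.year == now.year) < MAX_TOGGLES

def can_toggle_rolling_year(toggles: list[datetime], now: datetime) -> bool:
    # Rolling window: only toggles made in the past 365 days count.
    return sum(1 for t in toggles if t > now - timedelta(days=365)) < MAX_TOGGLES

history = [datetime(2025, 11, 1), datetime(2025, 12, 1), datetime(2025, 12, 20)]
now = datetime(2026, 1, 5)
print(can_toggle_calendar_year(history, now))  # True  - the limit has reset
print(can_toggle_rolling_year(history, now))   # False - still locked out

A user who burned all three changes in late 2025 would be free again within weeks under the first reading, but locked out until November 2026 under the second.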
This feature is currently in preview and has yet to reach all users. While Microsoft is clear that it won't use facial scans or biometric data in the training of its AI models, and that grouping data can't be shared (for example, if a user shares a photo or album with another user), the idea of images being used in this way might make some customers uncomfortable. ®
[1] https://hardware.slashdot.org/story/25/10/11/0238213/microsofts-onedrive-begins-testing-face-recognizing-ai-for-photos-for-some-preview-users
[2] https://support.microsoft.com/en-us/office/group-photos-by-people-21065f48-c746-48ad-a98a-cbe4631631bc
[4] https://www.theregister.com/2025/10/09/nextcloud_withdraws_ec_onedrive_bundling/
[5] https://www.theregister.com/2025/09/29/dont_even_consider_microsoft/
[6] https://www.theregister.com/2025/09/11/outlook_outage_microsoft/
[7] https://www.theregister.com/2025/08/27/microsoft_word_cloud_autosave/
Re: Nobody needs to change this setting more than ...
640K changes ought to be enough for anyone.
Re: Nobody needs to change this setting more than ...
After you signed the contract. Of course.
Re: Nobody needs to change this setting more than ...
It doesn't matter if you change the setting to sod off or do me. Given M$'s past behaviour, a "bug" will override your choice and turn the feature on whether you like it or not. Maybe M$ can staple it to my prostate right next to CoPilot
Re: Nobody needs to change this setting more than ...
Trust us Trust us. Trust us.
Done. Now that we have earned your trust, let us screw you over even more royally.
You have to lawyer out their claims
All of these terms and conditions are written by lawyers and riddled with loopholes in the exact same way that threat actors worm in and leave APTs in software.
Realistically, there's no way to parse out who is really doing what with your data, assuming they don't flagrantly violate their terms and conditions as big tech is often wont to do. After all, even if they get caught, what are you going to do about it?
How do you even compute damages for loss of privacy?
How do you even determine the scope of what is shared / lent / given / copied / grouped / sold / leased / teasingly flashed or whatever between companies?
You either trust a company or you don't, based on what they've been caught doing in the past.
Personally, I don't trust Microsoft further than I can throw their oddly-bloated founder who somehow bears an uncanny resemblance to the pregnant man emoji.
Nobody should trust Alphabet or Meta, at all, ever.
Ai Ai Ai Ai !!!
Nothing to worry about !!!
How else would MS get to test out their 'AI' unless you allow them total access to your 'Data' for their benefit !!!
Of course, the data is safe and would NEVER be misused !!!
Every day in Every way I feel Safer & Safer knowing that the Tech Behemoths & 'AI' are looking after my Best Interests !!!
:)
Re: Ai Ai Ai Ai !!!
Of course they're looking after your best interests. How else can they take them if they can't find them?
It might be someone else's photograph ...
but if it is my face, how do I opt out? I do not use any Microsoft software or service.
This should be opt in by the individuals in the pictures.
Re: It might be someone else's photograph ...
Well, you do not have standing.
At least, that will be the reason they give you – unless you prove that your photograph was misappropriated and misused.
Then they will say: go sue the person(s) who misappropriated your photograph.
All in all, Microsoft will say it is a Not-A-Microsoft-Problem situation and you should bother someone else.
Nothing
There is nothing you can do. Microshaft supplies governments so they are untouchable.
Eh
Does that mean that if you disable it, at some point it will reactivate?
To my mind, if you deactivate it, it should not activate again until you decide otherwise.
I particularly enjoy the fact that at 3 changes a year, mashing that setting will lock it in the 'on' state. Sneaky.
Dark Patterns..
https://hardware.slashdot.org/story/25/10/11/0238213/microsofts-onedrive-begins-testing-face-recognizing-ai-for-photos-for-some-preview-users#microsoft_spokesperson
Slashdot: What's the reason OneDrive tells users this setting can only be turned off 3 times a year? (And are those any three times — or does that mean three specific days, like Christmas, New Year's Day, etc.)
[Microsoft's publicist chose not to answer this question.]
... Which if MS management hadn't redacted it, probably would have said:
" Yes, we hear you've called this sort of thing a 'Dark Pattern'. How cute. Yes, we just don't want you turning it off, really. The more of you who do, the less dosh we get from [1]Mr. Ellison et al. "
[1] https://www.theregister.com/2025/02/12/larry_ellison_wants_all_data/
And the info....
...will be passed on to your local authoritative police force.
ICE have already wet themselves at the thought of this data.
Why three times a year? Because they pay the ingress cost and the compute costs, and they don't want people flicking the setting on/off/on/off. They already knew this would create resource issues in Azure, so they set the retention to 30 days anyway. I guess they don't do a good job of remembering which pictures have already been processed, and so are worried about thrashing the upload/compute each time it's toggled on/off/on. I bet the database backend 'delete user's data' request is heavy and slow, so they want to protect the backend. It's a crap UX and suggests there's a crap architecture/implementation behind it.
Apple have had this feature on iOS for years – I suspect the difference is that the 'people' metadata is stored on the user's device in Apple's implementation, not in some central creepy database (Microsoft's implementation).
You seriously believe that after so many Apple cloud scandals? They are not better. Just different.
Well, they at least " [1]think different ". Especially when it comes to knowing the difference between an adjective and an adverb.
[1] https://en.wikipedia.org/wiki/Think_different
No OneDrive not cry
Ob-observing the hypocrites
As they would mingle with the good data we fed
And when its gone
Everything is gonna be alright
Then shalt thee
s/ee/ou/
Yet another reason to never use OneDrive
Enough said.
How dialog box options have changed over the years
Yes | Ask again later | No
Yes | Ask again later
Yes | Ask again later (maximum 3 times)
And soon it'll be:
Yes
Re: How dialog box options have changed over the years
From the company that infamously redefined the [X] Close Window button to mean "Yes, please go ahead and perform the Windows 10 upgrade", what else would we expect?
No photos, please
I use Linux, so I'm not concerned about my own photos, but I will definitely think twice before letting anyone else take photos of me. I guess I have to start treating family like paparazzi.
OneDrive - no thanks
I have it because I have MS accounts. But I don't put any photos anywhere near it.
Mega
People, stop this nonsense and switch over to Mega, a file storage service similar to OneDrive but end-to-end encrypted. No A.I. bullshit and no one scrutinizing your files or feeding them to A.I.
https://www.mega.io
Re: Mega
Or [1]Cryptomator (open source) to encrypt the file client side before uploading to any cloud service.
[1] https://github.com/cryptomator
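For anyone who just wants the general pattern such tools implement – encrypt locally, upload only ciphertext, keep the key off the cloud – here is a minimal sketch using Python's cryptography package (Fernet). It illustrates the idea only; it is not Cryptomator's actual vault format, and the file name is hypothetical.

# Minimal client-side encryption sketch (pip install cryptography).
# The cloud provider only ever receives ciphertext; the key stays local.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_file(plain_path: Path, key: bytes) -> Path:
    # Write an encrypted copy alongside the original and return its path.
    token = Fernet(key).encrypt(plain_path.read_bytes())
    enc_path = plain_path.with_name(plain_path.name + ".enc")
    enc_path.write_bytes(token)
    return enc_path

def decrypt_file(enc_path: Path, key: bytes) -> bytes:
    # Recover the original bytes; raises InvalidToken if the ciphertext was tampered with.
    return Fernet(key).decrypt(enc_path.read_bytes())

key = Fernet.generate_key()                               # store safely, never in the cloud
encrypted = encrypt_file(Path("holiday_photo.jpg"), key)  # hypothetical local file
# 'encrypted' is what gets synced to OneDrive, Mega, or anything else;
# without the key, the provider cannot group, scan, or train on it.
original = decrypt_file(encrypted, key)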
Mac v Windows
Why would anyone who can afford ~£60/month for an interest-free Mac continue to use anything M$?
I would bet that if you turn it off and then back on again, it deletes and then re-creates the training data at great expense of electricity usage. Imagine if a few million (insert OneDrive user numbers joke here) people constantly turned it off and on again... MS datacentres would be glowing.
The joys of juxtaposition.
On one story today:
'Microsoft is clear that it won't use facial scans or biometric data in the training of its AI models'
Another:
'Microsoft 'illegally' tracked students via 365 Education, says data watchdog'
Your very own digital panopticon :o
Your very own digital Panopticon. The people building this don't seem to realize it will eventually be turned on them. Not that it matters as the IRS and the spooks already have access to your confidential records.
[1]The Digital Panopticon: Surveillance Technologies, Big Data, and the Crisis of Democratic Societies
[1] https://understandingai.iea.usp.br/nota-critica/the-digital-panopticon-surveillance-technologies-big-data-and-the-crisis-of-democratic-societies/
Just make sure you identify every face in every photo you let get near Onefart as being Satya Nadella.
Maybe even set up a few dummy accounts too.
Besties
Satya Nadella
The deliberate overreach
There are, I have no doubt, people who will gratefully use, even welcome, the core feature of this.
But what MS (and other big tech companies) do is add a whole lot of other stuff around it for no justifiable reason – like making it difficult not to use.
Microsoft's egregious spreading of this stuff is beyond belief – but the likes of Meta aren't far behind.
AI "features" are just the most recent and worst examples of this.
There's an argument for making AI stuff available to users. But the tech companies are trying to make it unavoidable. So yes, have face recognition in OneDrive or that silly AI symbol in WhatsApp. But that doesn't mean it has to be forced on to users.
It would be really helpful for El Reg to always embed a How To Turn It Off, right there in the article.
Same for Recall, for those of us who were unfortunate enough to be forced to run a work machine with Win11, with an IT dept that one doesn't trust to even know one of these features exists (I wish I was kidding, oh, how I wish). But also for the inevitable plethora of relatives who find themselves in the grip of Microsoft's enforced "opt-out-if-you-can" "enhancements".
I suggest putting it in the article because we all know that web searching has become useless and GPTs are often hopelessly out of date.
Nobody needs to change this setting more than ...
64 .... I mean three times ...