News: 0180182867

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Microsoft Warns Its Windows AI Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate' (itsfoss.com)

(Sunday November 23, 2025 @03:34AM (EditorDavid) from the game-of-Risks dept.)


"Copilot Actions on Windows 11" is currently available in Insider builds ( version 26220.7262) as part of Copilot Labs, [1]according to a recent report , "and is off by default, requiring admin access to set it up."

But maybe it's off for a good reason...besides the fact that it can access any apps installed on your system:

> In [2]a support document, Microsoft admits that features like Copilot Actions introduce "novel security risks." They warn about cross-prompt injection (XPIA), where malicious content in documents or UI elements can override the AI's instructions. The result? "Unintended actions like data exfiltration or malware installation."

>

> Yeah, you read that right. Microsoft is shipping a feature that could be tricked into installing malware on your system. Microsoft's own warning hits hard: "We recommend that you only enable this feature if you understand the security implications." When you try to enable these experimental features, Windows shows you a warning dialog that you have to acknowledge. ["This feature is still being tested and may impact the performance or security of your device."]

>

> Even with these warnings, the level of access Copilot Actions demands is concerning. When you enable the feature, it gets read and write access to your Documents, Downloads, Desktop, Pictures, Videos, and Music folders... Microsoft says they are implementing safeguards. All actions are logged, users must approve data access requests, the feature operates in isolated workspaces, and the system uses audit logs to track activity.

>

> But you are still giving an AI system that can "hallucinate and produce unexpected outputs" (Microsoft's words, not mine) full access to your personal files.

To address this, [3]Ars Technica notes, Microsoft added this helpful warning to its support document this week. "As these capabilities are introduced, AI models still face functional limitations in terms of how they behave and occasionally may hallucinate and produce unexpected outputs."

But Microsoft didn't describe "what actions they should take to prevent their devices from being compromised. I asked Microsoft to provide these details, and the company declined..."



[1] https://itsfoss.com/news/new-windows-ai-feature-can-be-tricked/

[2] https://blogs.windows.com/windowsexperience/2025/10/16/securing-ai-agents-on-windows/

[3] https://arstechnica.com/security/2025/11/critics-scoff-after-microsoft-warns-ai-feature-can-infect-machines-and-pilfer-data/



urg (Score:5, Insightful)

by MilenCent ( 219397 )

Tech companies: Security is such a huge priority that we'll load our software with power- and memory-wasting countermeasures that annoy the hell out of you. You may hate that two-factor authentication makes you grab your phone for a text message before you log into anything, but it's all in the name of security! You should learn to live with it, it's all for the best!

Also tech companies: It's so important to lard our work with generative AI features that a little security compromise is fine!

because.. (Score:5, Informative)

by fortunatus ( 445210 )

it's not an AI, it's an LLM being marketed as an AI. on the other hand, if it /were/ an actual AI, it could simply be /convinced/ to spy, steal and damage!

Sounds like an escape clause. (Score:2)

by SeaFox ( 739806 )

"Using Windows AI may cause data theft and malware risks, so don't come to us when it happens, you were warned (not that you had the choice to disable the AI...)"

Re: (Score:2)

by ZiggyZiggyZig ( 5490070 )

it's definitely legalese.

Obvious answer (Score:5, Informative)

by NotEmmanuelGoldstein ( 6423622 )

> ...what actions they should take to prevent their devices from being compromised.

Obviously, uninstall Windows. Because one can't uninstall AI crap-ware MS Recall and MS Co-pilot.

Re:Obvious answer (Score:4, Informative)

by Mr. Dollar Ton ( 5495648 )

Advice that is almost 30 years late, but welcome.

Interesting times (Score:3)

by gweihir ( 88907 )

And dangerous for dumb people. Remember that "malware installation" usually means lateral movement and then compromise of the whole organization these days, because AD security sucks and then it is often misconfigured on top of that.

I would not trust this on a hardened Linux with network access. Windows? Do you want to get hacked?

Also note that they only put that in there because the lawyers told them they had to. This means this technology represents a fundamental and systemic novel risk they do NOT have under control. The usual limitations of warranty are not enough: providing or using this feature would likely fall under gross negligence. The only way Microsoft can get out of the resulting liability is by explicitly warning its users that this is not a feature you can use securely and that result quality is very much not ensured. In other words, this is a very dangerous toy, not a professional product.

That they feel they need to add a warning with this uncommon level of clarity is very telling. I am sure all the MS fans and all the nitwits will still ignore it. So let me add this, because it will be relevant: We told you so.

There is, however, a strange, musty smell in the air that reminds me of
something...hmm...yes...I've got it...there's a VMS nearby, or I'm a Blit.
-- Larry Wall in Configure from the perl distribution