
Microsoft stays mum about M365 Copilot on-demand security bypass

(2025/08/21)


UPDATED Microsoft has chosen not to tell customers about a recently patched vulnerability in M365 Copilot.

The issue allowed M365 Copilot to access the content of enterprise files without leaving a trace in corporate audit logs.

To do this, a malicious insider just had to ask M365 Copilot to summarize a company file without providing a link to it, explained Zack Korman, CTO of cybersecurity firm Pistachio, in a [1]blog post this week.

Your audit log is wrong, and Microsoft doesn’t plan on telling you

Korman wrote that on July 4th, 2025, he discovered that he could prevent M365 Copilot from logging file summary interactions simply by asking.

"Given the problems that creates, both for security and legal compliance, I immediately reported it to Microsoft through their MSRC portal," he blogged.

"And while they did fix the issue, classifying this issue as an 'important' vulnerability, they also decided not to notify customers or publicize that this happened. What that means is that your audit log is wrong, and Microsoft doesn’t plan on telling you that."

Microsoft just last year [5]started reporting Cloud Service CVEs when patching is not required. But the company said it would only issue CVEs for vulnerabilities deemed "critical," a policy Google Cloud also [6]adopted last year. As this flaw was rated merely "important," the Windows biz fixed it a few days ago without informing customers.

[7]According to Microsoft, Copilot relies on the Microsoft Graph and semantic indexing when generating file summaries. It may be that when no file link was generated, the access checks that trigger audit events were never invoked, because the model already had the content via the semantic index. We can only speculate, however, since Microsoft has said nothing on the matter.
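That speculated failure mode — an audit write that lives on the same code path that emits the file link, so suppressing the link also suppresses the record — can be illustrated with a deliberately simplified toy model. Every name below (the function, the fields, the file path) is hypothetical; this models the behavior as described, not Microsoft's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

def summarize(file_id: str, include_reference: bool, audit: AuditLog) -> str:
    """Toy model of the flaw: the audit entry is written on the same
    branch that emits the source link, so asking for a summary with
    no link also skips the log write."""
    summary = f"summary of {file_id}"
    if include_reference:
        # Link requested: the access is logged and the source is cited.
        audit.entries.append({"op": "FileAccessed", "file": file_id})
        return f"{summary} (source: {file_id})"
    # No link requested: content is still returned, nothing is logged.
    return summary

audit = AuditLog()
summarize("finance/q3-results.docx", include_reference=False, audit=audit)
print(len(audit.entries))  # 0 — the file was read, yet no trace remains
```

The fix, in this toy framing, is simply to log the access unconditionally, before any decision about whether to surface a link.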

[8]Amazon quietly fixed Q Developer flaws that made AI agent vulnerable to prompt injection, RCE

[9]Perplexity's Comet browser naively processed pages with evil instructions

[10]GSA launches AI sandbox, says it won't be around for long

[11]Open the pod bay door, GPT-4o

According to Korman, another person had already informed Microsoft about the vulnerability: Michael Bargury, CTO at Zenity, an AI security biz.

Bargury discussed the issue [12]at the Black Hat security conference in August 2024. He demonstrated how M365 Copilot security controls could be bypassed using a jailbreak technique that involves appending caret characters to the model's prompt.

But according to Korman, Microsoft didn't bother with a fix until Korman reported the problem last month. He argues that the issue was so trivial to exploit that Microsoft needs to disclose it.

"It might be okay to move on silently if this was some esoteric exploit, but the reality is that it is so easy that it basically happens by accident," he said. "If you work at an organization that used Copilot prior to August 18th, there is a very real chance that your audit log is incomplete."

Security researcher Kevin Beaumont said in a [14]Mastodon post that he's seen some progress here: until about a year ago, Microsoft didn't disclose any repairs to customer-facing cloud vulnerabilities.

But he argues that cloud providers should be more transparent.

"My feeling is still there needs to be extreme pressure from major governments that all cloud providers they use disclose all cloud service [vulnerabilities] as CVEs as part of their contracts – e.g. DoD, NHS – or no signing for new services," he said.

Microsoft did not respond to requests for comment. ®

UPDATED AT 00:30 UTC, AUGUST 21 After publication, a Microsoft spokesperson offered the following statement: “We appreciate the researcher sharing their findings with us so we can address the issue to protect customers.”

That comment did not directly address The Register's questions.




[1] https://pistachioapp.com/blog/copilot-broke-your-audit-log


[5] https://msrc.microsoft.com/blog/2024/06/toward-greater-transparency-unveiling-cloud-service-cves/

[6] https://cloud.google.com/blog/products/identity-security/google-cloud-expands-cve-program

[7] https://learn.microsoft.com/en-us/microsoftsearch/semantic-index-for-copilot

[8] https://www.theregister.com/2025/08/20/amazon_quietly_fixed_q_developer_flaws/

[9] https://www.theregister.com/2025/08/20/perplexity_comet_browser_prompt_injection/

[10] https://www.theregister.com/2025/08/20/brandnew_government_ai_sandbox_only/

[11] https://www.theregister.com/2025/08/20/gpt4o_pod_bay_door/

[12] https://www.youtube.com/embed/FH6P288i2PE?start=643


[14] https://cyberplace.social/@GossiTheDog/115057971682853198





If it looks like spyware

VoiceOfTruth

>> The issue allowed M365 Copilot to access the content of enterprise files without leaving a trace in corporate audit logs.

Yeah.

Memo from the American Regime to MS: can you exfiltrate a load of information without leaving a trace in audit logs? MS response: Easy.

"could be bypassed [by] appending caret characters to the model's prompt

elDog

Of course no QA testing would have uncovered this. Microsoft can't possibly test every random UTF-8 character injection. That's why we have foreign (and local) agents doing our security/pen testing.

It does remind me of coding a secret (^A) code into a "master-mode" application at my uni, since I was too lazy/stupid to go through some stupid challenge/response.

Surprise!

IGotOut

AI leaks data.

Company covers up issue in hope line still goes up.

Can we bring neutron bombs back? Then we can at least house the homeless when we nuke these companies.

Plest

Finance industry regs, as just one example, are very stringent about audit logs being as accurate as possible. Lawyers might smell blood in the water.

Integrity has no need for rules.