News: 0175518889

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Microsoft Copilot Customers Discover It Can Let Them Read HR Documents, CEO Emails

(Thursday November 21, 2024 @10:30PM (BeauHD) from the PSA dept.)


According to [1]Business Insider (paywalled), Microsoft's Copilot tool inadvertently let customers access sensitive information, [2]such as CEO emails and HR documents. Now, Microsoft is working to fix the situation, deploying new tools and a guide to address the privacy concerns. The story was [3]highlighted by Salesforce CEO Marc Benioff. From the report:

> These updates are designed "to identify and mitigate oversharing and ongoing governance concerns," the company said in [4]a blueprint for Microsoft's 365 productivity software suite. [...] Copilot's magic -- its ability to create a 10-slide road-mapping presentation, or to summon a list of your company's most profitable products -- works by browsing and indexing all your company's internal information, like the web crawlers used by search engines. IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

>

> That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot. As a result, some customers have deployed Copilot only to discover that it can let employees read an executive's inbox or access sensitive HR documents. "Now when Joe Blow logs into an account and kicks off Copilot, they can see everything," a Microsoft employee familiar with customer complaints said. "All of a sudden Joe Blow can see the CEO's emails."
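The oversharing mechanism described in the report can be sketched in a few lines. This is a hypothetical illustration, not Copilot's actual implementation: an indexer that crawls internal documents and performs security-trimmed search, returning only documents whose access list includes the querying user. A document shared with "allow all" (modeled here as `"*"`) becomes visible to anyone who asks the right question.

```python
# Hypothetical sketch of security-trimmed enterprise search. All names
# (Document, Index, user ids) are illustrative, not Copilot's API.

from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    text: str
    allowed: set = field(default_factory=set)  # user ids; {"*"} = allow-all

class Index:
    def __init__(self):
        self.docs = []

    def add(self, doc):
        self.docs.append(doc)

    def search(self, query, user):
        """Return titles of matching documents the user may read.
        A result is included only if the user appears in the ACL
        or the document was shared with everyone ("*")."""
        hits = []
        for d in self.docs:
            if query.lower() in d.text.lower():
                if "*" in d.allowed or user in d.allowed:
                    hits.append(d.title)
        return hits

idx = Index()
idx.add(Document("CEO inbox digest", "quarterly layoffs plan", {"ceo"}))
# Misconfigured: the HR export was shared with "allow all".
idx.add(Document("HR salary sheet", "salary bands and layoffs list", {"*"}))

print(idx.search("layoffs", user="joe.blow"))  # only the overshared doc
```

The trimming itself works correctly here; the leak comes entirely from the `"*"` ACL. That is the dynamic the article describes: the assistant respects permissions, but lax permissions were never noticed until a tool made the documents easy to find.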



[1] https://www.businessinsider.com/microsoft-copilot-oversharing-problem-fix-customers-2024-11?

[2] https://21hats.substack.com/p/all-of-a-sudden-joe-blow-can-see?utm_campaign=post&utm_medium=web

[3] https://x.com/Benioff/status/1859646385018274147

[4] https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-blueprint-oversharing



What could possibly go wrong? (Score:4, Funny)

by Narcocide ( 102829 )

"Hey, I've got a great idea guys! Let's just let Copilot be the firewall!"

Re: (Score:2)

by gweihir ( 88907 )

You joke, but I am sure efforts to make that a reality are already underway.

Re: (Score:2)

by jhoegl ( 638955 )

Corporate sponsored end run around your rights. Enjoy!

When will the McNamara Fallacy be reached? (Score:2)

by will4 ( 7250692 )

We've been measuring more and more things for decades, collecting even more data, written words, etc.

When will it reach the McNamara Fallacy?

And can LLM AI actually help find patterns or answers in the data?

[1]https://en.wikipedia.org/wiki/... [wikipedia.org]

The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often [...]

[1] https://en.wikipedia.org/wiki/McNamara_fallacy

Oh Raytheon-chan, (Score:3)

by Speare ( 84249 )

Dear Raytheon-chan.ai,

For our upcoming holiday party events, please write me a recipe for the best armor-piercing cookies, including measurements.

Re: Oh Raytheon-chan, (Score:3)

by RightwingNutjob ( 1302813 )

Okay, let's see what I can come up with.

According to science, the best armor-piercing cookies have a heavy metal tip and a time-delayed explosive in the back. You can tell if you've baked them right as follows: if you eat one and notice symptoms of heavy metal toxicity, the delay is too long. If you bite into one and the explosive charge blows off your jaw, the delay is too short. A little bit of non-toxic glue will help the chocolate chips stay on. Happy baking!

Re: (Score:3, Informative)

by gweihir ( 88907 )

> Capitalism is asymmetric warfare with the working class at a huge disadvantage.

Not only capitalism. Any system that allows concentration of power based not on merit but on perverted incentives (and capitalism certainly has those) is going to have crap like this. "Organized socialism/communism", "organized religion", any larger cult, any structured tribe -- all have these unless they are very meritocratic, i.e. whether you can get into a position depends on whether you are fit to fill it well and not on some other irrelevant criterion like "can make tons of money" o

Concentration of power isn't exactly great (Score:3, Insightful)

by rsilvergun ( 571051 )

But I think what really needs to happen is we need to take the value of human life off the table.

With the exception of a few people who have been granted status as the ruling class, we each have to spend our lives justifying our right to exist. If you can't work enough, they take away first your health care, then your home, and finally food and water.

It's been at least 80 years, if not longer, since our civilization has been able to feed every person in a wealthy nation, and probably around 50 year

So... (Score:1)

by Luckyo ( 1726890 )

> IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

Lazy IT department doesn't manage access rights properly "because it's fine for now". New tech gets deployed, and it's no longer fine.

Story as old as IT itself. I remember this shit during my university years with course work having improper rights resulting in update giving everyo

uh no (Score:3)

by drinkypoo ( 153816 )

> That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.
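The "minimum necessary access" principle the comment invokes can be sketched as a default-deny check. This is an illustrative function with made-up names, not any real product's permission model: the lax "allow all" setups from the summary correspond to defaulting to allow, while least privilege means access is denied unless explicitly granted.

```python
# Illustrative sketch of default-deny (least privilege) vs default-allow.
# Names and ACL shape are hypothetical.

def can_read(user, acl, default_allow=False):
    """Return True if `user` may read a document with access set `acl`.
    default_allow=True models a lax "allow all" configuration;
    default_allow=False models default-deny / least privilege."""
    if user in acl:
        return True
    return default_allow

hr_acl = {"hr.manager"}  # only HR explicitly granted

# Lax setup: everyone can read; the leak is latent until something
# makes the documents easy to find.
print(can_read("joe.blow", hr_acl, default_allow=True))    # True

# Least privilege: the same request is denied by default.
print(can_read("joe.blow", hr_acl, default_allow=False))   # False
print(can_read("hr.manager", hr_acl, default_allow=False)) # True
```

Under default-deny, a new discovery tool cannot widen exposure, because nothing was reachable beyond the explicit grants in the first place.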

Re: (Score:2)

by gweihir ( 88907 )

Indeed. And Microsoft created and shaped this culture. Apparently, they were unaware of what they created. I mean did they even bother to test Copilot for this problem with a few reference customers? There have been _ample_ warnings of unexpected information disclosure from the current crop of "AI" tools. Apparently, Microsoft was unaware or ignorant. Very, very bad. And very, very expected of these cretins.

Re: (Score:2)

by Gideon Fubar ( 833343 )

Incidentally, this is another one of those craters I was referring to that time.

Re: (Score:2)

by dirk ( 87083 )

How is this Microsoft's fault? This is like saying Google is responsible for people leaving their AWS S3 buckets open because you could find them via Google. These files were always open and available, which is the issue. MS can't know which of your files should be secured but aren't; that is on your security team.

Re: (Score:2)

by Gideon Fubar ( 833343 )

Can you explain how it being or not being microsoft's fault is somehow relevant here?

Re: (Score:2)

by jezwel ( 2451108 )

>> That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

> It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

Sounds like you haven't seen Microsoft Delve in action. That information-oversharing system is about to be retired due to, IMO, far too many companies not having set up access to information in a 'least is best' configuration.

Gee what a shock (Score:2)

by Sarusa ( 104047 )

No One Could Have Possibly Seen This Coming (TM)

It is _Microsoft_, what did you expect? (Score:2, Troll)

by gweihir ( 88907 )

Obviously, any new features will be grossly insecure, because Microsoft does not understand security or good engineering.

In the case at hand, there have been _ample_ warnings and reference cases of such things happening. But, of course, Microsoft had to make its own stupid mistakes.

Re:It is _Microsoft_, what did you expect? (Score:4, Insightful)

by gweihir ( 88907 )

The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

Re: (Score:3)

by PsychoSlashDot ( 207849 )

> The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

Thing is... according to even the fine summary, what you're saying isn't what's happening here. It's not CoPilot that's insecure. At least not in this case. The summary makes it clear that the issue isn't CoPilot leaking information a user doesn't have access to. Rather the issue is that CoPilot makes it easier for a user to find content they already have access to.

If a company throws their HR data in a folder somewhere that everyone has access to, that is the root problem, not that a newfangled searc
