Stopping the rot when good software goes bad means new rules from the start

(2025/07/14)


Opinion The 21st century is turning out weirder than we thought. For the entire history of art, for example, tools could be used and abused and would work more or less well, but generally served the wishes and skills of the user. They did not plot against us. Now they can – and do.

Take the painter's palette. A simple, essential, and harmless tool – just don't lick the [1]chrome yellow, Vincent – affording complete control of the visual spectrum while being an entirely benign piece of wood. Put the same idea into software, and it becomes a thief, a deceiver, and a spy. Not a paranoid fever dream of an absinthe-soaked dauber, but the observed behavior of a [2]Chrome extension color picker. Not a skanky chunk of code picked up from the back streets of cyberland, but a Verified by Google extension from Google's own store.

This seems to have happened because when the extension was first uploaded to the store, it was as simple, useful, and harmless as its physical antecedents. Somewhere in its life since then, an update slipped through with malicious code that delivered activity data to the privacy pirates. It's not alone in taking this path to evil.

Short of running a full verification process on each update, this attack vector seems unstoppable. Verifying every update would be problematic in practice: to be any good, the process takes time and resources from both producers and store operators. Security and bug fixes need to ship swiftly, and many of the small utilities and specialized tools that make life better for so many groups of users may not have the means to cope with more onerous update processes.

You can't stop the problem at the source either. Good software goes bad for lots of reasons: a classic supply chain attack, developers selling out to a dodgy outfit or becoming dodgy themselves, or even a long-term strategy like deep cover agents waiting years to be activated.

What's needed is more paranoia across the board. Some of it already exists as best practice that just needs adopting; some needs to be created and mixed well into the way we do things now. Known good paranoia includes the principle of parsimony, which says to keep the number of things that touch your data as small as possible to shrink the attack surface. The safest extension is the one that isn't there. Then there's partition, like not doing confidential client work on a browser that has extensions at all. And there's due diligence: checking out developer websites, hunting for user reports, and actually checking permissions. This is boring, disciplined stuff that humans aren't good at, especially when tempted by the new shiny, and it is only partially protective against software rot.

So there needs to be more paranoia baked into the systems themselves, both the verification procedure and the environment in which extensions run. Paranoia that could be valuable elsewhere, too. Assume that anything could go bad at any point in its product lifetime, and you need to catch that moment – something many operators of large systems attempt with varying levels of success. It boils down to this: how can you tell when a system becomes possessed? How do you spot bad behavior after good?

In the case of demonic design tools, the sudden onset of encrypted data transfers to new destinations is a bit of a giveaway, as it would be in any extension that didn't have the ability to do that when initially verified. That sounds a lot like a permission-based ruleset, one that could be established during verification and communicated to the environment that will be running the extension on installation. The environment itself, be it browser or operating system, can watch for trigger activity and silently roll back to a previous version while kicking a "please verify again" message back to the store.
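
As a sketch of how that might hang together – every name below is illustrative, not any real store or browser API – the store records at verification time which hosts an extension may reach, the environment checks each outbound request against that list, and the first violation triggers the silent rollback and re-verification request:

```python
# Illustrative sketch only: a verification-time ruleset enforced by the
# runtime environment. No name here corresponds to a real browser or store API.
from dataclasses import dataclass, field
from urllib.parse import urlparse


@dataclass
class Ruleset:
    """Hosts the extension was allowed to reach when it was verified."""
    allowed_hosts: set[str]
    violations: list[str] = field(default_factory=list)


def request_is_allowed(rules: Ruleset, url: str) -> bool:
    """Check one outbound request against the verified ruleset."""
    host = urlparse(url).hostname or ""
    if host in rules.allowed_hosts:
        return True
    rules.violations.append(url)
    return False


def on_outbound_request(rules: Ruleset, url: str) -> None:
    if not request_is_allowed(rules, url):
        # Trigger activity spotted: quietly revert, then ask the store to look again.
        print("rolling back to last verified version")
        print("sending 'please verify again' to the store")


# A color picker verified with no network access at all:
picker = Ruleset(allowed_hosts=set())
on_outbound_request(picker, "https://exfil.example.net/beacon")
```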

The dividing line between useful and harmful behaviors is always contextual and no automation will catch everything. That doesn't mean a pinch more paranoia in the right places can't do good, especially where limits can be set early on and patrolled by automation.

If you're riding herd on corporate infrastructure, you'll know how far bad actors will go to disguise themselves, obfuscating egress traffic and making internal changes look routine when they're really rancid. The bad guys learn about the tools and skills that can defeat them as soon as you do, and there's no automation that can change that. Elsewhere in the stack, though, there's still room to provide more robust methods of setting and policing behavioral rules.

After all, a demon-possessed color picker dropping a rootkit that opens the door to ransomware injection will make your life just as unpleasant as anything grander. Paranoia wasn't invented in the 21st century, but it's never been more valid as the default way to think. ®



[1] https://cameo.mfa.org/wiki/Chrome_yellow

[2] https://www.theregister.com/2025/07/08/browser_hijacking_campaign/




Product liability

b0llchit

The real problem is that there is no product liability for those producing the software and the ones peddling it to the users. If Google and Apple were liable for what they distribute, then a lot of these problems would suddenly be much harder to slip through the net.

The rules should be simple: If you profit, directly or indirectly, you are liable. If you do not profit, then the full source code must be disclosed, downloadable and verifiable. Any build from that source must result in a reproducible and verifiable binary.

Re: Product liability - nice idea but...

Duncan Macdonald

Politicians take bribes - Google and Apple can easily buy enough politicians to ensure that this sensible idea never becomes law.

The only practical solution is for each new version to go through verification before it is allowed onto the Google or Apple stores. An automated process could check for significant changes and flag new versions with significant changes for manual checking. This would not be perfect - but it is better than the current situation.
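
A sketch of what that triage might look like, assuming the store can diff each submission's declared permissions against the last verified version (the permission names are invented for the example):

```python
# Illustrative sketch: auto-pass trivial updates, flag anything that widens
# an extension's access for a human reviewer. Permission names are invented;
# no real store works exactly this way.
SENSITIVE = {"network", "all_urls", "cookies", "history"}


def triage_update(old_perms: set[str], new_perms: set[str]) -> str:
    """Return a review decision for a newly submitted version."""
    added = new_perms - old_perms
    if not added:
        return "auto-approve"   # no new powers requested
    if added & SENSITIVE:
        return "manual-review"  # wants data or network access it never had
    return "spot-check"         # minor additions get a lighter look


# v1 was a pure color picker; v2 suddenly wants the network:
print(triage_update({"active_tab"}, {"active_tab", "network"}))  # manual-review
```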

It Isn't *Just* the Source You Need!

An_Old_Dog

You also need to know which assemblers, compilers, linkers, and interpreters were used in the production of the software, which versions of them, and which options were fed to them, or you will not be able to accurately reproduce the build.

If any of those tools are expensive or otherwise unavailable, you're hosed.

Re: It Isn't *Just* the Source You Need!

b0llchit

The "full source" includes instructions how to build and what tools you need. This is a topic discussed often and the keyword is that builds must be reproducible and verifiable. Therefore, you cannot hide how you made the program.

It is not possible that you need tools that are unavailable, because that would run afoul of the reproducibility and verifiability requirement.

That leaves expensive tools. Yes, that is possible, but unlikely when you cannot profit, directly or indirectly, from the program. Otherwise you'd be liable again. However, if a tool is prohibitively expensive, then you'd again run afoul of the reproducibility and verifiability requirement.

Re: It Isn't *Just* the Source You Need!

b0llchit

And, if you require any intermediary to have reproduced and verified the build before they are allowed to pass it on, then high-priced tools become slightly more academic. Imagine you have a tool that costs a megabuck and Apple/Google/Microsoft/whoever needs to have done the build and verified the output... I guess they will no longer peddle that particular program.
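
The rebuild-and-verify step this thread keeps circling could be as blunt as rebuilding from the disclosed source and comparing hashes; a sketch, with placeholder paths and a build script assumed to ship with the source:

```python
# Illustrative sketch: rebuild from disclosed source, then confirm the result
# is bit-for-bit identical to the distributed binary. The paths and build.sh
# script are placeholders, not a real project layout.
import hashlib
import subprocess
from pathlib import Path


def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def verify_reproducible(source_dir: Path, published_binary: Path) -> bool:
    # The disclosed source must carry its own build instructions;
    # here they are assumed to live in ./build.sh.
    subprocess.run(["./build.sh"], cwd=source_dir, check=True)
    rebuilt = source_dir / "dist" / published_binary.name
    return sha256(rebuilt) == sha256(published_binary)


if __name__ == "__main__":
    ok = verify_reproducible(Path("extension-src"), Path("extension-v2.bin"))
    print("verified" if ok else "binary does not match disclosed source")
```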

Re: Product liability

Ian Johnston

Likewise, organisational IT departments need to be held responsible for everything put in front of users and everything which happens as a result. Every phishing link which makes it through: IT carries the can. Any damage done when a user clicks on a link IT allowed through: IT carries the can. Blaming the users is sheer laziness.

Operating systems are software too...

Mentat74

Even though the article only talks about browser extensions going bad... I can smell the stench coming off of Windows getting stronger every day...

jonha

I never ever use Google's stores (Android or browser extensions); I don't even have a Google account on my phones/tablets. I use a few trustworthy stores (F-Droid and similar) and I sideload everything, after checking with two malware checkers and a period of quarantine. The only browser extension I use is uBlock Origin on Pale Moon and Firefox (actually LibreWolf). I accept that all this requires a bit more effort, but one advantage of apk sideloading is that I can update the zoo of phones and tablets I'm admin'ing with a single adb job.

I am not saying that one can totally avoid these sorts of threats but there are some ways to minimise the attack surface.

Filippo

It sounds like you're describing a system where I can control what data or services each software component gets access to. I would love something like that, and I believe it could be created without being excessively obtrusive - something just a bit more granular than the average phone's permissions system would be enough, and most people can already manage that. The malicious color picker would cause the browser to raise an alert as soon as it tried to access data outside the current web page, or send data anywhere.

Unfortunately, I don't believe this is going to happen as long as the largest tech companies benefit massively from us not knowing what data or services their software has access to.

"Short of running a full verification process on each update"

Pascal Monett

And, pray tell, why is that not happening?

Does Google not have enough computing power to make its portal secure? Does it not have enough money for that?

It doesn't give a flying fuck is the problem.

Once upon a time, we didn't have the luxury of downloading updates. We had to wait for the next month's magazine CD to deliver an update to us (if we were lucky). A magazine that we had to pedal uphill to get, and then pedal uphill to go back home and install. That's why our programs worked, because they had to.

Mumble, mumble, kids these days . . .

Do not underestimate the power of the Force.