Memory is running out, and so are excuses for software bloat
- Reference: 1766498712
- News link: https://www.theregister.co.uk/2025/12/23/memory_software_opinion/
As memory prices continue to rise, it is time engineers reconsidered their applications' and toolchains' voracious appetite for memory. Does a simple web page really need megabytes to show a user the modern equivalent of Hello World? Today's Windows Task Manager executable occupies 6 MB of disk space. It demands almost 70 MB before it will show a user just how much of a memory hog Chrome is these days. [1]The original weighs in at 85 KB on disk. Its successor is not orders of magnitude more functional.
Those who remember effective software running in kilobytes rather than gigabytes have long shaken their heads at the profligate ways of modern engineering. But as tech progress marched on and memory densities seemed destined to increase without end, protesting about bloat felt a lot like "old man yells at cloud."
Enter the AI boom. As the world races to pack datacenters full of computing gear, memory prices [3]have rocketed in recent months and currently show no signs of returning to levels where a developer could shrug and bolt on another multi-megabyte framework to meet an arbitrary user requirement.
[4]Server prices set to jump 15% as memory costs spike
[5]Cheaper 1 GB Raspberry Pi 5 lands as memory costs go through the roof
[6]Commodity memory prices set to double as fabs pivot to AI market
[7]Memory boom-bust cycle booms again as Samsung reportedly jacks memory prices 60%
Developers should consider precisely how much of a framework they really need and devote effort to efficiency. Managers must ensure developers have the space to do so. The energy spent securing a toolchain should go into checking its efficiency too.
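What checking efficiency can look like in practice, as a minimal sketch (assuming a POSIX system; note that getrusage() reports ru_maxrss in kilobytes on Linux but in bytes on macOS):

```c
/* Minimal sketch: ask the OS how much resident memory this process
   peaked at. Assumes a POSIX system; ru_maxrss is kilobytes on Linux,
   bytes on macOS, so consult your platform's man page. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>

int main(void)
{
    size_t bytes = 64u * 1024 * 1024;   /* allocate ~64 MB so there is something to measure */
    char *buf = malloc(bytes);
    if (buf == NULL)
        return 1;
    memset(buf, 1, bytes);              /* touch the pages so they become resident */

    struct rusage ru;
    if (getrusage(RUSAGE_SELF, &ru) == 0)
        printf("peak resident set: %ld (see man page for units)\n", ru.ru_maxrss);

    free(buf);
    return 0;
}
```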
It is often joked that the memory and computing power that enabled humans to land on the Moon compare poorly to those of a modern smartphone. However, it is not so very long ago that perfectly usable applications and operating systems ran from floppy disks on devices with RAM measured in kilobytes rather than megabytes.
Reversing decades of application growth will not happen overnight. It requires a change of thinking and a different outlook. Toolchains must be rethought, and rewards should be given for compactness, both at rest and in operation.
In the 1970s, a shortage of energy spurred efficiency. In the 2020s, a shortage of computer memory might finally result in software that doesn't fill every byte with needless fluff. ®
[1] https://www.theregister.com/2025/11/12/thirty_years_of_task_manager
[3] https://www.theregister.com/2025/11/19/commodity_memory_price_rise/
[4] https://www.theregister.com/2025/12/04/server_prices_15_percent_jump_memory_costs/
[5] https://www.theregister.com/2025/12/01/raspberry_pi_5_1gb/
[6] https://www.theregister.com/2025/11/19/commodity_memory_price_rise/
[7] https://www.theregister.com/2025/11/14/samsung_price_jump/
Re: Lovely idea - no chance of it ever happening
Like you, I think the article's sentiment around memory should be extended to software efficiency in general.
You only have to look at Notepad.exe as a good example of something that's now considerably larger and slower, and delivers little benefit in return for the bloat (perhaps we didn't need to rewrite it to require a .NET dependency just to get tabs? etc.)
I grew up trying to get a program to fit in the boot sector of an Amiga floppy disk - an era when people tried to wring every last ounce of performance out of a system.
Some of that still goes on in certain circles - like where there's a cost saving to be made (many years ago Google did a decent bunch of work around optimising webpage efficiency, reducing bloat, best practice for images etc - I guess the result is that the "cost" was shunted to the client, but serving the content at scale became significantly cheaper). More often than not the general idea is to do something good enough more quickly so that revenue can be recognised. Compromises are made so that things can be delivered at scale more quickly - e.g. microservices can be seen as a way to manage people.
Ignoring what we might think about AI - a lot of the code it produces is inefficient and usually benefits from significant refactoring (which you can use AI to do, but I digress) - AI slop does fall into the "good enough" boat in order to get that precious revenue.
All of that is to say that software inefficiency is not a technical problem, but a management issue.
Microsoft's fault...
Doubly so: first the inefficiency in Windows 11, Defender, and the browser, and then the stealing of our RAM for their AI crap, which needs even more RAM on our local machines even though we did not ask for it. (Server versions of Windows are still quite clean and lean in that regard, but Server 2025 shows a bit more memory hunger than 2022.)
Re: Microsoft's fault...
Honestly, I want to give you a hundred thumbs up: spot on with the AI crap. Most of us don't need it and don't want it, so why is it running on our systems, consuming not just RAM but CPU cycles too? Why make it so hard to disable the darn stuff?
Although... M$ isn't the only company pushing bloatware and unnecessary functionality such as AI. Google are doing so, too.
Of course Linux users
Will simply ask "What's this all about then ?"
My latest Mint install is still running perfectly on a 4 GB laptop from last decade.
Re: Of course Linux users
I'm afraid the Linux world doesn't get away with it either. I have noticed a steady increase in bloat in unixy software. Much of it seems to stem from the massive dependency lists many applications require. Does that text editor REALLY need to bring in two dozen dependencies? Probably to use just one or two features from each. And before someone points out that they are shared libraries and so used by lots of stuff, that's not the point. The point is laziness - why write a small function to do exactly what you want (and only what you want) when you can bring in megabytes of other crap instead?
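To make that point concrete with a deliberately trivial sketch (the function here is illustrative, not from any particular library): the kind of one-job helper that once famously shipped as an entire package is a dozen lines of C if you write exactly what you need.

```c
/* Illustrative sketch: a hand-rolled "left pad", the sort of one-job
   helper that has shipped elsewhere as a whole dependency. Pads s on
   the left with c until it is at least width chars long, writing the
   result into out (capacity outsz). */
#include <stdio.h>
#include <string.h>

static void left_pad(char *out, size_t outsz, const char *s,
                     size_t width, char c)
{
    size_t len = strlen(s);
    size_t pad = (len < width) ? width - len : 0;
    if (pad >= outsz)                    /* never overflow the destination */
        pad = outsz - 1;
    memset(out, c, pad);
    snprintf(out + pad, outsz - pad, "%s", s);   /* copy what fits */
}

int main(void)
{
    char buf[32];
    left_pad(buf, sizeof buf, "42", 5, '0');
    printf("%s\n", buf);                 /* prints 00042 */
    return 0;
}
```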
Wasting RAM wasting Cache
Performance in CPUs depends on the cache. Wasting RAM is also wasting cache, which in turn is wasting performance.
However, I think there is an opposite trend: fast SSDs everywhere make it less necessary for programs to load data from storage into RAM up front.
That should at least flatten out the need for RAM a bit.
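A sketch of the cache point, for the curious (array size and layout are illustrative; absolute timings depend entirely on the machine): both walks below do identical arithmetic over the same 512 MB, but the column-order walk strides 64 KB between consecutive touches and throws the cache away.

```c
/* Sketch: identical work, wildly different cache behaviour.
   8192 x 8192 doubles = 512 MB, so this assumes a machine with RAM
   to spare; shrink N to taste. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 8192

static double walk(const double *a, int by_rows)
{
    double sum = 0.0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            sum += by_rows ? a[i * N + j]    /* sequential: cache-friendly */
                           : a[j * N + i];   /* 64 KB stride: cache-hostile */
    return sum;
}

int main(void)
{
    double *a = calloc((size_t)N * N, sizeof *a);
    if (a == NULL)
        return 1;
    for (int rows = 1; rows >= 0; rows--) {
        clock_t t0 = clock();
        volatile double s = walk(a, rows);   /* volatile: keep the work */
        (void)s;
        printf("%s-order walk: %.2f s\n", rows ? "row" : "column",
               (double)(clock() - t0) / CLOCKS_PER_SEC);
    }
    free(a);
    return 0;
}
```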
Re: Wasting RAM wasting Cache
On the contrary: quite a number of SSDs, especially cheaper ones, use a bit of the system RAM as cache to avoid having to overwrite the same blocks several times just 'cause one bit changed (there was a Reg article about issues with that and Windows 11, I cannot find it right now). The better SSDs have that bit of RAM on their own board.
Yeah, not gonna work.
The bloat isn't likely to go anywhere.
All we can really hope for is the idiotic AI bubble to pop soon. It's doing nothing but increasing misery.
This argument has been going on for at least the thirty years I've been in the industry. I have to agree with those earlier posters that those who call the shots seem to care about time to market more than anything else. So nothing will change.
Ironic considering that just yesterday The Register had...
... [1]an article berating Linux desktop users for not "using Flatpaks, Snaps, and AppImages to install programs instead of worrying about library incompatibilities and the like".
Well, here's the thing. Linux Mint includes both "System Package" and Flatpak versions of the Gnome Calculator app.
The system package version is 7 MB. Which would have been ludicrous for a calculator app back in the day, but is small by modern standards.
The Flatpak version is "1.1 GB to download, 3.6 GB of disk space required". For something we can assume is still basically the same 7 MB calculator app.
Yes, that's disk space rather than RAM, but it illustrates the principle regardless.
Supposedly Flatpak gets more efficient at using space as more packages are installed and it reuses duplicate files, but that's still nothing short of horrendous.
[1] https://www.theregister.com/2025/12/22/what_linux_desktop_really_needs/
Re: Ironic considering that just yesterday The Register had...
I would love to see a breakdown of exactly what a calculator application does with that 1.1 GB.
Or a web browser that uses 100 MB of memory before you actually bring up a web page (and then uses 30 MB per page, or whatever it is).
It's ludicrous - but you already know that.
"long shaken their heads at the profligate ways of modern engineering"
The real problem is that it's not engineering -- it's clusterfudging. Software development ceased to be engineering when the microcomputer took over from the mainframe and mini. Those were programmed by experts aware that, on time sharing systems, anyone who crashed the machine would be seriously unpopular with all other users. Plus they worked inescapably very near the metal so they understood the technical implications of their code. The "micro revolution" was, however, driven mainly by self-taught kids in back bedrooms who had unlimited enthusiasm but neither the ethics nor the technical mindset of the engineering discipline. (I know, I was there, but was fortunate to have had a scientific training which imposes the same discipline).
The parsimonious use of memory at that time was not a matter of judgement, or even choice. It was forced on those writing code by the cost of memory (e.g. £1.60 + 15% sales tax per kilobyte from Watford Electronics in August 1982). So it was done, but without any fundamental principle that would stick when memory became more plentiful and cheaper. Unfortunately, by virtue of the commercial success of the resultant negligent approach, there has never been any incentive to professionalise micro software development. Indeed the opposite has to a great extent occurred -- witness the deprecation of C as a "hazardous" language in favour of newer languages that prevent the making of basic coding errors -- seemingly eliminating the need to pay strict attention to what one is coding.
The details may be open to argument, but the basic truth exists that software development is not yet an engineering discipline but absolutely must become one. Not only bloat but fragility and vulnerability have reached utterly unacceptable proportions given the extent to which we rely on software to keep our societies running and safe. In all established branches of engineering (even down to gas fitting and plumbing) there are formally ratified mandatory standards that must be met. We need the same for software development in any domain where personal privacy, business security, livelihoods or lives could be affected by inadequate code. And almost inevitably, such standards would drive down bloat, as excess complexity is itself a primary source of the relevant hazards. Bluntly, we have to train would-be software developers to consider carefully (and feel responsible for) the implications to the end user of what they develop -- that's the primary principle of the engineering mindset.
Re: "long shaken their heads at the profligate ways of modern engineering"
Hmmm... Inquiring minds want to know if AI can be led into such software design and implementation...
Maybe the answer to soaring RAM prices is to use less of it !!!
Yes ... 10000000000000+++ times !!!
Sick of bloated software that is insecure, 'plays badly with others', spies on everything for 'reasons' and is upgraded, in 9 months, to something 'Better' that is totally different in UI & functionality terms because 'now we must include 'AI' or whatever is flavour of the month' !!!
Back in the day I wrote software in Assembler to fit the small amount of memory or to maximise speed !!!
Hard work but extremely satisfying as you had to know what you were doing to get it to work ... and it did !!!
:)
Re: Maybe the answer to soaring RAM prices is to use less of it !!!
/me smugly points out that I am somewhat masochistically designing a FAT32 system to work with CompactFlash on a 2 MHz 65c02 with a whole 32 kB of RAM to play with!
(And in response to the Watford Electronics prices, I still remember the shock of buying two 1k by 4 memory chips from Technomatic in 1978... for a tenner each.)
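For anyone tempted by the same masochism, the on-disk bookkeeping FAT32 actually requires is pleasingly small. A sketch of the core of it, in C for legibility (the real thing would be 65c02 assembly; the field offsets follow the published FAT32 BPB layout, the names and structure are mine, and the CompactFlash sector I/O and error handling are left out):

```c
/* Sketch: the handful of FAT32 fields read from the volume's first
   sector, plus the cluster-to-sector arithmetic. Sector numbers are
   relative to the start of the partition. */
#include <stdint.h>

struct fat32_vol {
    uint32_t fat_start;        /* first sector of the FAT            */
    uint32_t data_start;       /* first sector of the data region    */
    uint32_t root_cluster;     /* root directory cluster, usually 2  */
    uint8_t  sec_per_cluster;
};

/* Little-endian loads, byte by byte: a 65c02 has no wide reads anyway. */
static uint16_t rd16(const uint8_t *p) { return p[0] | (uint16_t)p[1] << 8; }
static uint32_t rd32(const uint8_t *p) { return rd16(p) | (uint32_t)rd16(p + 2) << 16; }

/* Parse the BPB out of a 512-byte copy of the boot sector. */
static void fat32_mount(struct fat32_vol *v, const uint8_t *bpb)
{
    uint16_t reserved = rd16(bpb + 14);   /* BPB_RsvdSecCnt */
    uint8_t  num_fats = bpb[16];          /* BPB_NumFATs    */
    uint32_t fat_size = rd32(bpb + 36);   /* BPB_FATSz32    */

    v->sec_per_cluster = bpb[13];         /* BPB_SecPerClus */
    v->fat_start       = reserved;
    v->data_start      = reserved + num_fats * fat_size;
    v->root_cluster    = rd32(bpb + 44);  /* BPB_RootClus   */
}

/* Clusters are numbered from 2; map one to its first sector. */
static uint32_t cluster_to_sector(const struct fat32_vol *v, uint32_t cluster)
{
    return v->data_start + (cluster - 2) * v->sec_per_cluster;
}
```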
Re: Maybe the answer to soaring RAM prices is to use less of it !!!
Please, please, please do not consider this in any way, shape or form a political statement or endorsement but:
Make Assembly Great Again
Become a Linux terminal wizard, then: some of the most memory-efficient programs run text-only from the Linux terminal.
Call it Gary's theory
feature bloat × levels of abstraction = huge programs
I'm trying to figure out what's going to happen if the RAM panic continues. Do we just surrender personal computing to the lizard men of California, who can't optimise their software AND bottleneck the supply of memory to run their bullshit?
You'll be buying 32G off of street dealers.
Gonna get expensive!
"it is time engineers reconsidered their applications and toolchains' voracious appetite for memory."
Most of us do already - at least those of us who remember being 'king of the hill' because one's 286 AT clone had a whole 2 MiB of DRAM. At work, things were of a more embedded nature and we were constantly having to refactor code to fit things into the available space... all 8 KiB of it. With the more mature embedded stuff, the code reduction required to make enough space for the fix/new feature could sometimes be harder than the fix itself!
As an old fart who was used to counting the bytes, I've often wondered how the 'memory bloat' introduced by the OS/Tools/Runtimes etc. could be tolerated. The answer is, of course, 'plug in more RAM!'. ISTR that RAM was about £10/MiB at the time.
The idea that RAM is a finite resource appears to have dropped out of the syllabus :-) Think yourselves lucky - on occasion one had to count the CPU cycles used by each machine instruction if things were time-sensitive.
It's a shame, but bloatware is here to stay because generally speaking they don't teach people to design code any more.
When I went to uni (1988-91) my Comp Sci degree was highly theoretical - Comp Sci was new and many of the Faculty were mathematicians. So we learned all about data structures, algorithm complexity, that kind of thing. And we had to write frugal code, because in those days we were working in Modula-2 on Mac desktops with 4MB RAM and in C on Sun-3 shared systems with 32MB or 64MB.
I remember competing in the BCS's annual programming competition back then, too: each team was given a PC with a copy of Quick-C and you had to keep it small and not bust the "small" memory model which if memory serves was something like 640KB. Taught you to think about the algorithm and not just throw a highly recursive, clunky monster at it and hope, because the judges (of which I later became one) would see that coming and would have test cases that would make the code bust the RAM limit. I once set a question (Sudoku solver) which did that, for that precise reason - if you brute-forced it, you'd blow up, so you had to write a vaguely clever algorithm.
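For illustration, the memory-frugal shape of such a Sudoku solver is presumably plain backtracking: the entire working state is the 81-cell grid plus a recursion never more than 81 frames deep. A sketch (mine, not the competition's reference solution):

```c
/* Sketch: recursive backtracking Sudoku solver. Working memory is the
   grid itself plus at most 81 stack frames - no generate-and-test
   explosion. */
#include <stdio.h>

/* Can value v legally go at row r, column c? */
static int ok(const int g[9][9], int r, int c, int v)
{
    for (int i = 0; i < 9; i++)
        if (g[r][i] == v || g[i][c] == v ||
            g[r / 3 * 3 + i / 3][c / 3 * 3 + i % 3] == v)
            return 0;
    return 1;
}

static int solve(int g[9][9], int pos)
{
    if (pos == 81)
        return 1;                      /* every cell filled: solved */
    int r = pos / 9, c = pos % 9;
    if (g[r][c] != 0)
        return solve(g, pos + 1);      /* given clue: skip over it */
    for (int v = 1; v <= 9; v++)
        if (ok(g, r, c, v)) {
            g[r][c] = v;
            if (solve(g, pos + 1))
                return 1;
            g[r][c] = 0;               /* undo, try the next digit */
        }
    return 0;                          /* dead end: backtrack */
}

int main(void)
{
    int g[9][9] = {{0}};               /* empty grid: any valid fill will do */
    if (solve(g, 0))
        for (int r = 0; r < 9; r++, putchar('\n'))
            for (int c = 0; c < 9; c++)
                printf("%d ", g[r][c]);
    return 0;
}
```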
I flinch when someone tells me they're a "coder". There are many, many extremely good software engineers in this world, but they're a dying breed because modern technology saves us from ourselves when we write bloated, inefficient code, and so the need to actually design code properly is vastly reduced compared to 30 years ago. There are too many people who can write programs, but not good ones.
Incidentally, in the BCS competition each team had one PC. You were given a bunch of questions and you shared them out, designed the solutions with pen and paper, and then took your turn on the PC to bash in the code. I wonder how many people do that today.
:-)
Not all optimisation in software engineering has been about resource efficiency
There are other concerns that the commissioners of software have long cared about, and they need to be considered too. Yet it's fashionable amongst us older devs to pretend that back in my day we had to lick t' road clean before fatha would even let us go to school, and we made every byte count, etc. etc.
The problem is, if you cast your mind back to software development in the 70s and 80s and even 90s... it was fucking awful. The tools were primitive. There wasn't a lot in the way of useful abstraction, which meant stuff took ages to develop, was usually fragile and tied to specific bits of hardware, and riddled with the sorts of potential security bugs that would have you hung up by the foreskin in modern-day computing (but fortunately back then everything was air-gapped).
Not only did stuff take ages to develop, it didn't actually do anything much that everyone takes for granted these days. Software almost certainly wouldn't have worked with any other character set than US ASCII. It wouldn't have handled RTL script. It very likely didn't have any undo feature, or any cut and paste or global clipboard. Your fonts would have been shitty bitmaps on a low-res monochrome screen instead of beautifully rendered glyphs in 32 bit colour rendered slightly larger on your 4K monitor for your tired miserable old eyes. It might not, if you go far enough back, even have had a GUI. It won't have been able to even access more than 2GB of RAM until it got a 64 bit OS. It would likely have just crashed when it ran out of physical memory. All of these things have been added and demonstrably made using software vastly better than it used to be... and it's all cost space. Everything's come at a cost in space, and CPU power.
And while it was taking ages to develop, very few people had much of the required patience and autistic attention to detail to actually do it properly, and so they commanded a very high price, which they charged for a very long time, and this vexes people who pay for these things to get developed, so they're very keen to make it a) easier and therefore less of an exclusive and hence expensive club and b) quicker so it costs even less to make and gets to market faster. And these two last drivers of market forces are the full force of what's driven software development for the last ... 3 decades or so? Make it quicker. Make it cheaper.
It's still vastly cheaper to buy RAM than it is to optimise software. Vastly, vastly cheaper. Even at today's slightly higher prices. And... I'm fine with that, because I can concentrate on the first, and most difficult, bit of software development - making it work - for longer before I have to worry about the next bit - making it fast.
"rewards should be given for compactness, both at rest and in operation"
They will be - in time. When Linux has as large a library of applications as Windows and can run comfortably on an 8 GB PC while doing everything you need to do, just like a bloated Windows system does with 32 GB of RAM.
There are people who are still capable of minimalist programming, but they do not include GitHub libraries in their codebase. They write their own libraries and know exactly what is in them and why.
But yeah, that takes time. Time to think about the how, time to write and time to debug and make sure it works in all use cases including edge cases.
Time is money, so managers prefer to bring on the GitHub bloat - even if that means "supply chain risks".
The cost of RAM is up ? Who cares ? Time to market is more important (especially for bonus purposes).
Lovely idea - no chance of it ever happening
The penchant for just lifting huge chunks of code from GitHub or (worse) having grossly bloated and inefficient code prepared for developers by "AI" (copy/paste from Stack Overflow previously being the mode du jour) means that, whilst a noble idea, this has a negligible chance of success.
Ultimately, the world in general has demonstrated by its choices that it prefers obese and shoddy code, thrown together as quickly as possible (preferably quicker) and working only when the wind is prevailing in a south-westerly direction, over well-engineered systems, and that ain't gonna change anytime soon (or likely at all).
By way of a hopefully vaguely interesting anecdote, I did some consulting work at a place where I'd been a permie 10 years previously. I'd barely got through the door before I was harangued about some Turbo Pascal code that I'd written more than a decade before having stopped working. I ran said code and it finished so quickly that I presumed they must be right, only to find that it was actually a network configuration issue, and that the reason it had finished almost instantly was just that the performance of their hardware was massively superior to what had been the case when I originally wrote it.