

Time to make C the COBOL of this century

(2025/02/18)


Opinion Nobody likes The Man. When a traffic cop tells you to straighten up and slow down or else, profound thanks are rarely the first words on your lips. Then you drive past a car embedded in a tree, surrounded by blue lights and cutting equipment. Perhaps Officer Dibble had a point.

There's no perhaps about the FBI and CISA getting [1]snippy at buffer overflows . These people worry about exploits that threaten car-crash incidents in enterprise IT, and they've seen enough to get angry. It's not that making mistakes is a crime when writing code. No human endeavor worth doing is without error. It's more that this class of bug is avoidable, and has been for decades, yet it pours out of big tech like woodworm from a church pew. Enough already, they say. They are right.

You know all about buffer overflows. A coder moves data from A to B, but doesn't check that A will always fit in B. When it won't, it gets copied into the memory beyond B, which may be catastrophic. Let's call that memory C for catastrophe. Or chaos. Or, well, C.
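To make that concrete, here is a minimal sketch in C (the names A and B are simply the placeholders from the paragraph above, not anyone's shipping code). The first function is the bug class the Feds are cross about; the second is the dull, one-extra-line discipline that removes it.

    #include <string.h>

    /* B has room for 16 bytes, including the terminating NUL. */
    char B[16];

    void copy_unchecked(const char *A)
    {
        /* If A is longer than 15 characters plus its NUL, strcpy keeps
         * writing past the end of B into whatever lives next door - the
         * "memory C" of the paragraph above. Nothing stops it. */
        strcpy(B, A);
    }

    void copy_checked(const char *A)
    {
        /* Never copy more than the destination can hold, and keep the
         * terminator. Boring, cheap, and the bug class is gone. */
        strncpy(B, A, sizeof B - 1);
        B[sizeof B - 1] = '\0';
    }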


Among the many remedies suggested by the Feds is ditching C and its chaotic family in favor of more modern languages that have robust defenses against creating buffer overflows. Which C most definitely does not. It is a chainsaw of a language without safety guards, born in a time when, if a programmer wanted to cut their own femoral artery open, then they were the boss. Some circus performers can juggle with chainsaws. You do not want chainsaw-juggling gift-sets sold as family Christmas presents.



You don't have to abandon C to be safer, as the Feds point out. You can do better testing, in lots of ways. You can use safe coding practices and inspection tools. We have computers that can compile the entire Linux source in minutes. There is no excuse not to use such raw power to do a heck of a lot more work in testing code as well.
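By way of a hedged illustration of what that extra work can look like without leaving C, here is a throwaway self-test for the sketch above, built with sanitizer options GCC and Clang have shipped for years (the file names copies.c and selftest.c are made up for the example):

    /* Build with, for example:
     *   cc -Wall -Wextra -g -fsanitize=address,undefined -o selftest copies.c selftest.c
     * where copies.c holds the two copy functions from the earlier sketch.
     * AddressSanitizer turns the silent overflow into an immediate, loudly
     * reported failure that names the offending line. */
    #include <stdio.h>

    void copy_unchecked(const char *A);
    void copy_checked(const char *A);

    int main(void)
    {
        const char *too_long = "this string is far longer than sixteen bytes";

        copy_checked(too_long);     /* fine: truncated and NUL-terminated */
        puts("checked copy survived");

        copy_unchecked(too_long);   /* ASan: global-buffer-overflow, abort */
        puts("not reached when the sanitizer is doing its job");
        return 0;
    }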

This is at the core of the bug watchers' frustration. The same companies that are burning billions on AI that nobody asked for aren't bothering to spend a fraction of that in clearing up their messes.


Everyone else who sells services and goods to others has responsibilities and consequences. An electrician who wired 240-volt circuits to 115-volt outlets would not last long. Nor would a plumber who used one-inch pipes for toilet outlets, if you want a real buffer overflow. Yet companies like Microsoft do this to millions of customers when they don't have to, and they're free to keep on doing it.

Yes, testing is expensive and doesn't guarantee safety. Guess what? Testing becomes quicker, easier, and cheaper if you're testing code that doesn't have so many errors in it. No need to check for a class of mistake when that class has been eliminated earlier in the pipeline. Which brings us back to the choice of coding platform and especially the choice of language.

[6]The biggest microcode attack in our history is underway

[7]Microsoft starts boiling the Copilot frog: It's not a soup you want to drink at any price

[8]Huawei's farewell to Android isn't a marketing move, it's chess

[9]Smart homes may be a bright idea, just not for the dim bulbs who live in 'em

Changing language is hard, and it gets harder the better you are at what's being changed. The world's best chainsaw juggler is not going to want to move to precision safety cutters. If your business is built on an ecosystem of chainsaw-juggling kits, and the blood spilled ain't yours, then why is it anyone else's business? The challenges to an organization aren't just technical; they're cultural and personal and should not be underestimated. Look at the Rust rumblings in the Linux kernel, where [10]ringmaster Linus is having to use his lion-tamer whip on some big beasts .

None of this is an excuse for not making the move. There will come a time when C becomes the COBOL of the 21st century, and the faster that happens, the less blood will stain the pavements of the information superhighway. Doing things badly when you know how to do them well leads to irrelevance, even extinction, sooner or later. Writing and shipping better code is a competitive advantage once you have the processes to do it efficiently and the people to do it well.

Transition costs are one-off; ongoing savings are forever. How soon do you want to get there? What are you doing to make it happen?


It's also great to be on the side of the angels when sins become crimes. If bad practice is called out enough times and is still deliberately ignored, the door opens to making it actionable. A new law or even just a court case on what we have on the books now could do it – an exploit caused by avoidable bad code makes a company explicitly liable. Defining legally bad code is hard, but making the use of known good methods an acceptable defense is in line with logic.

That list won't include "Don't use C," at least not immediately. One day, it may be needed, if the problem persists long enough. It's far better not to make that happen by making C history, at least in production code that can cut not just your leg off but those of millions of users. Those chainsaws can't rust up quickly enough. ®




[1] https://www.theregister.com/2025/02/13/fbi_cisa_unforgivable_buffer_overflow/


[6] https://www.theregister.com/2025/02/10/microcode_attack_trump_musk/

[7] https://www.theregister.com/2024/11/18/opinion_piece_ai_tools/

[8] https://www.theregister.com/2024/10/28/opinion_column_huawei_harmony_os/

[9] https://www.theregister.com/2024/10/14/opinion_column_smart_gadgets/

[10] https://www.theregister.com/2025/02/07/linus_torvalds_rust_driver/





Good luck...

ICL1900-G3

...as they say, with that.

C is the new COBOL

Admiral Grace Hopper

I assume you mean that it will underpin more of the world of finance, banking, transport and government than you imagine for many decades to come, and provide employment for many grey-haired code-slingers in the West and fresh-faced young programmers in India for as long as it does so.

Re: C is the new COBOL

MiguelC

Excellent!

Came here looking for this post, was not disappointed

Re: C is the new COBOL

Caver_Dave

For how many years did the UK banks still have a shim layer to convert from decimal to the LSD (pounds, shillings and pence) that the core code ran on (on an emulator of the original hardware), and then back again with the results?

I will give you a clue - decimalisation was 15 February 1971, and I know someone who was looking after one of the aforementioned systems this century.

Re: C is the new COBOL

that one in the corner

If I had a farthing for every time I hear of something like that, by now I'd have nearly a crown in assorted specie.

Re: C is the new COBOL

Roland6

The author really needs to explain what they mean by this, as it doesn’t make any rational sense.

I suspect the problem with COBOL is that it wasn’t designed by academic computer scientists and was focused on business applications, something the academics weren’t interested in. Plus, given the social environment of the day, I would not be surprised if part of it was sexism, considering the leading role women played in its initial development.

COBOL is derided today because the bias has been institutionalised by those who only pay attention to social media influencers.

Spazturtle

It's going to happen and I don't know why people are against it.

If you are a C dev then fine, nobody is going to make you learn another language. There will be enough C projects that need maintaining to last for the rest of your career.

Snake

People are against it because, so highly ironically, a lot of people involved in the ever-changing world of high tech...are personally resistant to change when said change means they must change.

Understand that this pattern affects all of us and is all around us: cell phone reviewers will downgrade any phone that doesn't behave like an iPhone or a Samsung. Windows changes a button location and it's the End of the World as We Know It (ahem, regtards). GNOME changes and all hell breaks loose; init is replaced by something [they state is far more manageable] and it's heresy.

For a group of people intrinsically involved in one of the most cutting-edge of social endeavours...I am constantly amazed at how change-resistant individuals have become.

You got into programming knowing that it was a moving target, that change would come to you constantly and that one of the requirements of the industry was constant study and updating of skills. I am not one of those personally but those on the front line of the core of the structures - programming, servers, networking, chip design, etc - know this going in.

Yet they forget and then resist, when said resistance allows them a personal benefit even though it hurts the industry as a whole.

"C is powerful! C is the Swiss Army knife of languages!", I've been told. Yes, and just like a Swiss Army knife, it is useful for a lot of things...but nothing is specialized. You can cut a rope and crudely filet a fish, but you wouldn't want to build a car with it.

The comparison might be more apt than they imagined: C is also the Swiss cheese of languages. Many, many possible holes and, maybe, too many to patch closed. That's the point of the FBI et al statements mentioned in this article, yet fanatic C supporters essentially claim it is irrelevant next to our comfort with the language. C was developed at a time of low resources and of limited I/O and compute power compared to today's hardware - compared to when C was created, we are all sitting in front of mainframe-level power: multi-core, multi-threaded, multitasking behemoths, if we brought this hardware back to the time when C was created. So now humans are responsible for dealing with all this complexity and - forget your egos - it is almost humanly impossible to do it without errors. Period. Stop the ego denials of "Real programmers can handle it!"

They haven't so far, and never will. Humans are fallible, and saying that you, especially, aren't is stupid at this point in time.

You have the processing power to help you in your endeavour of reaching programming Nirvana. Use it. Leverage it. Get over yourself and allow the tech in front of you to help instead of believing yourself to be the uncontested master of every aspect of its operation. You aren't. Sorry. Unless you also laid out the circuit design and microcode in that CPU you are using, there are (many) things in these boxes that are beyond your ability and/or reach.

So, use the best modern tools you have available.

heyrick

" resistant to change when said change means they must change. "

Change for the sake of "trendy new methodology" means taking something that mostly works and likely has a long history behind it...and rewriting it from the ground up in a different language. Given how people like to fiddle with things and managers like to optimise (read: cut costs), this essentially means you'd be swapping one set of potential problems for a different set.

that one in the corner

Well, I suppose it is always good to have a rant, although I do think you might want to tighten up your focus.

> You got into programming knowing that it was a moving target, that change would come to you constantly and that one of the requirements of the industry was constant study and updating of skills

True. Although I'd represent it as something that has constant additions (and, yes, this does mean that I look upon a "change" - where we've lost the previous rather than just had a new option added - with suspicion, because I've tried to make things have a decent long service life and be capable of holding up for, say, a 25-year life as a part of a large and expensive piece of machinery).

> people intrinsically involved in one of the the cutting edge of social endeavours

Whoooah there, Nelly. I never got into this to be part of any "social endeavours"; I'm not trying to "overturn the world" or "create a new paradigm" and I don't personally know anyone who was! It was a fun job to do, always lots of new stuff to learn, problems to solve and you got the pleasure of building things that Users, well, use. Ok, some of the Users were...

> I am constantly amazed at how change resistant individuals have become.

> are personally resistant to change when said change means they must change.

But then you mix up changes that are clearly just marketing and have removed functionality (Windows buttons - and the taskbar, grr), changes that were done against the expressed wishes of users (GNOME were told beforehand), and changes that have genuine technical concerns being expressed (replacing init - ok, there is also a lot of politicking going on there).

And you even mix up complaints from the techies, the ones that actually have to deal with the ugly bits, with reviewers playing marketing wank (the cell 'phones bit).

> (long comment about C which anyone who uses C already knows - it came from the 1970s, lots of much bigger machines exist now - hey, bigger machines existed in the 1970s than the ones that C targeted)

And totally ignoring that many, many (have I said "many"?) systems are written NOT in C - the vast majority, in fact. Web site creators - do they write in C? Building a new database - are you using SQL or trying to do it in C? Text processing systems (mailing lists, documentation aggregation) - are you using C or are you using, oh, LaTeX or a Markdown processor or Word?

> So, use the best modern tools you have available

Yup. I and those I've worked with have done just that. As do all the people doing the jobs I referenced above.

Trouble is, many (frankly, probably all) of the things that you hear being shouted about are not necessarily the best tools - they may be the best *marketed* tools instead! But we (in my Corner) have to be sure they are going to pass the test of time - and that they are clearly designed with the intent of longevity![1] Or we'll be piling up even more problems for ourselves.

Oh, and strangely enough (!) there are still situations where C (and we'll lump "old fashioned" C++ in here as well) is the best, stable, modern solution: all those boring little embedded systems that you probably don't even notice. Hopefully safer options will appear (the MCUs are big enough now to run Python - but you can save on the BOM by using a smaller device...) but we need those to have stability and longevity.

> when said change means they must change.

(Back to this again): for an awful lot of people, who is *paying* for them to change? Is *anyone* paying for the change?

[1] Are you supposed, on day 1, to rely on the presence of a server "somewhere on the 'Net", in order to pull in modules just to make "Hello World" work? Or is the default install totally self-contained, can be put into Escrow for 10, 20 years and still work?[2]

[2] Sorry, sorry, don't get into my own rant, or I'll be going on about throwaway code and the problems/waste that causes.

Roland6

If C hasn’t displaced COBOL to any great extent, I doubt Rust or any other language with roots in Algol-60 will be any more successful.

"Changing language is hard, and it gets harder the better you are at what's being changed."

Dan 55

Luckily we can add C++ constructs to C code over time, which makes it quite easy.

Let us remember the wise words of [1]Isidore of Seville who said in the 7th century, "where there is a fixed-length buffer, may we have a string. Where there is error, may we have exceptions. Where there is a malloc, may we have a scoped variable. And where there is a hand-crafted array, may we have a vector".

[1] https://aleteia.org/2018/10/06/why-is-st-isidore-of-seville-patron-saint-of-the-internet

Re: "Changing language is hard, and it gets harder the better you are at what's being changed."

Caver_Dave

In C I once wrote a system that had structures preceded by headers with a 32-bit unique ID and a 32-bit length. (Would be a 64-bit length now, I suppose.)

Then any pointer could be checked (by a macro) to confirm that the structure was preceded by a header of the correct type, which could spot a huge number of issues, and could also (again by a macro) be checked so that an access did not exceed the length specified in the header. Macros also dealt with allocations and frees to maintain the headers.

The macros became empty in the production build, so execution time was not compromised.

The forerunner of C++ classes, I suppose, although implemented in a very different way.
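For readers who want to picture it, here is a hedged sketch of how such a scheme might look in C. The names (guard_hdr, CHECK_TYPE, CHECK_RANGE, guarded_alloc) are mine, invented for the example; this is only an approximation of the idea described above, not the original code.

    #include <assert.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* A guard header placed immediately before every allocated structure:
     * a 32-bit type ID and a 32-bit payload length. */
    typedef struct {
        uint32_t magic;   /* unique per structure type */
        uint32_t length;  /* size of the payload that follows */
    } guard_hdr;

    /* Allocate a payload of 'len' bytes tagged with 'id'; the caller gets
     * a pointer to the payload, with the header hidden just before it. */
    void *guarded_alloc(uint32_t id, uint32_t len)
    {
        guard_hdr *h = malloc(sizeof *h + len);
        if (!h)
            return NULL;
        h->magic  = id;
        h->length = len;
        return h + 1;
    }

    void guarded_free(void *p)
    {
        if (p)
            free((guard_hdr *)p - 1);
    }

    #ifndef NDEBUG
    /* Debug build: verify the pointer really is a payload of type 'id', and
     * that an access of 'n' bytes at offset 'off' stays inside it. */
    #define CHECK_TYPE(p, id) \
        assert(((const guard_hdr *)(p) - 1)->magic == (id))
    #define CHECK_RANGE(p, off, n) \
        assert((off) + (n) <= ((const guard_hdr *)(p) - 1)->length)
    #else
    /* Production build: the macros compile to nothing, so there is no
     * run-time cost, exactly as described above. */
    #define CHECK_TYPE(p, id)      ((void)0)
    #define CHECK_RANGE(p, off, n) ((void)0)
    #endif

A caller would write something like CHECK_TYPE(msg, MSG_MAGIC); CHECK_RANGE(msg, offset, n); before touching the buffer; building with NDEBUG defined (the usual release switch) turns every check into a no-op.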

Re: "Changing language is hard, and it gets harder the better you are at what's being changed."

Phil O'Sophical

In C I once wrote a system that had structures preceded by headers with a 32bit unique ID and a 32bit length

A bit like VMS argument descriptors, from the 1970s? Available to C and every other language which used the VMS calling standard?

The compiler could generate a lot of extra code for bounds checking

Howard Sway

and completely virtualise all memory management without even having to change any source code (all the oh-so-clever C hacks that misuse the lack of such things would of course then break). The main problem is that your operating system compiled with it will now be 10 times bigger and 10 times slower than all your competitors. So who's going to jump first?

Re: The compiler could generate a lot of extra code for bounds checking

Andy Non

Speed differences can be huge - a factor of 1,000 between fastest and slowest. When I first got into programming on the first DOS PCs, I wrote some tests in 8086 assembly and the same tests in C, COBOL and BasicA.

Assembly = Transwarp.

C = Warp 9

COBOL = impulse drive

BasicA = Are we there yet? Are we there yet? ...

Re: transwarp performance

MiguelC

In a college project (30 years ago... God, I'm old!), I used some embedded assembly code in my Visual C++ project to load a polygon vector file. While most of my colleagues' code took several minutes to load the largest files, mine did it in a couple of seconds - it loaded the first file so quickly that the teacher initially thought my code just didn't work. He only realised it had done its thing when I told him to press the menu button to show the 3D object, and everything worked.

The problem isn't the chainsaw...

Mentat74

It's the idiot wielding it...

Re: The problem isn't the chainsaw...

Andy Non

To be fair, the poor guy has only got one arm. ;-)

Re: The problem isn't the chainsaw...

Gary Stewart

Gee, I wonder what happened to the missing arm?

Re: The problem isn't the chainsaw...

Steve Foster

'Tis but a scratch.

Ironically...

cschneid

...you have to try pretty hard to overflow a buffer in COBOL.

Re: Ironically...

that one in the corner

Although you can do some real damage with an ALTER statement!

A non-IT pleb can read COBOL

Anonymous Coward

As title.

COBOL is readable. Traditionally it is also run on industrial strength operating systems which simply don't allow buffer overflow nonsense.

If you try to write A to B and B is too small, everything stops. Dead. No overflow. Can't happen.

Perhaps it's time to be realistic about C?

It's a coder hostile language, and makes most assembler languages look easy.

Half the problem with the rising C greybeard problem is the macho hostility of its proponents, as commonly seen in Linux communities where newbies are derided for not being experts.

You've reaped what you sowed.

Re: A non-IT pleb can read COBOL

LionelB

> It's a coder hostile language, and makes most assembler languages look easy.

You'll not have coded in assembler, then.

C is essentially a portable assembler language, based on an abstract machine which supports sequential processing of instructions, branching, linear memory layout, and little else. As with assembler, you can code pretty much anything in C, but it doesn't hold your hand, and it requires that you know exactly what you're doing (in that respect it is demanding rather than "hostile"). As with any language, you may write good or bad code in C. C is appropriate to some programming tasks, and inappropriate to (many) others.

FWIW, I suppose I am a C "greybeard", but I did code a bit in COBOL back in the day when I worked in the telecoms industry. It was horrible, but rock-solid. Several of my contemporaries from that time continued for decades to earn top dollar as contractors in the finance sector (as far as I know, some still do). The first language I learned was Fortran 77, and I still have a soft spot for it - no buffer overflows there either (I think that was only introduced in Fortran 90 ;-)) I have programmed in several flavours of Basic - that was okay, it got the job done. I was attracted to C++ for a while, then backed away when it crawled too far up its own arse (as did my code). I find Java unpleasant and Python offensive - I'm not quite sure why. I have no experience with Rust, and probably never will; it is not useful to me. As a mathematician, statistician and research scientist, nowadays I code mostly in Matlab, for which I write C plugins for non-native functionality requiring high performance. I would probably have switched by now to Julia, which is rather lovely for scientific programming, if not for the fact that it's not quite mature enough (yet) in terms of libraries, and in any case Matlab is still de facto standard in my research area.

It's not one-size-fits-all. Never will be.

Re: A non-IT pleb can read COBOL

Roland6

>” > It's a coder hostile language, and makes most assembler languages look easy.

You'll not have coded in assembler, then.”

I have. Having written an OS for the 8086/286 in ASM, a C code generator for the 8086 platform which supported segmentation, and, back in the day, some Unix device drivers in C, I agree with the original post in that it can make assembler seem easy. However, I much prefer to use C rather than assembler.

With (x86) assembler it is harder to do some of the things that C will do for you, i.e. you have to do some things deliberately rather than inadvertently. Buffer overflow has always been, and will always be, an issue in languages that support string processing and runtime variable-length parameters. However, with assembler you can get bogged down in the details and so lose the thread of the logic you are actually trying to code.

Re: A non-IT pleb can read COBOL

heyrick

" It's a coder hostile language "

You mean it doesn't have training wheels and it will quite happily build code for pointer[-1234] because that's valid syntax even if there's a pretty good chance it's not what was wanted.
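(A throwaway sketch of that point, with made-up names: the snippet below typically compiles without a murmur at default warning levels, and is undefined behaviour the instant it runs, because it reads memory well before the start of the array.)

    #include <stdio.h>

    int main(void)
    {
        int buffer[8] = {0};
        int *pointer = buffer;

        /* Valid syntax; at run time it reads 1234 ints before buffer. */
        printf("%d\n", pointer[-1234]);
        return 0;
    }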

" and makes most assembler languages look easy "

Ummm... While C is often called a higher level assembler, the massive benefit of the compiler is that I don't have to deal with all of the tedious bullshit. Stack frames? Register allocation? Remembering what was stacked to unstack it later? Not my problem. Plus you can often write things in C that would take _many_ lines of assembler to do.

" Perhaps it's time to be realistic about C? "

Let's be properly realistic about C. It doesn't go out of its way to hold your hand or cuddle you. But, then again, it was intended for writing operating systems using hardware vastly less capable than the sorts of things we have today.

" You've reaped what you sowed. "

I think a lot of the problem is the changing development environment in which it is expected to ship early and ship often, rolling updates, and using the users as testers (though nobody openly admits that). This gives fewer opportunities to get things right. Couple that with the number of systems that are now always connected to a hostile outside world, and it poses real problems. But, alas, such things as effective testing and unit tests and a set of employees whose job it is to break things, all of this costs money. Money which is better put into the pockets of shareholders rather than, you know, making sure the bread is buttered correctly. Just look, for example, at how often Microsoft screws up their updates. And they're not alone, just frequent offenders. Changing language can fix a certain class of problem (by sheer virtue of that particular problem being specifically catered for by the language), but it can't fix institutional malaise. It can't fix sales promising the moon on a stick in two months and if you don't deliver it will be the apocalypse, nor can it fix penny pinching, bad management, and poorly written specs.

karlkarl

COBOL is still the COBOL of today.

C is... forever. It is basically the entire computing platform at this point. It's almost akin to saying "let's get rid of assembly", or even of processors.

Paradox

elsergiovolador

If your carpenter can't make a cabinet, do you settle for a few planks tied together with gaffer tape so at least you can put something on it, do you change your requirements so that you're actually okay with a piece of wood, or do you hire a competent carpenter?

This buffer overflow conundrum is corporations wanting to hire cheap, incompetent carpenters and somehow force them to make great cabinets, to increase profits.

Or kicking a square peg into a round hole.

Nonsense.

Just pay up for the skill!

Re: Paradox

Anonymous Coward

Finding a carpenter who knows what he's doing is a bit of a Holy Grail.

Somewhat pretentious

david1024

Have to echo the 'good luck'

They've tried language legislation in the past; maybe the new guys feel they'll succeed this time? Likely not. Just as coke floats were forgotten by the new folks and 'reinvented', they'll need to (re)learn this too.

Real programming

Primus Secundus Tertius

C is for real programmers, not for wimps. We need more real programmers, rather than the wimps who proliferate today with all their half thought out bungled pseudo-systems.

COBOL rules

davebarnes

Time to make COBOL the COBOL of this century.

You kids and your trendy languages can just go away.
