
AI coding tools crash on launch, could reboot better in future

(2025/08/11)


Opinion Here are two snapshots of AI in coding in mid-2025. The CEO of GitHub, coding’s universal termite mound, says that [1]AI is going to do all the coding and that’s a good thing. Meanwhile, real-life AI coding tools make coders [2]less productive while spreading the hallucination that they’re more so.

Can something that makes life worse for the workers building the digital world turn around so completely? Put into historical context, it looks not only inevitable but essential.

The proper place to start is, as so often, with "Amazing" Grace, or Rear Admiral Grace Hopper as she was more properly known. Seventy years ago, she began work on the first high-level computer language designed to create data processing programs through English-like words rather than equation-esque symbols. The result, FLOW-MATIC, led in short order to COBOL and our current mayhem. A creation myth of the digital age.

Less well known is the opposition to the idea of high-level languages that Hopper had to overcome. There were around [4]88 electronic computers in operation in the US in 1955, making computation an incredibly expensive commodity, jealously guarded by its creator class of high-status mathematicians and engineers. Many considered compute cycles and storage far too valuable to waste on anything other than directly crunching numbers. Translating words from people who couldn't or wouldn't learn how to express themselves in machine symbols was unconscionable.

Hopper, having been a mathematics professor herself, knew this attitude to be hopelessly limiting even at the time, let alone if computers were going to become widespread. As you may have noticed, she was right. While the detractors were correct about resources being too limited at the time, that time changed with dizzying rapidity. That specific criticism was equally rapidly extinguished.

The underlying theme was not. Resource limitation coupled with entrenched thinking has been used to decry many fundamental advances since. As computers moved towards a mass market, each breakthrough brought initial resource constraints that kept alive previous programming practice optimised for efficiency and speed.

C made cross-platform software work in the early days of minicomputers, but that didn't stop assembler programmers ridiculing it as a gussied-up macro assembler. Likewise, during the Ediacaran era of microcomputers, there was much snorting at the arrival of intermediate representation, or IR.

IR is where a compiler first produces a common format that is later translated to executable code via a virtual machine. That offers very dynamic portability across architectures, but initially at too great a demand on the base hardware. P-code, beloved of Pascal heads, was just too tardy to live. Java and its bytecode were equally sluggish at first, the joke being that it was impossible to tell the difference between a machine that had stopped because of a crashed C program and one that was running Java.

Java was saved by a combination of Moore's Law and ubiquitous networking. Now IR is ubiquitous through technologies such as LLVM, and C itself has become an IR in compilers for languages such as Nim and Eiffel. Which makes sense for portability and optimization, at least for now. It's impossible to envision a coding world as rich and powerful in this interconnected age without these ideas.
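
That layer is easy to poke at yourself. Here's a minimal sketch, assuming a JDK with javac and javap on the path; the class and method names are purely illustrative:

    // Hello.java - a trivial class used only to peek at the IR the JVM runs
    public class Hello {
        // one line of arithmetic in source...
        public static int add(int a, int b) {
            return a + b;
        }

        public static void main(String[] args) {
            System.out.println(add(2, 3));
        }
    }

Compile it with javac Hello.java, then javap -c Hello prints the stack-machine bytecode behind add (iload_0, iload_1, iadd, ireturn), which the VM interprets or JIT-compiles for whatever silicon it finds itself on.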

All this illustrates that increased abstraction comes hand-in-hand with increased complexity and capability. In truth, almost no code actually running on silicon in mainstream IT has been touched by human finger nor seen by human eye. Even ignoring platform VMs and microcode, the code that actually processes data these days has been written by machines, usually many times.

Thus to AI. AI is cursed thrice: by name, by hype, and by resource limitations. It is very clever data analytics and inference; it is not intelligent, and calling it so sets bad expectations. It can be very effective at well-formed, well-bounded tasks, but it is being sold as universal fairy dust. More bad news, more fuel for entrenched attitudes.

Looking at the fit to current capabilities, it's uncomfortable. You can usefully run big AI models on mildly muddled local hardware, but training is a different matter. Huge resource constraints and very questionable business models are not great ingredients for evolving tools that fit well. These are early days, and it's no wonder coding AI is a very mixed blessing.

This will change, and it can only go one way. As Grace Hopper knew with complete clarity, removing barriers between thought and result accelerates technology. Coding AI will do that, if we advance the art with care and vision. It will mean more of the human work going into forethought, design and consideration of what we actually want, which are good disciplines that are badly undercooked at the moment.

There's one last old programming joke to bear in mind. When computers can be programmed in written English, we'll find out that programmers can't write English. Here's hoping that the law of limited resources and embedded attitudes makes history of that one too. If not – hey – people are still making a living fixing COBOL. ®



[1] https://www.theregister.com/2025/08/07/github_ceo_ai_coding/

[2] https://www.theregister.com/2025/07/11/ai_code_tools_slow_down/

[4] https://www.lindahall.org/about/news/new-acquisition-a-survey-of-domestic-electronic-digital-computing-systems-1955/




Natural vs. programming language

b0llchit

When computers can be programmed in written English, we'll find out that programmers can't write English.

And luckily, programmers are not linguists. A natural language is not deterministic while a programming language is deterministic. Therefore, there is no point in using any natural language, English or other, because you cannot use it to program correctly.

You need a programming language to program and the programming language you use should be the one suited (best) for the job. Even Grace Hopper knew that.

Re: Natural vs. programming language

Anonymous Coward

I have a colleague who, although he speaks perfect English with a Middle-England accent, drops odd word structures, or grammatically incorrect verb conjugations into specifications. Similar in many ways to our Asian colleagues. In the end I had to ask him and found out that he was a product of the Indian schooling system and all its idiosyncrasies.

How is natural language processing meant to deal with the inconsistencies of language use like this?

Anon as he may well read this publication; after I have recommended it to him a few times.

Re: Natural vs. programming language

James Anderson

English is a great language for poetry and jokes. This is because of its lack of precision and multiple ambiguity’s. Also it will be years before a machine learned program will be able to recognise the subtle nuances behind the choice of words; say between “comfort station”, bathroom, toilet, jakes, bog or shithouse.

Re: Natural vs. programming language

snowpages

..and will it be able to cope with grocers' apostrophes?

(ambiguities)

G

Anonymous Coward

and all i got from that story is the authors an idiot

Resources being too limited at the time

abend0c4

The thing is, resources are becoming limited again: not because of a shortage of computers, but because of a shortage of power and water to sustain the sudden - and entirely speculative - growth in their numbers.

The type of computers we have - stored program arithmetical calculators - are remarkably poor at the tasks for which AI is supposed to be the solution - even when the outcome is successful, the cost of getting there is disproportionate. That's not to say we won't in the future have better models of reasoning and inference that can be trained - and corrected - incrementally and which can be sustained on more modest resources, but that's not the current position.

What we have is at best a proof of concept: if you're able to see past the rough edges, you can get an idea of how these systems could indeed be useful. However, it's cost so much money to get to this stage, any improvements - and any returns on the investment - are going to be dependent on creating a dependence on the flawed technology through the "lock-in" for which IT vendors have become notorious.

The growth in computing resources in the past was a case of increased supply leading to increased demand for the improvements they created. What we're currently seeing is not a demand for more computing resources, but a demand for fewer human resources. There's no great suggestion that AI will do things "better", just that things can be done with fewer people and the implication that will lead to less cost. I don't presently see a scenario in which there could be a widespread uptake of AI that led to a reduction in costs because - even if we actually exterminate the displaced staff - competition for natural resources is simply going to make the massive compute demands of AI unaffordable. The cost hike will of course not be immediate, but the "boiling frog" effect will leave clueless customers in immaculate ignorance until it's too late.

To some extent, we the IT professionals have only ourselves to blame. We've spent too much time gazing into our own navels, inventing methodologies and frameworks and microservices and constantly rewriting perfectly adequate code to suit the fad of the week rather than simply getting the job done. However, the type of management that allowed us to get away with that is going to stand very little chance against the well-honed sales patter of an AI salesman dangling the prospect of incredible savings.

Oh, and just remember that "high level" computer languages only exist for humans. The only reason these coding assistants are spewing forth human-readable code is because that was what was available to train them and it's what today's development tools expect as input. The fact that you presently get to see it and understand it is merely an artifact of history. An ideal AI system might produce binary code directly, or invent its own intermediate representation, cutting you out of the process completely. And that's another instance of a boiling frog - over time there'll be less human oversight of both the code and its consequences.

This is in many ways an existential crisis for humanity. Not because we're at risk from the rise of the machines, but because we're at risk from the rise of supreme wealth. The wealthy have never had much sympathy for the poor, but tolerated them as necessary cogs in the money-making machine. Now, they see the prospect of their complete elimination. AI is a necessary part of their dystopian future. But, of course, I exaggerate. The idea of such pathological misanthropy is absurd. Do join me in this pan, the water's lovely.

Oh don't the days seem lank and long
When all goes right and none goes wrong,
And isn't your life extremely flat
With nothing whatever to grumble at!