News: 0180130649


Cloud-Native Computing Is Poised To Explode (zdnet.com)

(Tuesday November 18, 2025 @10:30PM (BeauHD) from the what-to-expect dept.)


An anonymous reader quotes a report from ZDNet:

> At KubeCon North America 2025 in Atlanta, the Cloud Native Computing Foundation (CNCF)'s leaders [1]predicted an enormous surge in cloud-native computing, driven by the explosive growth of [2]AI inference workloads. How much growth? They're predicting hundreds of billions of dollars in spending over the next 18 months. [...] Where cloud-native computing and AI inference come together is when AI is no longer a separate track from cloud-native computing. Instead, AI workloads, particularly inference tasks, are fueling a new era where intelligent applications require scalable and reliable infrastructure. That era is unfolding because, said [CNCF Executive Director Jonathan Bryce], "AI is moving from a few 'Training supercomputers' to widespread 'Enterprise Inference.' This is fundamentally a cloud-native problem. You, the platform engineers, are the ones who will build the open-source platforms that unlock enterprise AI."

>

> "Cloud native and AI-native development are merging, and it's really an incredible place we're in right now," said CNCF CTO Chris Aniszczyk. The data backs up this opinion. For example, Google has reported that its internal inference jobs have processed 1.33 quadrillion tokens per month recently, up from 980 trillion just months before. [...] Aniszczyk added that cloud-native projects, especially Kubernetes, are adapting to serve inference workloads at scale: "Kubernetes is obviously one of the leading examples as of the last release the dynamic resource allocation feature enables GPU and TPU hardware abstraction in a Kubernetes context." To better meet the demand, the CNCF announced the Certified Kubernetes AI Conformance Program, which aims to make AI workloads as portable and reliable as traditional cloud-native applications.

>

> "As AI moves into production, teams need a consistent infrastructure they can rely on," Aniszczyk stated during his keynote. "This initiative will create shared guardrails to ensure AI workloads behave predictably across environments. It builds on the same community-driven standards process we've used with Kubernetes to help bring consistency as AI adoption scales." What all this effort means for business is that AI inference spending on cloud-native infrastructure and services will reach into the hundreds of billions within the next 18 months. That investment is because CNCF leaders predict that enterprises will race to stand up reliable, cost-effective AI services.



[1] https://www.zdnet.com/article/cloud-native-computing-is-poised-to-explode-thanks-to-ai-inference-work/

[2] https://www.ibm.com/think/topics/ai-inference
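
The "dynamic resource allocation" (DRA) feature Aniszczyk refers to lets a workload describe the class of device it needs (a GPU, a TPU) in a ResourceClaim instead of hard-coding a vendor-specific resource limit on the pod. Below is a rough sketch of creating such a claim with the Kubernetes Python client; the gpu.example.com DeviceClass name is made up, and the resource.k8s.io/v1beta1 field layout is an assumption that may differ across Kubernetes versions and clusters.

    # Rough sketch (assumptions noted above): create a DRA ResourceClaim that
    # asks for one device from a hypothetical "gpu.example.com" DeviceClass.
    from kubernetes import client, config, dynamic

    config.load_kube_config()
    dyn = dynamic.DynamicClient(client.ApiClient())

    claims = dyn.resources.get(api_version="resource.k8s.io/v1beta1",
                               kind="ResourceClaim")
    claims.create(namespace="default", body={
        "apiVersion": "resource.k8s.io/v1beta1",
        "kind": "ResourceClaim",
        "metadata": {"name": "one-accelerator"},
        "spec": {
            "devices": {
                "requests": [{
                    "name": "gpu",
                    # The DeviceClass abstracts the hardware; the scheduler
                    # finds a node whose driver can satisfy the request.
                    "deviceClassName": "gpu.example.com",
                }]
            }
        },
    })

A pod then consumes the claim through spec.resourceClaims and containers[].resources.claims rather than requesting a vendor-specific limit such as nvidia.com/gpu directly, which is the hardware abstraction being described.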



No (Score:1)

by Anonymous Coward

> They're predicting hundreds of billions of dollars in spending over the next 18 months

Complete bullshit. All these companies are throwing around huge numbers, claiming that they are going to spend eleventy gazillion dollars. Do you really believe that companies are going to spend 10 times their total yearly revenue in just the next 2 years? Are you really that stupid? It's all bullshit. It's all just gaming the stock market. Make big announcement, stock go up.

Re:No (Score:4, Insightful)

by OrangeTide ( 124937 )

Part of raising capital is convincing everyone that you need the money and have a good plan for it. We're probably better off if these companies don't actually buy even 10% of the things they are saying they will buy. The hype funding schemes are incredibly harmful to business long-term, but the short-term payoff is too attractive for them to ever stop doing it.

What is a 'token'? (Score:2)

by Shane A Leslie ( 923938 )

And why are they being 'processed'?

Re: (Score:1)

by Tablizer ( 95088 )

LLM's version of concepts or categories.

Re: (Score:3)

by abulafia ( 7826 )

A "token" is a substring. They're usually parts of words or whole short words.

"Processing" "tokens" is fundamentally what an LLM does.

Simplified: it takes input text, tokenizes it (splits it up according to the same rules as the corpus), maps that to a huge sparse network of vectors that serves as a lossy representation of the tokenized training corpus, and then plays "pick the next most likely token" to respond.

If you choose to pay money to one of the robot timeshares, you are effectively buying the right to have your tokens processed on their hardware.
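
To make that concrete, here is a toy Python sketch of the "tokenize, then pick the next most likely token" loop. The whitespace tokenizer and the one-line corpus are made up for illustration; real models use subword tokenizers and condition on a long context, not just the previous token.

    # Toy illustration only: whitespace "tokenizer" plus a bigram table that
    # always picks the most frequent follower of the previous token.
    from collections import defaultdict

    def tokenize(text):
        return text.lower().split()   # stand-in for a real subword tokenizer

    corpus = "the model processes tokens and the model processes text"
    counts = defaultdict(lambda: defaultdict(int))
    toks = tokenize(corpus)
    for prev, nxt in zip(toks, toks[1:]):
        counts[prev][nxt] += 1        # count which token follows which

    def next_token(prev):
        followers = counts.get(prev)
        if not followers:
            return None
        return max(followers, key=followers.get)   # most likely follower

    out = tokenize("the model")       # the "prompt"
    for _ in range(4):
        nxt = next_token(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    print(" ".join(out))              # -> "the model processes tokens and the"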

Paint me skeptical (Score:3, Insightful)

by Tablizer ( 95088 )

"Foo Foundation predicts there will be a Foo boom"

Re: (Score:1)

by rudy_wayne ( 414635 )

> Cloud-Native Computing Is Poised To Explode

If by "explode" you mean crash and burn, then yes I would agree.

Reject All Things AI (Score:3)

by BrendaEM ( 871664 )

While there is virtue, good or bad, in a programmer's art, any legitimate purpose for AI has been negated by the AI companies, who stole so much from so many and put so many out of work, all to make the least-working people the richest.

Really? (Score:3)

by devslash0 ( 4203435 )

At my place we're currently migrating everything we can back to server-based architectures because trying to maintain a zillion lambdas and other microservices costs us more in development time and money than it costs to run a proper monolithic server. Kubernetes on top and you're sorted.

Turns out having a stack that devs can run FULLY locally gives them a 10x bonus to development speed.

I mean, we're still in the cloud, just not cloud-native.

Re: Really? (Score:2)

by devslash0 ( 4203435 )

You can also run your own server on someone else's server.

No... (Score:2)

by Junta ( 36770 )

You already had your turn in the late 2010s after big data and before AI. You don't get to come back to the hype line for seconds.
