News: 0180055150

  ARM Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

AI Bubble Is Ignoring Michael Burry's Fears (bloomberg.com)

(Wednesday November 12, 2025 @11:50AM (msmash) from the closer-look dept.)


An anonymous reader shares a report:

> Costing tens of thousands of dollars each, Nvidia's pioneering AI chips make up a hefty chunk of the $400 billion that Big Tech plans to invest this year -- a bill expected to hit $3 trillion by 2029. But unlike 19th-century railroads, or the Dotcom boom's fiber-optic cables, the GPUs fueling today's AI mania are [1]short-lived assets with a shelf life of perhaps five years.

>

> As with your iPhone, this stuff tends to lose value and may need upgrading soon because Nvidia and its rivals aim to keep launching better models. Customers like OpenAI will have to deploy them to stay competitive. So while it's comforting that the companies spending most wildly have mountains of cash to throw around (OpenAI aside), the brief useful life of the chips and the generous accounting assumptions underpinning all of this investment are less consoling.

>

> Michael Burry, who made his name betting against US housing and who's recently turned to the AI boom, waded in this week, warning on X that hyperscalers -- industry jargon for the giant companies building gargantuan data centers -- are underestimating depreciation. Far from being a one-off outlay, there's a danger of AI capex becoming a huge recurring expense. That's great for Nvidia and co., but not necessarily for hyperscalers such as Google and Microsoft. Some face a depreciation tsunami that's forcing them to be extra vigilant about controlling other costs. Amazon has plans to eliminate roughly 14,000 jobs.

>

> And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans. This includes lending to more speculative startups known as neoclouds, who offer GPUs for rent. Microsoft alone has signed more than $60 billion of neocloud deals.



[1] https://www.bloomberg.com/opinion/articles/2025-11-11/ai-bubble-is-ignoring-big-short-michael-burry-chip-depreciation-fears



How Big and How Short? (Score:2)

by Pseudonymous Powers ( 4097097 )

Arguably, most people think this "AI" boom is a bubble (except for the people who think that true AGI is happening sometime next year, which to my mind means the end of capitalism, and possibly civilization, shortly thereafter, but whatever), but nobody knows how to time it. The same was true during the subprime mortgage crisis. Burry deserves credit for loudly and publicly stating that the emperor has no clothes at a time when few others in his profession would, but the point of that story was not that i

Re: (Score:2)

by Zocalo ( 252965 )

It wasn't hard to tell that the emperor in the fable was naked at the equivalent point in the tale either, but it still took that lone voice to pipe up and say so. In the case of sub-prime, the smart people (or at least their smart financial advisors) sat up, paid attention to what Burry was saying, and took some mitigating action; everyone else took a bath or, if they had the right contacts and leverage, got a government bailout.

In my mind, AI is just about at that point but is still suffering from a co

Obvious questions (Score:2)

by ebonum ( 830686 )

What depreciation method are these companies using? What is suggested by GAAP? What is reality (or how fast are these chips actually going to zero value)?

My understanding is that most companies use 3 or 5 year straight-line depreciation with 0 residual value for "computers". This seems reasonable for these Nvidia chips. Are they doing something different?
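The convention the comment describes is easy to sketch. Below is a minimal illustration of straight-line depreciation with zero residual value, comparing a 5-year and a 3-year schedule; the $40,000 per-GPU figure is a hypothetical in line with the article's "tens of thousands of dollars each", not a quoted price.

```python
# Sketch: straight-line depreciation with zero residual value,
# the convention the parent comment assumes for "computers".
# The $40,000 cost is illustrative, not from any company's filings.

def straight_line_book_values(cost, years, residual=0.0):
    """Book value at the end of each year under straight-line depreciation."""
    annual_expense = (cost - residual) / years
    return [round(max(0.0, cost - annual_expense * (y + 1)), 2)
            for y in range(years)]

gpu_cost = 40_000  # hypothetical per-GPU price

print(straight_line_book_values(gpu_cost, 5))
# [32000.0, 24000.0, 16000.0, 8000.0, 0.0]
print(straight_line_book_values(gpu_cost, 3))
# [26666.67, 13333.33, 0.0]
```

The difference between the two schedules is the crux of Burry's complaint: stretching the same cost over five years instead of three cuts the reported annual expense by a third, flattering current earnings.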

Re: (Score:2)

by ndsurvivor ( 891239 )

I think the implication of that is that there will be no taxes paid by any of these companies, probably for the next 20 years, all while they pocket billions upon billions of dollars.

I Don't Understand The Story's Intent (Score:2)

by SlashbotAgent ( 6477336 )

Are they saying that this isn't a bubble? It definitely is!

Are they saying that the bubble won't pop because AI chips "have" to be replaced? Wanna bet? Burry has. Time will tell, even if his timing is off.

Are they seriously implying that this highly suspect $3 trillion number is going to be recurring revenue? LOL! Not a chance.

Frankly, this just reads like Bloomberg is pumping the bubble. It can definitely get big. But it can't grow infinitely and it can't maintain its current size for very long.

Finance drama (Score:3)

by abulafia ( 7826 )

This is a very specific form of writing. It is kind of, but not quite journalism, not quite fictionalization, and not just an attempt to influence other market participants.

The author is trying to tell the story within the form - A Titan of Finance is making a Bold Bet with big implications for the little peoples' 401Ks!

Various folks with input to the story all have their own angle and want to steer it to their advantage. Everyone outside the story who is paying attention can see the bubble, but have t

Re: (Score:2)

by Eneff ( 96967 )

I think they're trying to say that the GPUs will depreciate more quickly than expected, so the expected return on investment (on which the loans financing the GPU purchases depend) won't materialize, leaving all of these major companies heavily in debt without revenue generation on the assets to justify their purchase in later years.

Conceivably, this could lead to bankruptcies and a chain of failures from companies like google and amazon, with a massive drop in stock value and a "too big to fail" probl

All for a dollar. (Score:2)

by Ostracus ( 1354233 )

You say it like it's a bad thing. In case people forgot, mining rigs went on the market cheap, putting an end to the GPU starvation before. The same will happen with those AI rigs.

They won't depreciate that much (Score:2)

by gr8_phk ( 621180 )

Moore's law is over. TSMC's 14A node is pretty much the end of the road, with the current 2nm node close to it in terms of performance. Nvidia has also got packaging quite good, so the chips can't really get packaged much closer. In other words, compute capability per rack is not going to increase very much beyond the next couple of years, and even from now to then there won't be a whole lot of improvement. I think old data centers will still have some value, just less than the final ones a few years from now. If

Re: (Score:1)

by 0123456 ( 636235 )

Without Moore's Law you can build more powerful chips by making them bigger, but they'll take more power to run. Which means more cooling to keep them running and more power plants to run them.

There might be improvements to chip design to make them more optimal for AI software, but that's likely to be a one-off.

Supercomputer vs PC. (Score:2)

by Fly Swatter ( 30498 )

This is the supercomputer phase of AI; it needs huge amounts of space, resources, and expensive equipment. When an eventual successor is developed that reduces all those resources down to a small box that sits in someone's home or pocket and does the same thing faster and almost infinitely cheaper - all this debt will be worse than just throwing money in the fireplace.

Until then this idea of just making an AI database bigger will never be profitable. The only hope is that new developments quickly render

They already did that (Score:2)

by ebunga ( 95613 )

Nobody wanted those stupid AI laptops.

Depends on the meaning of "shelf life" (Score:2)

by DeplorableCodeMonkey ( 4828467 )

Assuming the GPUs aren't unusable due to wear, they can be repurposed to provide low cost services.

I've worked on projects where they'd have spent millions of dollars on renting GPUs per quarter if the AWS sales pitch was "these are so last 3 years, but they're dirt cheap for letting your data scientists experiment."

I think he's 100% over the target about the accounting side, but I think he is potentially underestimating how much money corporations would be willing to throw at "old GPUs" that are substantia

This can't be right. (Score:2)

by nightflameauto ( 6607976 )

> And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans.

Seriously? GPUs as collateral? Can you use something as collateral that will depreciate to nearly zero before the term of the loan is up? Or are these extremely short-term loans? Are banks just impressed with the big number of greenbacks a company has slung at GPUs and utterly ignorant of how little that number will mean in ten, or even five, years' time? Again I ask, "What in the actual fucking fuck are we doing?" I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety
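The commenter's worry can be made concrete with a toy model: a loan amortizing on a fixed schedule while its GPU collateral loses resale value faster than the balance falls. All figures below are hypothetical (real neocloud loan terms are private), and the two-year resale half-life is an assumption for illustration, not data.

```python
# Toy model of the comment's concern: a loan collateralized by GPUs
# whose resale value falls faster than the balance amortizes.
# All numbers are hypothetical; real neocloud deals are private.

def loan_balance(principal, annual_rate, term_years, year):
    """Remaining balance after `year` years of equal annual payments."""
    r = annual_rate
    payment = principal * r / (1 - (1 + r) ** -term_years)
    balance = principal
    for _ in range(year):
        balance = balance * (1 + r) - payment
    return balance

def gpu_value(cost, year, half_life=2.0):
    """Assumed resale value halving every `half_life` years (a guess, not data)."""
    return cost * 0.5 ** (year / half_life)

principal = 1_000_000  # loan fully secured by $1M of GPUs at signing
for year in range(6):
    owed = loan_balance(principal, 0.08, 5, year)
    collateral = gpu_value(principal, year)
    print(f"year {year}: owed ${owed:,.0f}, collateral worth ${collateral:,.0f}")
```

Under these assumptions the loan is under-collateralized by the end of year one, which is the scenario that makes GPU-backed lending look riskier than aircraft or auto finance.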

Re: (Score:1)

by 0123456 ( 636235 )

The Economy relies on ever-increasing amounts of debt to function. Banks are fine with lending money because they expect taxpayers to bail them out if the loans go bad.

> I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety of functional society, just because a very few people might make some money off of it. WTF?

It's been like that for years now. Society is collapsing and we're in the Looting The Treasury phase.

Re: This can't be right. (Score:1)

by blue trane ( 110704 )

"they expect taxpayers to bail them out"

Does the Fed need taxpayers, or does it simply print money (digitally)? Have taxes gone up or down since 2008?

Look at the bright side (Score:2)

by Waffle Iron ( 339739 )

In a few years, all of these GPUs will be available on eBay for a few bucks each.

Then I'll finally be able to snag a whole bunch of them and build a Beowulf cluster to run SETI@home faster than anybody else.

Re: (Score:2)

by Gilmoure ( 18428 )

[golf clap]

So the problem with the bubble (Score:2)

by rsilvergun ( 571051 )

Isn't all the infrastructure and hardware. That stuff's going to get used, because the goal of AI is to replace white-collar workers and that tech does work. Not perfectly, but it's improving every day and it already does quite a bit.

The problem is that the nature of llms means that when things shake out we're going to be left with just a couple of big players. That's because the only people who are going to be able to stay in the game are the ones who have access to training data from real human beings a

Re: So the problem with the bubble (Score:1)

by blue trane ( 110704 )

If the Fed has unlimited power to do "whatever it takes" to end panics, without needing taxpayer money, why can't it fund a basic income, and index it to inflation?

The massive spending is based on the assumption (Score:2)

by MpVpRb ( 1423381 )

...that ever increasing compute power will be needed for future AI

This reminds me of the old military saying that generals plan to fight the last war

One efficient algorithm changes everything

One different processing approach like analog hybrids or bio hybrids changes everything

The future is becoming increasingly unpredictable

An ancient proverb summed it up: when a wizard is tired of looking for
broken glass in his dinner, it ran, he is tired of life.
-- Terry Pratchett, "The Light Fantastic"