News: 0180331371

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

The Accounting Uproar Over How Fast an AI Chip Depreciates (msn.com)

(Monday December 08, 2025 @11:57AM (msmash) from the creative-bookkeeping dept.)


Tech giants including Meta, Alphabet, Microsoft and Amazon have all extended the estimated useful lives of their servers and AI equipment over the past five years, sparking a debate among investors about [1] whether these accounting changes are artificially inflating profits. Meta this year increased its depreciation timeline for most servers and network assets to 5.5 years, up from four to five years previously and as little as three years in 2020. The company said the change reduced its depreciation expense by $2.3 billion for the first nine months of 2025. Alphabet and Microsoft now use six-year periods, up from three in 2020. Amazon extended to six years by 2024 but cut back to five years this year for some servers and networking equipment.
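For scale, straight-line depreciation simply spreads an asset's cost evenly over its assumed useful life, so stretching the life directly shrinks the annual expense. A minimal sketch with made-up figures (ignoring salvage value and partial-year conventions):

    # Straight-line depreciation: annual expense = cost / useful life.
    # All figures here are hypothetical, for illustration only.
    def annual_depreciation(cost: float, useful_life_years: float) -> float:
        """Straight-line annual depreciation, assuming zero salvage value."""
        return cost / useful_life_years

    cost = 10_000_000_000  # $10B of servers (made-up number)
    for life in (3.0, 4.0, 5.5, 6.0):
        print(f"{life} yr life: ${annual_depreciation(cost, life) / 1e9:.2f}B/yr")

Moving the same asset base from a 4-year to a 5.5-year schedule cuts the annual expense by roughly 27 percent, and that reduction flows straight through to reported operating profit.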

Michael Burry, the investor portrayed in "The Big Short," called extending useful lives "one of the more common frauds of the modern era" in an article last month. Meta's total depreciation expense for the nine-month period was almost $13 billion against pretax profit exceeding $60 billion.



[1] https://www.msn.com/en-us/technology/artificial-intelligence/the-accounting-uproar-over-how-fast-an-ai-chip-depreciates/ar-AA1RVBf6



What's wrong with an accounting trick or two? (Score:2)

by Mr. Dollar Ton ( 5495648 )

It isn't like all these videocards will burn up in the race for "AGI". They will remain in working order long after the so-called "AI" bubble is gone.

Re: (Score:3)

by drinkypoo ( 153816 )

Most of them aren't video cards as they don't have video output. A DAC and ports cost money that you don't need to spend to run LLMs. The other uses for these cards are mostly scientific, and there's not enough money in that to justify owning them. Perhaps the AI bubble crashing will lead to a push towards some kind of crypto still efficiently mined with GPGPUs. Eew.

Re: (Score:1)

by gabebear ( 251933 )

If the AI crash happens in the next couple of years, these HUGE companies will likely hold onto the accelerators for a couple more years after that, and by the time they are sold at rock-bottom prices they will be truly worthless... which at least saves us from them being used by miners...

Re: (Score:3)

by drinkypoo ( 153816 )

> which at least saves us from them being used by miners

My concern is that the companies that bought them to run LLMs will become miners themselves to try to recoup some of the costs.

Re: (Score:1)

by Mr. Dollar Ton ( 5495648 )

Whatever, matrix multipliers then.

Re: (Score:2)

by rsilvergun ( 571051 )

It's still the exact same silicon and it's got the same problems. Not all of them burn out but some of them do.

The real question is how long until it's replaced by newer or better hardware. Basically, will we see custom hardware replace video cards for LLM acceleration soon, similar to what we saw with Bitcoin?

That won't help consumers because the fab capacity is just going to go to different silicon, but it does mean that a whole shitload of these GPUs will become worthless. I guess some of them wi

Three years is too short nowadays (Score:2)

by dskoll ( 99328 )

I have always thought three years was too short for servers and network equipment. Especially now that Moore's Law is slowing down, I think a 5-year depreciation period for servers makes sense.

For AI processors, though, I think three years might be too long given how much change is going on in that space.

Re: (Score:3)

by Malc ( 1751 )

Indeed. I learnt recently that one of our GitLab VM hosts for Linux build runners is hardware from 2012. The dev team discovered this when a vendor sent us an updated library that dropped SSE4.2 support and required AVX2, causing our smoke tests to fail and thus fail the builds. Why throw away hardware that is still working and performant?
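For the curious, on Linux you can check what instruction sets a host CPU advertises without any third-party tooling; a quick sketch (Linux-specific, since it reads /proc/cpuinfo):

    # Quick Linux-only check for the instruction sets mentioned above.
    def cpu_flags() -> set[str]:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    flags = cpu_flags()
    print("sse4_2:", "sse4_2" in flags)  # present on 2012-era Xeons
    print("avx2:  ", "avx2" in flags)    # Haswell (2013) and later only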

Typical company approach to accounting (Score:2)

by Targon ( 17348 )

If a new generation of product comes out every one or two years, many companies will push the idea that "customers will buy every new generation" as they inflate their projected numbers. They will also just assume that companies will continue to throw money at buying AI-based products, even when they already have enough to meet their needs.

There is SOME merit to expecting that, after a 30 percent performance boost in a new generation, companies MAY decide to upgrade/replace equipment, but that is no

Re:Typical company approach to accounting (Score:4, Informative)

by Whateverthisis ( 7004192 )

I think you're missing the point. This isn't about inflated revenue projections. It's about inflated profits.

Each of these companies is spending tremendous amounts on building servers and data centers right now. The cost of that CapEx is depreciated over its useful service life, which can vary quite a bit depending on what it is. Servers are typically 3 years or so, whereas real estate can be up to 28 or 30 years. It's a non-cash expense, but they get to claim it as an expense and amortize it out over many years, which allows them to reduce their reported profits and thus their tax basis.

The problem is it reduces profits, which makes the companies seem like they're spending too much money. As a calculated value, it's open to manipulation to make the company look better. It doesn't really matter what number of years you use for a given piece of equipment, as long as it's consistent and it makes sense. Changing your amortization schedule from what it was historically sends a signal that the company is artificially adjusting its numbers to make things look better.

Using the numbers above, if Meta had the same pre-tax profit of $60B now but was using the 3-year depreciation schedule they used in 2020 instead of the current 5.5 years, then instead of depreciation being $13B it'd be $23.8B, meaning they'd lose nearly $11B in recorded profits, just from a calculation. So in essence this boosts their stock price by making them look more profitable than they are.
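That back-of-envelope is easy to reproduce: under straight-line depreciation the expense scales inversely with the assumed life, so the same asset base at a 3-year life costs 5.5/3 as much per period as at 5.5 years. A quick sketch using the figures quoted above (an approximation; real filings mix asset vintages and classes):

    # Rescale a straight-line depreciation expense to a different schedule.
    reported = 13.0     # ~$13B over nine months at the current 5.5-yr life
    current_life = 5.5
    old_life = 3.0      # schedule reportedly used in 2020

    restated = reported * current_life / old_life
    print(f"Restated at {old_life}-yr life: ~${restated:.1f}B")          # ~$23.8B
    print(f"Reduction in pre-tax profit: ~${restated - reported:.1f}B")  # ~$10.8B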

Re: (Score:2)

by russotto ( 537200 )

This isn't just about GPUs, though; it's about all the hardware. Servers are typically used much longer than 3 years. I expect networking hardware lasts at least as long as the servers. Maybe you're burning out the AI training and inference stuff in a couple of years, but the other stuff lasts much longer.

LLM hardware is a crappy "investment"? (Score:2)

by gweihir ( 88907 )

Who would have thought...

Shell games and Ponzi schemes (Score:2)

by jenningsthecat ( 1525947 )

To some extent, it's always been the case that the value of just about anything is arbitrary, and varies according to context. But the underpinnings of what we call The Economy are becoming more and more divorced from any consistency or standards. Increasingly it's all a dirty exercise in what ranges from misplaced optimism to opportunism, extortion, and fraud.

There no longer seems to be even a pretense of fairness, or duty to society, or basic decency among corporate interests. My Slashdot sig was meant to

Or perhaps (Score:2)

by mrspoonsi ( 2955715 )

Those AI servers are expensive; why retire one early on the same terms as a normal server, which could be 10x cheaper? Better for the environment too, to an extent (if powered by carbon-neutral sources).

Isn't the lifetime shorter than 5 years? (Score:2)

by wakeboarder ( 2695839 )

Because they will replace that chip in 2 to 3 years?

It comes down to power efficiency (Score:1)

by gremlin123 ( 9969532 )

If you have a fleet of these servers in production, they can last a long time, many years.

Eventually, they become too expensive to operate vs new servers, mainly because the new servers are more power efficient and power is one of the largest operational costs of running a data center. The new servers tend to be faster and fewer are required, but they don't typically offer any new capability; it just comes down to the cost per unit of work.
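That break-even is easy to put in numbers: compare operating dollars per unit of work once power is included. A toy sketch with hypothetical figures (real TCO models also count space, cooling, and the amortized purchase price):

    # Toy power-cost-per-unit-of-work comparison between server generations.
    # All figures are hypothetical, for illustration only.
    POWER_PRICE = 0.08  # $/kWh, assumed

    def cost_per_unit(work_per_hour: float, watts: float) -> float:
        """Power-only operating cost per unit of work."""
        return (watts / 1000) * POWER_PRICE / work_per_hour

    old = cost_per_unit(work_per_hour=100, watts=800)   # older, less efficient
    new = cost_per_unit(work_per_hour=400, watts=1000)  # faster per watt
    print(f"old: ${old:.6f}/unit, new: ${new:.6f}/unit")

Once the new generation's cost per unit of work drops far enough below the old fleet's, keeping the old machines powered on costs more than replacing them, even though they still work.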

"It runs like _x, where _x is something unsavory"
-- Prof. Romas Aleliunas, CS 435