
Nvidia's Huang Says His AI Chips Are Improving Faster Than Moore's Law (techcrunch.com)

(Wednesday January 08, 2025 @11:41AM (msmash) from the pushing-the-limits dept.)


Nvidia's AI chips are [1]advancing faster than Moore's Law , the semiconductor industry's historical performance benchmark, according to chief executive Jensen Huang. "Our systems are progressing way faster than Moore's Law," Huang told TechCrunch. Nvidia's chips have improved a thousand-fold over the past decade, outpacing Moore's Law's original prediction of doubling transistor counts every year, Huang said. He adds:

> We can build the architecture, the chip, the system, the libraries, and the algorithms all at the same time. If you do that, then you can move faster than Moore's Law, because you can innovate across the entire stack.

>

> [...] Moore's Law was so important in the history of computing because it drove down computing costs. The same thing is going to happen with inference where we drive up the performance, and as a result, the cost of inference is going to be less.



[1] https://techcrunch.com/2025/01/07/nvidia-ceo-says-his-ai-chips-are-improving-faster-than-moores-law/



Revisionist History In The Making (Score:2)

by toddz ( 697874 )

Nvidia knows they can't beat Moore's Law so they are just going to redefine it.

Re: (Score:3)

by leonbev ( 111395 )

While GPU complexity and power usage have gone through the roof, actual GPU rendering performance has only been improving by about 20% a year for the past five years. It seems like they're running headfirst into a giant efficiency wall.

Re: (Score:2)

by Rei ( 128717 )

He's not talking about rendering performance.

BTW, a lot of these performance gains are "real, but with a catch". They've gone from FP16 to FP8 to FP4 offerings. And all of those are useful for AI inference (not so much for training). And each halving of size gets you roughly double the FLOPS. So yeah, you can run inference a lot faster. But it's not exactly an apples to apples comparison.
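The precision-for-throughput trade Rei describes can be sketched with back-of-the-envelope arithmetic (illustrative only; the `base` figure below is made up, not a real Nvidia spec):

```python
# Sketch: at a fixed datapath width, halving the element size lets twice
# as many values through per cycle, so peak throughput roughly doubles
# at each precision step down from FP16 (ideal scaling assumed).
def peak_tflops(base_fp16_tflops: float, bits: int) -> float:
    """Idealized peak throughput at a given precision, assuming a
    perfect 2x gain per halving of bit width from an FP16 baseline."""
    assert bits in (16, 8, 4), "only FP16/FP8/FP4 considered here"
    return base_fp16_tflops * (16 // bits)

base = 100.0  # hypothetical FP16 TFLOPS for some accelerator
for bits in (16, 8, 4):
    print(f"FP{bits}: ~{peak_tflops(base, bits):.0f} TFLOPS")
```

So an FP4 number can be four times the FP16 number on the same silicon, which is why quoting it without the precision caveat isn't apples to apples.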

Re: (Score:2)

by Junta ( 36770 )

Yeah, I was recently in a situation where an AI approach was being compared with a simulation, and the AI approach was much, much faster at reaching 'good enough' results. Then someone tweaked the traditional simulation to use precision similar to what the AI approach ultimately used, and *of course* it then ran much faster even than the AI approach. Massive speedups through precision reduction.
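One concrete source of those speedups is simply memory traffic: smaller floats mean fewer bytes moved. A minimal illustration using Python's standard `struct` module (the 2x here is the storage ratio, not a measured runtime):

```python
import struct

# A C double ("d") occupies 8 bytes, a single-precision float ("f")
# only 4, so a bandwidth-bound simulation that drops to single
# precision moves half the data for every array it touches.
double_size = struct.calcsize("d")
single_size = struct.calcsize("f")
print(double_size, single_size, double_size // single_size)  # 8 4 2
```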

An indirect boon to some fields is that the "you don't need supreme decision

Mixing metaphors here... (Score:2)

by alispguru ( 72689 )

Moore's Law was originally about price/bit.

Until about 2007, we also saw [1]Dennard scaling [wikipedia.org] which was driving the increase in clock speed.

Barring a really new and exciting technology, the only way inference is going to get faster is through massive parallelism, which is hard from both an algorithmic and a chip-cooling point of view.

[1] https://en.wikipedia.org/wiki/Dennard_scaling

Re: (Score:2)

by Valgrus Thunderaxe ( 8769977 )

It's not even a "law" in any sense but rather a business slogan.

Re:Mixing metaphors here... (Score:5, Informative)

by Pieroxy ( 222434 )

[1]Moore's law [wikipedia.org] is the observation that the number of transistors in an integrated circuit doubles about every two years. There's nothing in it about price/bit, or whatever that means.

[1] https://en.wikipedia.org/wiki/Moore's_law

So are mine! (Score:1)

by Black Parrot ( 19622 )

Buy my stock.

What? 10 years Moore's law would mean 1024 (Score:2)

by lamer01 ( 1097759 )

So they are basically even with Moore's Law, even though they have redefined it from transistor count to performance.
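The subject line's arithmetic checks out: doubling every year for a decade compounds to 2^10 = 1024x, matching the "thousand-fold" claim, while the two-year cadence Moore later settled on gives only 32x over the same span:

```python
# Compounded doubling over a decade at the two cadences in dispute.
yearly = 2 ** 10           # doubling every year for 10 years
biennial = 2 ** (10 // 2)  # doubling every two years for 10 years
print(yearly, biennial)    # 1024 32
```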

Re: (Score:2)

by AvitarX ( 172628 )

Moore's law hasn't been taken to mean a doubling every year for decades.

According to Wikipedia Moore adjusted it to every two years in the mid 70s.

Salesman says salesman things. (Score:2)

by nightflameauto ( 6607976 )

When a C Suite says something that borders on fantasy, it could just be salesmanship. At some point in the past decade or so, we decided that salesmanship was more important than a grasp on reality. Doesn't matter if what they're selling is real. It's how they sell it that matters! Stock price should continue to climb for another few clicks with that level of salesmanship and that much disconnection from reality. How long can Wall Street operate on daydreams? I guess we're gonna find out!

Re: (Score:2)

by timeOday ( 582209 )

The Nvidia hardware/software stack has definitely become much, much faster over the last decade; there isn't any doubt about that.

It would be nice if dollars per FLOP and watts per FLOP had decreased as much as raw performance increased: a single H200 is about $35,000, and a DGX box packs eight of them. Nvidia wasn't selling anything at that price scale a decade ago.

May I be forgiven a little skepticism? (Score:3)

by hyades1 ( 1149581 )

"Huang says his AI chips are improving faster than Moore's Law"

Yeah, well let me introduce all you ladies out there to my ten-inch Huang.

;-)
