Nvidia Claims 'Generation Ahead' Advantage After $200 Billion Sell-off on Google Fears
- Reference: 0180208017
- News link: https://slashdot.org/story/25/11/25/1728213/nvidia-claims-generation-ahead-advantage-after-200-billion-sell-off-on-google-fears
Nvidia said it remains "a generation ahead of the industry" as the only platform that runs every AI model and operates everywhere computing is done. The statement came after investors reacted to [2] the release of Google's Gemini 3 large language model last week. The model was trained using TPUs rather than Nvidia chips. A report in The Information on Monday said Google was [3] pitching potential clients, including Meta, on using TPUs in their data centers rather than Nvidia's chips.
Nvidia said its platform offers "greater performance, versatility, and fungibility than ASICs," referring to application-specific integrated circuits like Google's TPUs that are designed for specific AI frameworks or functions. Google's TPUs have until now only been available for customers to rent through its cloud computing service. Nvidia has lost more than $800 billion in market value since it peaked above $5 trillion less than a month ago.
[1] https://x.com/nvidianewsroom/status/1993364210948936055
[2] https://tech.slashdot.org/story/25/11/18/1634253/google-launches-gemini-3-its-most-intelligent-ai-model-yet
[3] https://www.theinformation.com/articles/google-encroaches-nvidias-turf-new-ai-chip-push
Stop with the "delighted" (Score:3)
No more! We are officially sick of this descriptor.
Sincerely,
The Public
Re: (Score:2)
Just think of the meaning of delighted as de-lighted, i.e., having had the light taken away.
High stakes (Score:2)
This is some big-league gambling! $200B in a morning! That's the entire market cap of Apple in spring of 2010. (For reference, that's when the iPhone 4 came out; they were at the top of their game.)
Don't care (Score:1)
Nothing more than slop all the way down.
CPU - GPU - ASIC (Score:3)
GPUs are designed for graphics; for anything else, they're basically a general-purpose processor that just happens to be better than other general-purpose processors at this specific task. Once the market becomes big enough, someone will expend the resources to develop ASICs, which will outperform the general-purpose GPUs.
Exactly the same thing happened with bitcoin.
Re: (Score:1)
Why doesn't Nvidia make more specialized chips to compete with Google on such server farms? They have the cash.
Re: (Score:2)
The last thing NVidia wants to do is advance TPUs as a viable and more affordable alternative to the GPUs that are a gusher of profit for NVidia. Even if it's inevitable, delaying the erosion of the GPU market is worth billions per month to them.
Re: (Score:2)
> GPUs are designed for graphics; for anything else, they're basically a general-purpose processor that just happens to be better than other general-purpose processors at this specific task. Once the market becomes big enough, someone will expend the resources to develop ASICs, which will outperform the general-purpose GPUs.
> Exactly the same thing happened with bitcoin.
Since the advent of GPGPU 20 years ago, Nvidia has had the foresight to look beyond just graphics. That early focus allowed the GPU to adapt to the particular needs of general-purpose parallel computing. By far the most challenging problem in parallel computing is data movement. GPUs look straightforward and simple until one looks at the memory subsystem, which has evolved into something that is definitely not straightforward and simple, but which is able to handle varied and demanding bandwidth requirements.
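To make the data-movement point concrete, here is a minimal roofline-style sketch in Python; the peak-FLOPs and bandwidth figures are invented round numbers for illustration, not any shipping chip's spec:

```python
# Back-of-envelope roofline check for a square matmul C = A @ B.
# Hardware numbers below are hypothetical round figures for illustration.
PEAK_FLOPS = 1000e12       # assumed: 1 PFLOP/s of matrix throughput
PEAK_BW = 3e12             # assumed: 3 TB/s of memory bandwidth
BYTES_PER_ELEM = 2         # fp16/bf16

ridge = PEAK_FLOPS / PEAK_BW   # FLOPs per byte needed to keep the ALUs busy

for n in (256, 1024, 4096, 16384):
    flops = 2 * n**3                          # multiply-adds in an n x n matmul
    bytes_moved = 3 * n**2 * BYTES_PER_ELEM   # read A and B, write C (ideal reuse)
    intensity = flops / bytes_moved
    kind = "compute-bound" if intensity >= ridge else "memory-bound"
    print(f"n={n:6d}: {intensity:7.1f} FLOPs/byte vs ridge {ridge:.0f} -> {kind}")
```

Small operations fall below the ridge point and stall on memory no matter how many ALUs the chip has, which is why the memory subsystem ends up carrying so much of the design burden.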
Nvidia is in a high risk position (Score:2)
A huge portion of their revenue is from AI data centers, and there is going to be a huge push to create custom-built hardware specifically designed to accelerate those workloads. You saw the same thing with Bitcoin, where custom hardware was built and it outperformed GPUs.
This means that a few good pieces of custom hardware have the potential to completely wipe Nvidia out. This is especially tough because everything is still consolidated into a handful of monopolies and duopolies.
Re: Nvidia is in a high risk position (Score:1)
If AlphaFold can predict protein folding better than any human, can something like it develop special-purpose chips?
NVIDIA's giant bubble (Score:2)
Nvidia is indeed a generation ahead, which amounts to roughly a one-year lead in R&D. Meanwhile, Nvidia can't actually manufacture the chips it sells and would have no ability to do so in less than 10 years. At any time they want, TSMC can capitalize on the AI bubble by switching to their own Chinese chip design and stopping or slowing production for Nvidia/AMD. That's going to be a *big* adjustment for the S&P 500.
Re: (Score:2)
> Nvidia is indeed a generation ahead, which amounts to roughly a one-year lead in R&D. Meanwhile, Nvidia can't actually manufacture the chips it sells and would have no ability to do so in less than 10 years. At any time they want, TSMC can capitalize on the AI bubble by switching to their own Chinese chip design and stopping or slowing production for Nvidia/AMD. That's going to be a *big* adjustment for the S&P 500.
What??? TSMC has their own chip designs?
It's true that TSMC can stop selling chips to Nvidia, Apple, AMD, and anyone else at any time. Of course, their stock would plummet and severely punish the executives who would dare to do that. The only events that would cause TSMC to not sell ever more to Nvidia would be a Chinese invasion or the rise of another customer that is willing to pay more than Nvidia.
TSMC is a Taiwanese company that is quite friendly with the Taiwanese CEO of Nvidia. That's helping both companies.
Wild stuff (Score:3)
They have a slightly-less-than-good day and their market cap shifts by the entire GDP of Greece.
The Wall Street crowd is clueless (Score:4, Interesting)
NVIDIA, seen as the dominant player in AI, has largely ignored development when it comes to actual graphics rendering for the past 1.5-2 years. The RTX 5000 series isn't significantly faster than the RTX 4000 generation outside of... AI performance improvements. The fact that NVIDIA put almost all of its eggs in that one basket makes NVIDIA ripe for a correction when the AI bubble pops.
AMD is recognized in the AI area, and while not the dominant player, AMD hardware is still good, and the software stack has been improving over time. AMD hasn't ignored graphics (RDNA 4 was a very solid design, based on the Radeon 9070 XT's performance, considering its mid-level design), FPGAs, and of course CPUs for consumers/business/servers. As a result, even if AI goes down, AMD isn't in danger and will still continue to do well. NVIDIA LIED in saying that AI eliminates the need for a CPU; people still need a CPU in their computers, in their servers, and in professional workstations.
Intel is a mess: the CPU division isn't doing all that well, graphics are weak, and AI isn't anywhere, so if anything, Intel should sink just because it has no products that seem to be impressive.
Re: The Wall Street crowd is clueless (Score:1)
Remember when the housing bubble popped, but prices are higher than ever again?
Re: (Score:2)
Are houses as easily produced and sold as video cards?
Re: (Score:2)
> NVIDIA, seen as the dominant player in AI, has largely ignored development when it comes to actual graphics rendering for the past 1.5-2 years. The RTX 5000 series isn't significantly faster than the RTX 4000 generation outside of... AI performance improvements. ... AMD hasn't ignored graphics (RDNA 4 was a very solid design, based on the Radeon 9070 XT's performance, considering its mid-level design), FPGAs, and of course CPUs for consumers/business/servers.
Nvidia has continued with graphics R&D, but it has slowed down pushing the boundaries of gaming products. Why? Well, their current strategy has seen their market share increase over the last few years. For all of AMD's fan support, its apparent technical prowess hasn't translated into sales. Some of that is due to the realities of the gaming market. Compared to CPUs or data center GPUs, the gaming market is tiny. Furthermore, even at AMD, gaming is a minor target, way behind CPUs and the push into data centers.
Predictions are hard, especially about the future (Score:2)
GPUs were designed to manipulate image data and put pixels on screens. Later, it was discovered that they could be used for other things.
Today's LLMs require a particular type of math processing. GPUs do it well, but special-purpose chips like TPUs and the mega-chips from Cerebras may be better suited to the work.
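For the curious, that "particular type of math" is almost entirely dense matrix multiplication. A toy single-head attention step in plain NumPy (illustrative shapes, not any real model's) shows that every heavy step is a matmul:

```python
import numpy as np

# Toy single-head attention: every expensive step is a dense matmul,
# which is exactly what GPUs, TPUs, and Cerebras-style chips accelerate.
seq_len, d_model = 128, 64          # illustrative sizes only
rng = np.random.default_rng(0)
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))

q, k, v = x @ w_q, x @ w_k, x @ w_v            # three matmuls
scores = q @ k.T / np.sqrt(d_model)            # another matmul
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True) # softmax: the one non-matmul step
out = weights @ v                              # and one more matmul
print(out.shape)  # (128, 64)
```

A chip that does nothing but feed large multiply-accumulate arrays covers nearly all of that work, which is the whole premise behind TPUs.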
Scientists and engineers are looking into other architectures, including analog.
Yes, there is a strong chance that NVIDIA may remain the leader, but there is also a chance that new algorithms or hardware will displace it.
TPU vs. Nvidia (Score:2)
Will TPU v7 finally be able to challenge the dominance of Nvidia GPUs in the data center? Some projections have the TPU garnering 5-10% of the data center processor market. That sounds dubious to me. If TPU is a viable alternative that is likely to be significantly cheaper than Nvidia, why would its market share be capped at 5-10%?
The entire world and every Nvidia customer are begging for an alternative. If TPU is truly viable, then it should immediately challenge Nvidia more forcefully.
Not so odd (Score:3)
They're bragging that you can switch between AI models freely when you're using their hardware. The model is replaceable and therefore fungible because of the hardware capabilities that they provide.
Depending on how much the industry and investors want to chase a flavor-of-the-month, that capability could be very appealing.
My employer doesn't operate in this space, so I don't know if that's a major selling point. But it never hurts to tout every advantage.
Re: (Score:3)
It's pretty important if you're working in a developing field. The original TPU couldn't do floating point, so it wasn't really useful for training. IIRC they also work best with matrices whose dimensions are multiples of fairly big numbers (128? 256?), with later generations working best with bigger matrices.
That's great for the current focus on gigantic attention matrices but not so great if the next big thing can't be efficiently shoehorned into that paradigm.
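A quick sketch of what that alignment constraint costs: assuming a 128-wide tile (the figure recalled above, not a confirmed vendor spec), padding awkward dimensions up to the next multiple wastes a measurable fraction of the array:

```python
# Fraction of a 128-aligned matrix unit wasted when dimensions don't
# divide evenly by the tile size. TILE = 128 is an assumption taken
# from the comment above, not a vendor specification.
TILE = 128

def padded(dim: int) -> int:
    """Round dim up to the next multiple of TILE (ceiling division)."""
    return -(-dim // TILE) * TILE

for m, n in [(128, 128), (130, 130), (1000, 1000), (4096, 4096)]:
    util = (m * n) / (padded(m) * padded(n))
    print(f"{m}x{n}: padded to {padded(m)}x{padded(n)}, utilization {util:.0%}")
```

Gigantic attention matrices divide evenly or amortize the padding, but an architecture built around small or ragged shapes would eat that overhead on every call.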
Re: Not so odd (Score:1)
FPGA, anyone?
Re: (Score:2)
The semi-gibberish headline isn't exactly a model of clarity either... how about "nVidia responds to fears of Google catching up".