AI Chip Competition From Start-Ups: Interesting comments from poster grxbstrd on Seeking Alpha:
**********************************************************************************************************************
I've been invested in NVDA for nearly 20 years. I hear and appreciate your investing philosophy and agree for the most part.
Watching this stock go parabolic over the last 5 years has been a once-in-a-lifetime event for which my family and I are extremely lucky and grateful. But having held through 85% losses, as in 2007/08, I have a different perspective on this company: the sense that the gains can be fleeting.
The CEO used to talk about changing the world, and a lot of folks laughed. He invested in CUDA when GPGPU meant nothing except to a bunch of oil-and-gas and fintech quants, and everyone asked why he was wasting money on it. But this is when NVDA really got back on my radar, and so watching NVDA's competition is one of the most important things I can spend my time on. NVDA convinced me with their execution that this time is no fluke. I've been to AI hardware summits; I've heard Cerebras and Graphcore and SambaNova all talk. If you want a summary, go look up Karl Freund's recent article on Forbes summarizing Hot Chips.
What I know about NVDA's solutions comes down to two things:
1. GPUs provide dense, flexible, and programmable computation.
2. NVDA's GPUs have a software stack, a moat, around them that is not going to be replicated (breached) any time soon.
Stories of the end of days for GPUs started with Graphcore and their Colossus, which was "orders of magnitude" faster than a GPU. Until it wasn't. Now Graphcore is a low-price leader for instances among those who rolled the dice with them, and the same goes for Habana Labs.
Chips like Tenstorrent's maybe have a future. First, I heard Jim Keller speak recently, and he was lamenting the state of his software and resources and the need to beef up that part of his offering. So let's just say that's an issue. But second, what are they trying to do? I guarantee Jim can build a chip that does image classification better than a GPU; there is no question. But can that same chip reconfigure itself to run 3 or 4 simultaneous NNs for a recommender, then 4 or 5 NNs for a content- and context-aware NLP simulation, asking and answering with 20 ms latency, and then reconfigure to run a virtual-desktop, ray-traced design environment for a game? Beyond that, NVDA's software is certified for 24/7 customer-facing production at all of the largest cloud providers in the world.
This is what GPUs are doing for CSPs. And this is what's standing in front of anyone who wants to replace GPUs in the enterprise, including Jim Keller and his Tenstorrent startup: utility, performance, flexibility, low total cost of ownership, high ROI, bulletproof performance and reliability. You have to admit that's a high bar for a chip, no?
Spending on GPUs is only going north; NVDA is intimating that it is sold out through 2022 (including aggressive internal growth goals).
But now, while startups are still trying to match GPU performance, NVDA is shifting focus to the *entire data center*, to network and storage performance (and security) with DPUs. And the Grace CPU will be here at the end of 2022, which is just going to raise the bar again.
Startups are fumbling around with chips, trying to catch up on software, and trying to understand the implications of data-center performance, as Graphcore, Cerebras, and SambaNova all learned, while NVDA is running at full stride, upping the bar as it goes.
I say, bring the competition on. Their first step should be submitting results to MLPerf. I await Jim Keller's sure-to-be-impressive results.