Every one of NVDA's major customers already has a working AI-chip prototype. This includes AMZN, GOOG, and META (and if I recall correctly, even MSFT).
And as I said, the real appeal of NVDA is their software ecosystem, not their HW. So for now, they are safe. BUT their competition knows this too and is working on boosting its footprint in that ecosystem. Except for CUDA, the rest of the stack is OSS. And nobody wants to program in CUDA if they have the option of using PyTorch.
Compared to CUDA, PyTorch on its own is too slow. But nobody uses Python in its native form for this kind of work; the Python layer links to compiled libraries that run fast. And this is what AMD, IBM, and others are contributing to (PyTorch is OSS): hardware-compatible versions of those libraries for their own chips.
Takes a bit of time (my guess is no more than 3 years, but for all I know it could come in Q4). And when it does, grab a bag of popcorn and watch the show.
Big customers who are currently buying Nvidia's chips are probably looking at building their own using the same fabs.
That is a neat factoid. Good for you. I remember when I hauled fiber optics from the Sprint office 2 floors above us to our little server room and then legitimately claimed we were one of the ISPs that owns its own fiber and is directly connected to a Tier 1 provider. It was a big, bold claim for a startup ;)
Did you know "Kirk's Financial Links" was THE FIRST investing-related site listed on Yahoo!'s early list of links?