All of these circular deals in a fast-moving space without solid business plans are designed for maximal hype and public mind-share rather than for anything people will want to keep around in the long run. Remember how Google had the undisputed lead in the latter half of the 2010s with its leading hardware and neural-net designs (they invented transformers...); they seem to be playing second fiddle to Nvidia in both training and inference now. OpenAI's MO has always been to go BIG on infrastructure using the best existing ideas along with judicious tweaks to the training process, and Altman has already said it will end in tears for some. They're going really big now, never mind bringing along any kind of actual business plan.
The largest language models are a significant scientific breakthrough, but they are just that: language models, not artificial intelligence. They have significant alignment issues and continue to hallucinate, so I don't use them except to point me in the right general direction when I'm exploring something new; I certainly don't treat or trust them as intelligent companions. That said, I do think the more focused, smaller models are adding economic value, and some companies do have them in live deployments: Qualcomm with its cellular subsystems, Nvidia with its gaming graphics enhancements, and proof-based math networks that score well on math competitions (results that still have to be checked and understood). I could also see specifically tailored models finally controlling fusion plasmas or predicting the weather well.
Being first or second is nice, but we are entering the stage where froth is building into a bubble, and those plans hardly sound like a sustainable platform people will care about in even five years' time. I think the talk of 6 GW of power to run inference infrastructure without tangible economic returns will not be sustainable with AMD chips (maybe with no one's chips...), and even if customers come, the inefficiency of running these things over the course of a year or two could well dwarf the upfront costs versus rival solutions. If we shovel those inefficiencies back onto the supplier, they'd be a burnt hole in the ground...