I recently returned to a position in Q, after having sold it two years ago. I did so for a couple of reasons. First, its price remains depressed in comparison to other semiconductor companies. That’s already been discussed.
More to the point, I did so in view of its AI posture: a focus on inference and, most importantly, on the most energy-efficient implementations, especially in Data Centers or at the “edge.”
The approach that enables Data Centers to operate from existing power utility infrastructure, or with only very modest increments, will prevail. And here’s why:
Those who believe they can just ask for and receive several gigawatts (GW) of additional power supply whenever or wherever they want are in dreamland, light-years from earth. They have NO clue what it takes to generate just one GW, 24/7, or how long it takes to add that much to a localized grid. For a conventional gas-turbine plant (actually more like three or four smaller machines paralleled), completion probably takes three to five years, considering equipment acquisition and availability, environmental permitting, delivery, construction, and interconnection. And likely three times that for a nuclear plant.
Just to put things in perspective: Vogtle, a Georgia Power nuclear plant, just completed the addition of two new units totaling roughly 2.2 GW. The cost was about $30 Billion, and the project took about 15 years from approval to completion. And someone in Virginia asked for 40 GW for Data Centers?
Each generator requires a bank of three transformers to isolate it and step up the voltage. Each one weighs roughly 350 Tons. Yes, 350 Tons each. Think about how easily those would be obtained and transported. And then there are all the metal buses and immense circuit breakers needed to interconnect to the grid. Only a fool would buy just three transformers, since if one failed, it would probably take at least a year to find a replacement, to say nothing of transporting it.
Which brings me to the point: if it takes half a decade or more to install one GW of significant added generation capacity, what iteration of Nvidia processor will be hawked at that point? Will it still be what they’re selling now? What if some innovative, creative supplier that began its existence in a low-power, battery-dependent world comes along with CPUs, GPUs, NPUs and memory that can make do within existing grid capability? (Hint: I think we know one)
Suffice it to say, my long-ago career experience with a large power utility in Silicon Valley pointed me toward Q. It’s likely among the first to recognize the reality of the “energy noose” around the current AI Data Center players.