Nvidia Takes an Added Role Amid AI Craze: Data-Center Designer

Beyond its chips, the company is playing a growing role in shaping the server farms where AI is produced and deployed
By Asa Fitch
Sept. 2, 2024 5:30 am ET
Holding more of the value in AI data centers adds revenue and makes Nvidia’s offerings stickier for customers.

Nvidia dominates the chips at the center of the artificial-intelligence boom. It wants to conquer almost everything else that makes those chips tick, too.

Chief Executive Jensen Huang is increasingly broadening his company’s focus—and seeking to widen its advantage over competitors—by offering software, data-center design services and networking technology in addition to its powerful silicon brains. He is trying to build Nvidia into more than a supplier of a valuable hardware component: a one-stop shop for all the key elements in the data centers where tools like OpenAI’s ChatGPT are created and deployed, or what he calls “AI factories.”

Huang emphasized Nvidia’s growing prowess at data-center design following an earnings report Wednesday that exceeded Wall Street forecasts. The report came days after rival Advanced Micro Devices agreed to pay nearly $5 billion to buy data-center design and manufacturing company ZT Systems to try to gain ground on Nvidia.

“We have the ability fairly uniquely to integrate to design an AI factory because we have all the parts,” Huang said in a call with analysts. “It’s not possible to come up with a new AI factory every year unless you have all the parts.”

It is a strategy designed to extend the business success that has made Nvidia one of the world’s most valuable companies—and to insulate it from rivals eager to eat into its AI-chip market share, estimated at more than 80%. Gobbling up more of the value in AI data centers both adds revenue and makes its offerings stickier for customers.
Nvidia CEO Jensen Huang, right, emphasized the company’s growing prowess at designing ‘AI factories.’

Nvidia is building on the effectiveness of its 17-year-old proprietary software, called CUDA, which enables programmers to use its chips. More recently, Huang has been pushing resources into a superfast networking protocol called InfiniBand, after acquiring the technology’s main equipment maker, Mellanox Technologies, five years ago for nearly $7 billion. Analysts estimate that InfiniBand is used in most AI-training deployments.

Nvidia is also building a business that supplies AI-optimized Ethernet, a form of networking widely used in traditional data centers. The Ethernet business is expected to generate billions of dollars in revenue within a year, Chief Financial Officer Colette Kress said Wednesday.

More broadly, Nvidia sells products including central processors and networking chips for a range of other data-center equipment that is fine-tuned to work seamlessly together. And it offers software and hardware setups catered to the needs of specific industries such as healthcare and robotics.

“He has verticalized the company,” Raul Martynek, CEO of data-center operator DataBank, said of Huang’s strategy. “They have a vision around what AI should be, what software and hardware components are needed to make it so users can actually deploy it.”
Competition is growing
Nvidia’s competitors are responding. The Advanced Micro Devices deal to buy ZT Systems is fundamentally about gaining skills in how data centers are built. AMD plans to sell ZT’s manufacturing operations and retain its design experts to mount a more formidable challenge to Nvidia. AMD has made other acquisitions to bolster its data-center offerings in recent years, including of programmable-chip company Xilinx and data-center networking company Pensando in 2022.
Lisa Su, CEO of Advanced Micro Devices, which agreed to buy a data-center design and manufacturing company.

Other chip suppliers, from Intel to AI-chip startups such as Cerebras Systems and SambaNova Systems, are offering services and systems that do a large share of the work for customers who want to build and operate AI tools.

Analysts and industry executives say the trend partly reflects a customer preference for plug-and-play AI computing infrastructure that helps them move quickly amid an AI boom where speed is paramount. Doing more of the heavy lifting may help chip makers better compete as AI investment shifts from experienced tech giants such as Meta Platforms and Microsoft to a wider swath of businesses with less expertise.

“Enterprises typically go with the turnkey solution,” said Srini Pajjuri, an analyst at investment firm Raymond James. “They don’t have the resources, they don’t have the technology or know-how” to cobble together the equipment themselves, he said.
Risks in the strategy
There are some risks in the strategy, Pajjuri said. As AI computing becomes more competitive and the rush to set up AI factories subsides, customers might explore alternatives to many of the proprietary technologies that Nvidia has built up around its AI chips.

“Right now, time to market is the most important thing, so they’re adopting solutions from Nvidia, but as the technology matures, Nvidia might not be able to capture more of the pie,” he said.

Nvidia also could confront regulatory hurdles that have sprung up for many companies in the past that have come to dominate a market and extend their influence in it. The company is under scrutiny for its market practices in Europe, and its offices in France were raided this past year, although no charges have been brought against the company there.

On Wednesday, Huang described Nvidia’s role as one of a coordinator and designer that could give guidance on how to set up complex AI infrastructure, but he emphasized that Nvidia isn’t going to do everything. With its next-generation Blackwell AI chips, set to ship out late this year or early next year, it is offering a design for an entire rack of computing equipment for the first time—a cabinet about 2 feet wide and 6 feet high. But Nvidia itself won’t make the racks.

“We know how to design the AI infrastructure, provide it the way that customers would like it and let the ecosystem integrate it,” Huang said.

Write to Asa Fitch at asa.fitch@wsj.com
Copyright ©2024 Dow Jones & Company, Inc. All Rights Reserved.
Conversation (24 Comments)
HP GATES
44 minutes ago
If you look at NVDA's growth strategy/acquisitions and that of its most serious competitor AMD, you see striking similarities. They both will succeed, but because of CUDA, NVDA has a huge edge. As the author points out, time-to-market is the priority for NVDA and AMD customers. Jensen's 17-year investment in CUDA enables customers to develop AI solutions (SW) much faster. An interesting sidenote: Jensen and AMD CEO Lisa Su are related, 2nd cousins once removed.
Ronald Carlson
1 hour ago
We own and buy and sell NVDA daily
Joe Waller
2 hours ago
Own and hold NVDA, equity or options... or both.
Yi Sun
2 hours ago
Sell you all the tools to dig gold but never do it themselves. Emm...
Rick C
2 hours ago
Back during the CA gold rush the majority of miners found no gold or very little. The guys who made the bulk of the money sold shovels.
Ray Jansen
1 hour ago
Levi's...
Connie Cunningham
2 hours ago
AI reminds me of the arms race. Where, when, and how will it end? I believe AI may be the modern-day Frankenstein. I imagine lots of terrible things: tens of millions permanently jobless; mandatory brownouts for homes, not businesses, to ease the stress on the power grid; accidental release of weaponry by the military on its own citizens; etc. I can only hope AI kills itself and leaves no collateral damage.
Rick C
2 hours ago
Mandatory blackouts for sure. So far the only thing AI seems to be good at is summarizing meeting minutes and writing email. That doesn’t sound like a job killer to me.
Thomas Tag
3 hours ago
I just need it to hit $150
Richard Williams
1 hour ago
Analysts' forecasts for NVDA: JPM $155, BAC $165, WFC $165, MS $150.
Joe Waller
2 hours ago
...next Spring
Steven C
3 hours ago
More ways to make money. This is the difference between people who can make BIG money and people who complain about inflation.
George Lai
3 hours ago
The new GPUs have high power density, so it's wise to invest in server and data center design to cool them chips. The new Blackwell is already pushing TSMC's CoWoS-L packaging to the limit -- by keeping them chips cool you give the operating envelope a bit more margin. Server + Data Center design can add % to performance gain for relatively small cost.
Pauluz R
1 hour ago
Barron's has a favorable article this weekend about Vertiv Systems (VRT). Liquid cooling technology for high-heat transaction processing, integrating the cooling directly into the rack construction. Super Micro (SMCI) is also a major player, but that firm has some issues to resolve.
CUMAR SREEKUMAR
3 hours ago
Reminds one of the way Apple developed once Steve Jobs came back: into multiple enterprises including iTunes, Apple stores, computer and telephone device making and so forth. Jobs was also accused of hubris, BTW!
Jim Kirtland
3 hours ago
This brings to mind Amazon Web Services (“AWS”). The company is no secret (60 Minutes did a story on it) but is little known outside the industry, yet it generates Amazon’s largest operating income.
R Dub
1 hour ago
Not so sure about little being known about AWS. Many may not know that Microsoft's Azure cloud is on track to exceed AWS revenue by 2026 at a dizzying ~$150 billion! Much of the "heavy lift" of AI processing can be accomplished in cloud environments (AWS, Google, IBM, Microsoft, ...). Though, requirements still exist for some to operate using local AI processing in various sorts of isolated enclaves. NVDA better be in on data center design. A single 7' rack of this stuff running full bore can exceed all power and cooling capabilities in many small-mid size on-premises computer rooms.
Douglas Herz
4 hours ago
Hubris inevitably arrives at massively successful companies and, predictably, drains their resources.
Richard Berlin
5 hours ago
The lowest common denominator. Eventually all that will be produced will be the same simple-minded conclusion. The vast pool of input information becomes reduced rapidly over time... differences diminish... and one is left with... emptiness.
David Eyke
5 hours ago
I think NVIDIA is engaging in price gouging and needs to be stopped “for breaking the rules.”
Joe Waller
2 hours ago
Yes, the answer is more governmental control! Exactly what Kamala and the Dems/progressives want! Most important in voting: we need to maintain separation of powers. A split congress/presidency party-wise is critically important for moderation and compromise.
Jim Kirtland
3 hours ago
Baloney. Nvidia is not price gouging; rather, it is participating in the new opportunity economy.
David Eyke
3 hours ago
??
Mike Spight
5 hours ago
Holding!