NVIDIA GPU Demand To Exceed Supply As Green Team Bets On ChatGPT, Thousands of AI Chips Expected To be Incorporated By Tech Giants
Hassan Mujtaba
Feb 11, 2023
Excerpts
- As reported earlier, ChatGPT and other language/image/video generation tools rely heavily on AI processing power, which is where NVIDIA's main strength lies. This is why major tech companies leveraging ChatGPT are utilizing NVIDIA's GPUs to power their growing AI requirements. NVIDIA's prowess in this industry might just cause a shortage of the company's AI GPUs in the coming months.
- Major tech giants such as Microsoft and Google are also planning to integrate ChatGPT-like LLMs into their search engines, reports Forbes. For Google to run such a model on every search query, it would require 512,820 A100 HGX servers with a total of 4,102,568 A100 GPUs, which would amount to roughly $100 billion in capex for servers and networking alone.
- Deploying current ChatGPT into every search done by Google would require 512,820.51 A100 HGX servers with a total of 4,102,568 A100 GPUs. The total cost of these servers and networking exceeds $100 billion of capex alone, of which Nvidia would receive a large portion. This is never going to happen, of course, but it's a fun thought experiment if we assume no software or hardware improvements are made.
- It remains to be seen how NVIDIA responds to this huge demand from the AI segment. The GPU giant is expected to announce its earnings for Q4 FY23 on the 22nd of February, 2023.
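The server and GPU figures quoted above can be sanity-checked with quick back-of-envelope arithmetic. The sketch below uses the article's own numbers plus one outside fact: an HGX A100 chassis holds 8 GPUs (the standard configuration). The implied per-server cost is derived, not stated in the article.

```python
# Back-of-envelope check of the estimate cited from Forbes.
# Server count and capex come from the article; 8 GPUs per HGX
# A100 server is the standard configuration (an assumption here).
servers = 512_820.51          # A100 HGX servers for every Google query
gpus_per_server = 8           # GPUs in one HGX A100 chassis
total_gpus = servers * gpus_per_server

capex = 100e9                 # quoted server + networking capex, USD
implied_cost_per_server = capex / servers

print(f"{total_gpus:,.0f} A100 GPUs")                    # ~4.1 million
print(f"${implied_cost_per_server:,.0f} per server (implied)")
```

Multiplying out gives roughly 4.1 million GPUs, matching the article's 4,102,568 figure, and an implied all-in cost of about $195,000 per server including networking.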
wccftech.com