The next AI pivot will be toward efficiency and lowering costs, ex-Facebook privacy chief says
Published Tue, Dec 23 2025, 10:26 AM EST · Updated 4 Hours Ago
 Jaures Yip
Key Points
- Ex-Facebook privacy chief Chris Kelly said that AI companies will pivot towards creating efficiencies in training AI models.
- As data center demand surges, lowering costs and power consumption will be key.
- The data center market has accumulated over $61 billion in infrastructure dealmaking in 2025 as hyperscalers rush into a “global construction frenzy.”

VIDEO (05:25): Expect a drive toward efficiencies in AI in 2026, says Chris Kelly
Former Facebook Chief Privacy Officer Chris Kelly said Tuesday that the next phase of the artificial intelligence boom will focus on becoming more efficient.
As major AI players race to churn out the infrastructure needed to support AI workloads, Kelly told CNBC’s “Squawk Box” that the industry will need to streamline these power-intensive buildouts.
“We run our brains on 20 watts. We don’t need gigawatt power centers to reason,” Kelly said. “I think that finding efficiency is going to be one of the key things that the big AI players look to.”
Kelly, who was also general counsel at Facebook, added that the companies able to reach a breakthrough in lowering data center costs will emerge as AI winners.
The data center market has accumulated over $61 billion in infrastructure dealmaking in 2025 as hyperscalers have rushed into a global construction frenzy, according to S&P Global.
OpenAI alone has made over $1.4 trillion in AI commitments over the next several years, including massive partnerships with GPU leader Nvidia and infrastructure giants Oracle and CoreWeave.
But the data center frenzy has raised growing concerns about where the power to support these buildouts will come from, given an already strained electric grid.
Nvidia and OpenAI announced in September a project that included at least 10 gigawatts of data centers, which is roughly the equivalent of the annual power consumption of 8 million U.S. households.
Ten gigawatts is also around the same amount of power as New York City’s peak summer demand in 2024, according to the New York Independent System Operator.
Cost concerns were further fueled after DeepSeek launched a free, open-source large language model in December 2024 that the company claimed was built for under $6 million, significantly less than U.S. competitors have spent.
Kelly said he expects to see “a number of Chinese players come to the fore,” especially following President Donald Trump’s recent decision to approve the sale of Nvidia’s H200 chips to the country.
Open-source models, especially out of China, will give people access to “basic levels of compute” and generative and agentic AI, Kelly added.