AI’s soaring energy requirements have become a defining challenge for tech leaders. Google Cloud CEO Thomas Kurian recently called energy availability the sector’s most pressing bottleneck, outlining how the company is reshaping its infrastructure strategy in response.
Speaking at Fortune’s Brainstorm AI conference in San Francisco, Kurian said energy concerns were flagged years ago at Google, even before the rise of large language models. Anticipating that data centers and power supply would face the same pressure as advanced chips, the company focused early on designing highly energy-efficient computing systems.
The global numbers explain the urgency. The International Energy Agency estimates that some AI-optimized data centers already consume as much power as 100,000 homes, while the largest new facilities under development could use twenty times that amount. Meanwhile, real estate consultancy Knight Frank projects that worldwide data-center capacity will grow by 46 percent over the next two years, adding nearly 21,000 megawatts of new demand.
Against this backdrop, Google Cloud has adopted a three-part approach to manage the energy footprint of its rapidly expanding AI workloads.
The first pillar is diversification of power sources. Kurian stressed that not every form of energy can handle the intense, sudden electricity surges that training large AI models demands: training clusters draw massive power spikes that some generation systems simply cannot accommodate reliably.
The second focus is maximizing efficiency inside the data centers themselves. Google applies AI to operate its control systems, optimizing cooling and heat exchange processes so that energy already brought on site is used more effectively and with less waste.
The third element looks beyond current solutions. Kurian said Google is exploring new, fundamental technologies aimed at generating energy in novel ways, though he did not share details on what those efforts entail.
Alongside its internal strategy, Google is also building partnerships to strengthen infrastructure supply. Earlier the same day as Kurian’s remarks, utility giant NextEra Energy announced an expanded collaboration with Google Cloud, including plans to develop new U.S.-based data-center campuses paired with dedicated power plants.
Industry leaders increasingly agree that electricity access now ranks alongside semiconductor supply and model innovation as a decisive factor in AI’s future. Nvidia CEO Jensen Huang has also highlighted infrastructure constraints, pointing to the lengthy timelines for building data centers in the United States compared with China’s faster construction capabilities. It is another reminder that the race to scale AI is becoming as much about physical resources as digital breakthroughs.
