The disruption caused by Generative AI is not only deeply impacting the Tech industry and (soon) almost every economic activity; it will also have geopolitical, environmental, and energy consequences.
Since the early 2000s, cloud computing has rapidly grown to become the dominant IT platform, delivering digital and communication services of every kind around the planet. Companies like Amazon (AWS), Microsoft (Azure), and Google Cloud, commonly referred to as hyperscalers, have built over the past two decades a worldwide infrastructure of data centers powering e-commerce, streaming, social networks, messaging, gaming, and the many other services that form the digital economy.
This cloud infrastructure, which was already expanding at an elevated rate (15%+ per annum over the past decade), is now getting a shot of steroids with the advent of ChatGPT and GenAI. But, as previously discussed (here and here), the training of multi-billion-parameter models is bringing the current IT infrastructure to its knees. Hence, a new generation of GenAI-dedicated data centers is being built around the world. These High-Performance Computing (HPC) platforms will operate hundreds of clusters, each running thousands of accelerators (GPUs and TPUs) interconnected through a high-bandwidth network and linked to storage racks holding petabytes of data.
The global energy consumption of this GenAI HPC infrastructure is expected to shoot through the roof, as the power requirements of its two main components, computing and cooling (which each account for roughly 40% of data center energy consumption), are rising steeply. AI accelerators will soon draw more than 1 kW each (Nvidia Blackwell), compared to roughly 700 W for the previous generation of AI chips (H100). This higher power draw and heat output at the chip level will in turn require additional (liquid) cooling solutions, further increasing the electricity bill.
With the power consumption of these new AI data centers expected to range from 300 to 500+ megawatts, abundant and inexpensive power and a reliable (smart) grid are of utmost importance to data center operators. These constraints, coming on top of chip export restrictions, will naturally limit the regions and countries where the new AI data centers can be built, much as crypto miners chased the lowest and most reliable energy costs and set up their mining operations in the most favorable jurisdictions.
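As a rough, purely illustrative sanity check of those megawatt figures, the sketch below compounds per-accelerator power into campus-scale demand. The per-chip wattage comes from the paragraph above; the accelerator count, server overhead, and PUE (power usage effectiveness) are our own assumptions, not vendor or operator data.

```python
# Back-of-the-envelope sketch: per-accelerator power -> campus-scale demand.
# Per-chip figures come from the text above; the accelerator count, server
# overhead, and PUE are illustrative assumptions, not vendor/operator data.

ACCELERATORS = 300_000        # assumed number of GPUs/TPUs on one AI campus
CHIP_POWER_W = 1_000          # ~1 kW per next-gen accelerator (vs ~700 W for an H100)
SERVER_OVERHEAD = 0.35        # assumed extra draw for CPUs, memory, networking, storage
PUE = 1.3                     # assumed power usage effectiveness (cooling, conversion losses)

it_power_mw = ACCELERATORS * CHIP_POWER_W * (1 + SERVER_OVERHEAD) / 1e6
campus_power_mw = it_power_mw * PUE

print(f"IT load:     {it_power_mw:,.0f} MW")     # ~405 MW
print(f"Campus load: {campus_power_mw:,.0f} MW") # ~527 MW, within the 300-500+ MW range
```

Under these assumptions a single large campus draws roughly 500 MW of continuous load; trimming the accelerator count or assuming a leaner PUE moves the estimate toward the lower end of that range, which is why grid access weighs so heavily on site selection.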
In terms of environmental impact, no accurate predictions exist yet, but we believe the GenAI impact will be quite limited, given the hyperscalers' many stated commitments to rely exclusively (or as much as possible) on green energy sources. As an example, the training of GPT-3 generated about 588.9 metric tons of CO2, equivalent to the annual emissions of 128 passenger vehicles (source: Semianalysis).
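For readers who want to verify that equivalence, the arithmetic is simple; the GPT-3 figure is the one cited above, and the roughly 4.6 tons of CO2 per vehicle per year is the commonly used EPA estimate for a typical passenger car.

```python
# Quick check of the CO2 equivalence cited above.
gpt3_training_tons_co2 = 588.9   # metric tons of CO2 for GPT-3 training (per the text)
car_tons_co2_per_year = 4.6      # typical passenger vehicle, EPA estimate (~4.6 t CO2/year)

equivalent_cars = gpt3_training_tons_co2 / car_tons_co2_per_year
print(f"~{equivalent_cars:.0f} passenger vehicles per year")  # -> ~128, matching the text
```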
But when it comes to energy demand, the impact is likely to be significant, with data centers expected to drive a surge in electricity consumption. While electricity demand has grown roughly in line with GDP in recent years, it is now expected to accelerate to 5%-6% annual growth, according to the US Department of Energy's undersecretary for infrastructure. Globally, the whole (AI) cloud infrastructure is expected to consume 5% of the world's total electricity generation by 2030, up from only 2% today.
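To get a feel for what a move from roughly 2% to 5% of global generation implies, here is a hedged sketch: only the 2% and 5% shares come from the text, while the baseline of roughly 30,000 TWh of annual world electricity output and its growth rate are our own round-number assumptions.

```python
# Hedged sketch: what a 2% -> 5% share of global electricity by 2030 would imply.
# Only the 2% and 5% shares come from the text; the ~30,000 TWh baseline and
# its growth rate are rough, round-number assumptions.

world_twh_today = 30_000      # assumed global electricity generation today, in TWh
world_growth = 0.025          # assumed ~2.5% annual growth in total generation
years = 6                     # roughly from today through 2030

world_twh_2030 = world_twh_today * (1 + world_growth) ** years
dc_twh_today = 0.02 * world_twh_today
dc_twh_2030 = 0.05 * world_twh_2030

implied_cagr = (dc_twh_2030 / dc_twh_today) ** (1 / years) - 1
print(f"Data center demand: {dc_twh_today:,.0f} TWh -> {dc_twh_2030:,.0f} TWh")
print(f"Implied growth rate: ~{implied_cagr:.0%} per year")
# Under these assumptions, demand roughly triples, an implied CAGR of close to 20%.
```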
Obviously, this strong demand environment should drive a pick-up in grid investments and, specifically, in capex related to data center power infrastructure. Schneider stands out here as a leading player, with a wide portfolio of data center products and services spanning power distribution, medium- and low-voltage transformers, cooling, lighting control, and the sensors and meters used to monitor the infrastructure.
Another consequence of this rising power consumption is that intermittent green energy sources such as solar and wind will not be able to deliver the constant power flow required. This is why hyperscalers are increasingly turning to carbon-free nuclear energy, which offers reliable 24/7 baseload power. Amazon just announced the acquisition of a nuclear-powered data center campus for $650 million, while Microsoft is seriously considering Small Modular Reactors (SMRs) to power its fleet of upcoming AI data centers.
The GenAI wave (in addition to electric vehicles) is hence putting nuclear power back in the spotlight, after a lost decade following the Fukushima disaster, with expected growth of more than 40% in future power plant buildouts and significantly higher budgets allocated to SMR research and design.
We believe these initial nuclear deals and projects announced by the world's largest data center operators are just the beginning of a secular trend, one that is also likely to benefit other green energy sources such as hydrogen, used in fuel cells to replace the backup power currently provided by traditional diesel generators. Some successful tests with Bloom Energy equipment were already carried out several years ago.