
Is the rise of genAI about to create an energy crisis?


The voracious demand for generative AI (genAI) tools is driving a significant increase in the use of power-hungry GPUs and TPUs in data centers, some of which are scaling up from tens of thousands to more than 100,000 units per server farm.

With the shift to cloud computing and genAI, new data centers are growing in size. It is not unusual to see new facilities being built with capacities from 100 to 1,000 megawatts — roughly equivalent to the energy requirements of 80,000 to 800,000 homes, according to the Electric Power Research Institute (EPRI).
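EPRI’s homes equivalence implies an average household draw of roughly 1.25 kilowatts. A minimal sketch of that conversion, assuming that per-home figure (inferred from the numbers above, not stated in the EPRI report):

```python
# Rough check of EPRI's megawatts-to-homes equivalence. The ~1.25 kW
# continuous draw per household (~11,000 kWh/year) is an assumption
# inferred from the figures above, not a number from the EPRI report.
AVG_HOME_KW = 1.25

for facility_mw in (100, 1_000):
    homes = facility_mw * 1_000 / AVG_HOME_KW  # MW -> kW, then divide per home
    print(f"{facility_mw} MW ~= {homes:,.0f} homes")
# 100 MW ~= 80,000 homes; 1,000 MW ~= 800,000 homes
```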

AI-related energy consumption is expected to grow about 45% a year over the next three years. For example, the most popular chatbot, OpenAI’s ChatGPT, is estimated to use about 227 million kilowatt-hours of electricity annually to process 78 billion user queries.

To put that into perspective, the energy ChatGPT uses in one year could power 21,602 US homes, according to research by BestBrokers, an online service that uses big data to calculate trading odds. “While this accounts for just 0.02% of the 131 million U.S. households, it’s still a staggering amount, especially considering the US ranks third in the world for household numbers,” BestBrokers wrote in a new report.

GenAI models are typically much more energy-intensive than data retrieval, streaming, and communications applications — the main forces that drove data center growth over the past two decades, according to EPRI’s report.

At 2.9 watt-hours per request, ChatGPT queries are estimated to require 10 times the electricity of traditional Google queries, which use about 0.3 watt-hours each. And emerging, computation-intensive capabilities such as image, audio, and video generation have no precedent, according to EPRI.
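A quick back-of-the-envelope check ties these estimates together. The sketch below assumes roughly 10,500 kWh of annual consumption per US home, a value consistent with EIA averages but not spelled out in the BestBrokers report:

```python
# Deriving the per-query and households figures from the annual estimates.
ANNUAL_KWH = 227_000_000          # estimated ChatGPT electricity use per year
ANNUAL_QUERIES = 78_000_000_000   # estimated user queries per year
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average US household consumption

wh_per_query = ANNUAL_KWH * 1_000 / ANNUAL_QUERIES  # convert kWh to Wh
homes_powered = ANNUAL_KWH / HOUSEHOLD_KWH_PER_YEAR
google_ratio = wh_per_query / 0.3                   # vs. ~0.3 Wh per Google query

print(f"~{wh_per_query:.1f} Wh per query")      # ~2.9 Wh
print(f"~{homes_powered:,.0f} homes per year")  # ~21,600 homes
print(f"~{google_ratio:.0f}x a Google query")   # ~10x
```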

There are now nearly 3,000 data centers in the US and that number is expected to double by 2030. While genAI applications are estimated to use only 10% to 20% of data center electricity today, that percentage is rising quickly. “Data centers are expected to grow to consume 4.6% to 9.1% of U.S. electricity generation annually by 2030 versus an estimated 4% today,” EPRI said.

No crisis yet — but energy demands are growing

Though data center power consumption is expected to double by 2028, according to IDC research director Sean Graham, it’s still a small percentage of overall energy consumption — just 2.5%. “So, it’s not fair to blame energy consumption on AI,” he said. “Now, I don’t mean to say AI isn’t using a lot of energy and data centers aren’t growing at a very fast rate. Data center energy consumption is growing at 20% per year. That’s significant, but it’s still only 2.5% of the global energy demand.

“It’s not like we can blame energy problems exclusively on AI,” Graham said. “It’s a problem, but AI is a convenient scapegoat for the world’s energy problems.”

Each GPU in an AI data center can consume more than 400 watts of power while training a single large language model (LLM) — the algorithmic foundation of genAI tools and platforms. That means simply training a single LLM like GPT-3 can consume up to 10 gigawatt-hours (GWh) of electricity. That’s roughly equal to the yearly electricity consumption of over 1,000 US households.

“Interestingly, training the GPT-4 model, with its staggering 1 trillion parameters, required a whopping 62.3 million kWh of electricity over a 100-day period,” BestBrokers’ report said. “This is 48 times the energy consumed by GPT-3, which, in comparison, used about 1.3 million kWh in just 34 days.”
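The arithmetic behind those comparisons is straightforward. A short sketch, assuming roughly 10,000 kWh of annual consumption per household for the equivalence cited above:

```python
# Reproducing the training-energy comparisons from the reported figures.
GPT3_TRAINING_KWH = 1_300_000     # ~1.3 million kWh over 34 days (BestBrokers)
GPT4_TRAINING_KWH = 62_300_000    # ~62.3 million kWh over 100 days
HOUSEHOLD_KWH_PER_YEAR = 10_000   # assumed average US household consumption

ratio = GPT4_TRAINING_KWH / GPT3_TRAINING_KWH
print(f"GPT-4 training used ~{ratio:.0f}x GPT-3's energy")  # ~48x

# A 10 GWh training run expressed in household-years:
ten_gwh_in_kwh = 10 * 1_000_000   # 1 GWh = 1,000,000 kWh
print(f"10 GWh ~= {ten_gwh_in_kwh / HOUSEHOLD_KWH_PER_YEAR:,.0f} household-years")  # ~1,000
```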

There are hundreds of such data centers across the world, mainly managed by big tech firms like Amazon, Microsoft and Google, according to a study by the University of Washington. And the amount of energy they use is rising quickly. In 2022, total AI data center energy consumption in the US hit 23 terawatt-hours (TWh). (A TWh represents one trillion watts of power used for one hour.)

That figure is expected to increase at a compound annual growth rate of 44.7%, reaching 146.2 TWh by 2027, according to IDC Research. By that point, AI data center energy consumption is expected to account for 18% of all data center energy consumption.
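Compounding the 2022 baseline at IDC’s projected rate reproduces the 2027 figure; a minimal sketch:

```python
# Projecting 23 TWh (2022) forward at a 44.7% compound annual growth rate.
consumption_twh = 23.0  # US AI data center consumption in 2022
CAGR = 0.447

for year in range(2023, 2028):
    consumption_twh *= 1 + CAGR
    print(year, f"{consumption_twh:.1f} TWh")
# 2027 lands at ~146 TWh, matching the 146.2 TWh figure IDC cites.
```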

There is already speculation — given how fast genAI has erupted onto the scene — that it won’t take that long before a crisis emerges. Tech entrepreneur Elon Musk said earlier this year that by 2025, there would not be enough energy to power AI’s rapid advances.

A two-tiered billing system?

Beyond the pressure from genAI growth, electricity prices are rising due to supply and demand dynamics, environmental regulations, geopolitical events, and extreme weather fueled in part by climate change, according to an IDC study published today. IDC believes the higher electricity prices of the last five years are likely to continue, making data centers considerably more expensive to operate. (The cost to build a data center ranges from $6 million to $14 million per megawatt, and the average life of each center is 15 to 20 years, according to IDC.)

Amid that backdrop, electricity suppliers and other utilities have argued that AI creators and hosts should be required to pay higher prices for electricity — as cloud providers did before them — because they’re quickly consuming greater amounts of compute cycles and, therefore, energy compared to other users.

Suppliers also argue they need to build out their energy infrastructure to handle the increased use. American Electric Power (AEP) in Ohio, for example, has proposed that AI data center owners be required to make a 10-year commitment to pay for a minimum of 90% of the energy they say they need monthly — even if they use less. AEP said it’s facing 15 GW of projected load growth from data centers by 2030 and wants the money up front to expand its power infrastructure.
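To make the proposed rate structure concrete, here is a minimal sketch of how a 90% minimum-take commitment could be computed. The function, rate, and figures are illustrative assumptions, not AEP’s actual tariff:

```python
# Hypothetical sketch of a minimum-take commitment like the one AEP proposed:
# the customer pays for at least 90% of its committed load each month, even
# when actual usage falls below that floor. The rate is illustrative only.
def monthly_bill(committed_kwh: float, actual_kwh: float,
                 rate_per_kwh: float = 0.08, floor: float = 0.90) -> float:
    """Bill the greater of actual usage or the committed floor."""
    billable_kwh = max(actual_kwh, committed_kwh * floor)
    return billable_kwh * rate_per_kwh

# A data center committing to 50 GWh/month but using only 35 GWh would
# still be billed for 45 GWh (90% of the commitment):
print(monthly_bill(committed_kwh=50_000_000, actual_kwh=35_000_000))
```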

Data center operators, not surprisingly, are pushing back. Google, Amazon, Microsoft and Meta are currently fighting the AEP proposal. The companies argued before Ohio’s Public Utilities Commission last month that special rates would be “discriminatory” and “unreasonable.”

Graham wouldn’t say whether special power rates for AI providers would be fair, but he did point to the standard practice of charging lower electricity rates to bulk industrial power consumers. “If you think about you and I as consumers — forget the market we’re in — you expect volume discounts,” he said. “So, I think the data center providers expect volume discounts.”

Electricity is, by far, the greatest cost of running a data center, accounting for anywhere from 40% to 60% of infrastructure costs, Graham said; to change that cost structure would have an “enormous impact” on corporate profits.

Even chip makers are eyeing the situation warily. Concerned about the increasing power needs, Nvidia, Intel and AMD are now all working on processors that consume less energy as a way to help address the problem. Intel, for example, will soon begin to roll out its next generation of AI accelerators, which will shift the focus away from traditional compute and memory capabilities to per-chip power consumption.

Nuclear power as an option

In the meantime, AI data center operators are turning their attention to an unexpected power source: nuclear energy. Amazon, earlier this year, spent $650 million to buy a data center from Talen Energy that runs on 100% nuclear energy from one of the largest nuclear power plants in the US.

And just last week, Microsoft announced it is working on a deal with Constellation Energy to reopen the Three Mile Island power plant in Pennsylvania — the site of the worst nuclear accident in US history. Under the deal, Microsoft would purchase 100% of the power from Three Mile Island for the next 20 years to feed its voracious AI energy needs.

In July, the US Secretary of Energy Advisory Board released a report on providing power for AI and data centers; it offered 16 recommendations for how the US Department of Energy can help support growing demand reliably and affordably. The report considers power dynamics for AI model training, operational flexibility for data center and utility operators, and promising energy generation and storage technologies to meet load growth.

In the report, the board noted that electricity providers, data center customers, and other large customers had all expressed concerns about the ability to keep up with demand, and “almost uniformly, they recommended accelerating generation and storage additions, delaying retirements, making additional investments in existing resources.”

Those updates include the “uprating and relicensing of existing nuclear and hydroelectric facilities,” and demonstrating new clean, firm, affordable, dispatchable technologies as soon as possible. “In most cases, [stakeholders] see new natural gas capacity additions — in addition to solar, wind, and batteries — as the primary option available today to maintain reliability,” the report said.

“We’re going to need all sources of power, including geothermal and hydrogen,” IDC’s Graham said. “AI’s power consumption is really growing. You can draw certain analogies to cloud. The one thing about AI that’s different is the magnitude of energy consumption per server.”
