Can sustainable AI practices overcome the problem of AI’s power consumption?

The digital shift promised a greener economy, but AI’s energy demands are straining grids. Sustainable AI practices, infrastructure investments, and optimized models are essential to ensure a net-zero future without compromising technological growth.

A woman with a laptop stands in a large room full of hundreds of data servers.

Mark de Wolf

July 8, 2024

  • The rise of generative AI and the massive amounts of electricity needed to power it threaten to knock the renewables transition off course.

  • Energy grids in the United States, United Kingdom, and elsewhere are caught in a pinch, trying to prepare for a net-zero future while addressing frequent spikes in data-center power demand.

  • Upgrades to distribution networks are in the works, but they could take decades. Generative AI firms and hyperscalers need to step up—a tech-driven problem will need a tech-driven solution.

The shift to digital promised a cleaner, greener economy, but today the embodiment of that transformation is straining energy grids and pushing data centers to their limits. Without an intervention, the artificial intelligence (AI) and renewable revolutions seem headed for a clash. Generative AI (GenAI) caught the world off guard with its incredible power—and with it, incredible power consumption. A World Economic Forum study says the energy required to run AI tasks is already rising at an annual growth rate above 26%; at that pace, demand doubles roughly every three years. By 2028, GenAI could be consuming more electricity than Iceland.

Can wind, solar, and hydro really provide enough power to keep up? And if they can’t, will utilities be forced to keep fossil fuels on the front burner? Experts say an EcoAI is possible, but the way machine-learning models are trained and AI applications are coded will need to change. The energy industry also needs big investments in infrastructure to deliver clean power where and when it’s needed.

Why does GenAI use so much energy?

A bitcoin farm with rows of hard drives
Bitcoin and crypto mining were big early drivers of demand for computing power, as in this bitcoin mining farm.

Large language models (LLMs) are the engine under GenAI’s figurative hood, and they have an unquenchable thirst for fuel. A peer-reviewed study from late 2023 forecasts that apps like ChatGPT could consume between 85 and 134 terawatt-hours (TWh) annually by 2027. That’s roughly equivalent to what the Netherlands or Sweden uses in a year.
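
For a sense of scale, here’s a rough back-of-envelope calculation in Python. The per-query energy and global query volume are illustrative assumptions, not figures from the study:

```python
# Back-of-envelope estimate of annual GenAI electricity use.
# Both inputs are illustrative assumptions, not figures from the study.
WH_PER_QUERY = 3.0        # assumed watt-hours per GenAI query
QUERIES_PER_DAY = 100e9   # assumed global queries per day

wh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365
twh_per_year = wh_per_year / 1e12  # 1 TWh = 10^12 Wh

print(f"~{twh_per_year:.0f} TWh per year")  # ~110 TWh with these inputs
```

With these inputs the result lands near the study’s range, which shows how quickly per-query watt-hours compound at global scale.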

Shahab Mousavi, a Stanford University AI researcher who specializes in sustainability, says three recent shifts have made cloud computing more energy intensive. “What happened in a nutshell is that Nvidia and its competitors steadily improved their graphics processing units [GPUs] that are customized for high-performance computing and parallel processing, laying the groundwork for the data centers with incredibly high processing power that enable today’s AI systems,” he says.

“In some sense, they were lucky to ride multiple waves of demand for GPUs,” Mousavi continues. “First, they were being used for gaming and the graphics demand; then, Bitcoin came along, and crypto mining prompted a big boost. Today, alongside the prior two sources of demand, you have AI applications, particularly generative AI products, generating a new wave of demand. They execute a ton of relatively simple computations in order to train their models, which requires a lot of data processing, and the parallelizable nature of these computational tasks makes them perfect for GPUs to speed up the process relative to more conventional CPUs.”
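
Mousavi’s point about parallelism is easy to see in practice. This minimal PyTorch sketch (illustrative, not from the article) times the same large matrix multiplication, the kind of simple, massively parallel operation that dominates model training, on a CPU and, if available, a GPU:

```python
import time

import torch

def time_matmul(device: str, n: int = 4096) -> float:
    """Time one n x n matrix multiply on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # finish setup before timing
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the asynchronous GPU kernel
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")  # typically far faster per operation
```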

Hyperscale data centers feel the pain first. With racks and cabinets spread over millions of square feet, they’re becoming ever more energy-intensive as AI servers process a hailstorm of prompts. The GPUs run hot, drawing ever-larger amounts of electricity to serve inferences and stay cool, frequently testing the capacity of local grids.

In a March 2024 speech to energy industry executives, John Pettigrew, CEO of the UK’s National Grid, said data-center power demand will mushroom by 500% over the next decade. An out-of-date distribution network is constraining the system, he warned, adding that “future growth in foundational technologies like artificial intelligence and quantum computing will require even larger-scale, energy-intensive computing infrastructure.” With more than 10,500 data centers operating globally and hundreds more under construction, that issue isn’t going away on its own.

The clean-energy conundrum

A solar panel field in the California desert borders a wind farm.
California has made deep investments in solar and wind power.

So why can’t AI transition to using more electricity from wind, solar, and hydro? In the UK this past year, enough clean energy was generated to power every household. The trouble arises in delivering those green electrons where and when demand needs them. In states like California and Texas, where renewable generation capacity has grown rapidly, Mousavi says: “The grid isn’t optimized to deliver renewable energy to where it needs to go. When there isn’t sufficient transmission capacity for renewable generation, particularly at peak production times like midday for solar, they have no choice but to resort to curtailment—simply disconnecting renewable generation capacity from the grid.”

He cites the extensive investments in solar plants in California’s Central Valley as a prime example. “The grid was not designed to transmit the majority of the demanded electricity in the San Francisco Bay Area from locations in the Central Valley where we’ve seen solar generation capacity spike, like the outskirts of Davis and Merced, and it has yet to be upgraded to do so,” he says. “As a result, while there’s high consumption demand in the Bay Area, there isn’t the proper grid infrastructure to deliver all the demanded renewable energy there. So what happens is they resort to unplugging installed and readily available renewable sources like solar during peak demand to maintain the stability of the electric grid and have to use traditional plants, typically called ‘peaker plants,’ in the local area of consumption at times of high demand instead.”

Infrastructure investments will help, but power system buildouts take years or even decades to unfold. What can be done about surging AI electricity demand today? One option is to ration computing capacity, as Amazon Web Services has reportedly been forced to do in Ireland, or to incentivize data centers to relocate and expand where renewables are abundant, easing the grid’s burden of bringing demand and renewable supply together. Another is to keep burning fossil fuels. For all their problems, oil and gas plants are plentiful, reliable, and evenly distributed near major population centers.

The more these sources are used, however, the less urgency there is to bring new renewable-energy projects online. The United States has a long list of clean-energy projects awaiting regulatory sign-off. But as AI extends its footprint from businesses to households and personal devices, any “additionality” from new renewable generation could be delayed.

To overcome the lack of access to clean energy, data centers have embraced workarounds such as power purchase agreements (PPAs) and renewable energy certificates (RECs) to offset or write off their carbon footprint. “PPAs and RECs are two of the drivers of increased generation, and while they’ve been a boon for the renewable industry, they certainly aren’t a panacea,” Mousavi says. “They don’t do much to improve transmission or enable functional accountability mechanisms via proper allocation of emissions liability, or e-liabilities, to the end consumers of the electricity, and trading emissions attributes through mechanisms like RECs makes it virtually impossible for anyone to know their actual emissions impact.

“As discussed in the What’s Scope 2 Good For? paper, you will never know the precise emissions impact without scope 3 category 3 [S3C3], and market-based scope 2 [MBS2] emissions doesn’t allow for that,” he continues. “We have to look at improving the balance between production and demand of renewables. We’ve rapidly increased clean generation over the years but fallen short of proportionally improving the infrastructure to transmit that increased generation capacity to where the demand is.”

Changing the model

A man sits at a computer displaying code and flow charts.
Small, focused learning models are less computationally intensive and consume less power than larger, more generalized models.

While policymakers work out plans to improve electricity distribution, generative AI companies can do a few things to reduce the carbon footprint of LLMs.

Reshmi Ghosh, a machine-learning scientist at Microsoft, says that LLMs themselves might give way to smaller models designed to address specific use cases. They would be less computationally intensive and need less power as a result. “There are areas where these big, generalized models are not doing well,” she says, pointing to the biases and flawed outputs that GenAI models frequently produce. “There’s a domain of research I’m involved in where we look at how we can avoid investing in massive data sets that are so big when we only have specialized tasks to perform.”

She points to Microsoft’s work with small language models (SLMs), such as its open Phi-3 family. At 3–5 billion parameters, these are far smaller than LLMs like the one behind ChatGPT, which is reported to use 175 billion parameters or more. SLMs require less training (the repeated optimization passes used to refine and perfect a model), which leads to more sustainable AI-development practices. It’s an area Ghosh says Microsoft is keen to explore further.
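
The difference in scale is easy to quantify. Here is a rough calculation of the memory needed just to hold model weights at 16-bit precision; the parameter counts are illustrative:

```python
# Rough memory footprint of model weights at 16-bit precision (2 bytes/param).
# Parameter counts are illustrative: a ~3.8B-parameter SLM vs. a 175B-parameter LLM.
for name, params in [("SLM (3.8B)", 3.8e9), ("LLM (175B)", 175e9)]:
    print(f"{name}: ~{params * 2 / 1e9:.0f} GB of weights")
# SLM (3.8B): ~8 GB   -> fits on a single GPU
# LLM (175B): ~350 GB -> needs a multi-GPU server just to hold the weights
```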

“What we’re discovering is that you don’t need to keep relearning everything in a data model, whatever size it is, because only parts of it contain the information needed for a certain task or domain,” she says. “We can make AI more efficient by being parameter efficient.”
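
Here is a minimal PyTorch sketch of the parameter-efficient idea Ghosh describes: freeze the pretrained weights and train only a small task-specific subset. The toy backbone and task below are hypothetical stand-ins, not Microsoft’s actual setup:

```python
import torch
import torch.nn as nn

# Toy stand-in for a large pretrained model.
backbone = nn.Sequential(
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
)

# Freeze every pretrained weight: no gradients, no optimizer state,
# and far less compute and memory during fine-tuning.
for param in backbone.parameters():
    param.requires_grad = False

# Train only a small task-specific head on top of the frozen features.
head = nn.Linear(512, 4)  # e.g., a hypothetical 4-class domain task
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)

trainable = sum(p.numel() for p in head.parameters())
total = trainable + sum(p.numel() for p in backbone.parameters())
print(f"training {trainable:,} of {total:,} parameters "
      f"({100 * trainable / total:.2f}%)")

# One illustrative training step on dummy data.
x, y = torch.randn(8, 512), torch.randint(0, 4, (8,))
with torch.no_grad():  # frozen features need no gradient graph
    features = backbone(x)
loss = nn.functional.cross_entropy(head(features), y)
loss.backward()
optimizer.step()
```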

Achieving an EcoAI

Images of charts and the letters “AI” are superimposed over a photograph of a man at a computer workstation.
Machine learning can be used to make coding more energy-efficient.

Even before ChatGPT, energy consumption was rising steadily due to the ongoing electrification of everything. The current demand spikes call for a balance of enthusiasms—supporting the shift to net zero without holding back the potential of AI. Ghosh and Mousavi say new investments and approaches will be vital to making energy infrastructure and GenAI applications more efficient, resilient, and sustainable. Greener practices in the development phase could also reduce their carbon footprint.

“Energy-efficient coding, optimized code control, plus continuous monitoring and optimizing AI models throughout the lifecycle should all be standard practice,” says Pranjali Ajay Parse, a data scientist at Autodesk. “We can also move to optimized training schedules, where we run training jobs during off-peak hours to take advantage of lower energy costs.”
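
As a sketch of the optimized training schedule Parse describes, this minimal Python example holds a job until an assumed off-peak window opens. The window times and launch_training_job are hypothetical; a production setup would more likely rely on a cluster scheduler or the utility’s actual tariff data:

```python
import datetime
import time

# Assumed off-peak window; real windows depend on the local utility's tariff.
OFF_PEAK_START = datetime.time(23, 0)  # 11 p.m.
OFF_PEAK_END = datetime.time(7, 0)     # 7 a.m.

def in_off_peak(now: datetime.time) -> bool:
    """True if `now` falls inside the overnight off-peak window."""
    return now >= OFF_PEAK_START or now < OFF_PEAK_END

def launch_training_job() -> None:
    print("starting training run")  # placeholder for the real entry point

# Poll every five minutes, then launch once the window opens.
while not in_off_peak(datetime.datetime.now().time()):
    time.sleep(300)
launch_training_job()
```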

Another approach is distributed training, “where workloads are spread across multiple machines,” she continues. “This can also improve efficiency and reduce bottlenecks.”
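
Mainstream frameworks already support this. A minimal sketch using PyTorch’s DistributedDataParallel, with a toy model on dummy data and assuming the script is launched with torchrun, might look like this:

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def main() -> None:
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each worker.
    dist.init_process_group(backend="gloo")  # use "nccl" on GPU clusters
    rank = dist.get_rank()

    model = DDP(nn.Linear(32, 2))  # toy model wrapped for distributed training
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for step in range(10):
        x, y = torch.randn(16, 32), torch.randn(16, 2)
        loss = nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()  # gradients are averaged across all workers
        optimizer.step()
        if rank == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
# Launch across workers with, e.g.: torchrun --nproc_per_node=4 train.py
```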

Machine learning itself can be applied to the problem. As far back as 2016, according to Parse, Google was using DeepMind to analyze and improve the energy efficiency of its data centers, racking up 40% reductions in energy usage for cooling. “Microsoft’s Project Natick, where data centers are deployed underwater to save on cooling, is another promising approach,” she says.

GenAI confronts society with big questions—some ethical, some legal, some economic, and some philosophical. Experts say it’s now time to factor sustainability into the mix. Debates will continue about the positive and negative impacts of LLMs, but they won’t mean much without a healthy planet to stage them on.

About Mark de Wolf

Mark de Wolf is a freelance journalist and award-winning copywriter specializing in technology stories. Born in Toronto. Made in London. Based in Zürich. Reach him at markdewolf.com.
