How does, say, natural gas power formulaic emails to your coworkers, college essay cheats, or images of the pope in a puffy coat?
The answer has to do with the far-flung process triggered every time someone taps ChatGPT or Google Gemini. It likely involves one or more of hundreds of nondescript buildings scattered throughout the world and loaded with racks of servers. Those behemoth structures, along with their water-intensive cooling systems, have to run 24/7 on a steady flow of massive amounts of electricity.
And as businesses flock to build out a new generation of unprecedentedly compute-hungry AI systems—they’re called large language models for a reason—that power consumption threatens to overwhelm the country’s already-stressed electricity grid. A recent Goldman Sachs report found that the energy needed to power new data centers will grow 160% by 2030. On average, a ChatGPT query requires nearly 10x the electricity of a Google search, Goldman said.
So what can utilities and data center operators do to keep this voracious appetite from hobbling grids? Paradoxically, part of the answer might also lie in AI, albeit a different variety from the chatbots and image generators currently synonymous with the tech.
“There are a lot of opportunities, particularly around AI, to make grid operation more efficient and better,” said Heiko Claussen, co-CTO of industrial software provider Aspen Technology, which works on grid modernization efforts.
Grid upgrades
Those efforts include more accurately predicting the output of renewable sources based on weather patterns and historical performance to plan an optimal mix of supply to tap at a given time, Claussen said. AI can also be useful in improving grid maintenance: diagnosing and prioritizing alarms around problems, monitoring the condition of transformers within the grid, and dispatching repair teams, he said. There’s also potential for using AI to better manage congestion and route power to different areas, Claussen said.
Claussen said he doesn’t expect AI power demand to keep growing at its current rate forever; as chips and data centers become more efficient, some of the strain on grids could ease. He also sought to draw a line between the current wave of resource-voracious generative AI and AI technology more broadly.
“AI has an unnecessarily bad rap as being power-hungry [because of generative AI],” Claussen said. “If you’re talking in our specific space, in the industrial AI space, I would argue that AI is actually helping to save power and resources. It’s a tool for optimization.”
Chasing effortless cool
As governments around the world have begun to classify data centers as critical infrastructure, more people have started to pay attention to the inner workings of these thousands of obscure-yet-essential buildings, according to Rithika Thomas, a senior analyst at ABI Research and an architect by training who studies how data centers can operate more efficiently.
“When you think of the cloud, everybody thinks it’s an imaginary thing that nobody has to deal with, and with a lot of people talking about it, [it] becomes a real, physical aspect of a building, actually, whether that’s a cement box or whether it’s underground, it is a physical asset that people need to think about,” Thomas said.
One of the biggest power guzzlers within a data center is the cooling system, which can account for up to half of its energy consumption, according to the trade publication Data Center Knowledge. Thomas said some of the work to relieve this burden relies on reusing the excess heat created by data centers, whether to warm homes in the winter or even heat the swimming pools of the Paris Olympics Aquatics Centre.
Thomas said data center operators have also implemented software that can make cooling operations more efficient, better account for spotty output of renewable sources, and extend the life cycle of parts within the data center.
For instance, Google recently introduced a demand response system for its data centers built around the same tech as its existing carbon-intelligent computing platform. The system is designed to reroute non-urgent compute tasks to other data centers at times of high grid stress. Demand response can also help capture more renewable output by, say, shifting data center activity depending on how much the sun is shining on solar panels at a given time.
Gernot Wagner, a climate economist at Columbia Business School, said demand response programs like these are one example of how AI efficiencies in conjunction with emissions pricing and climate policy can serve to help rein in the power usage of data centers.
“There will always be emissions—as long as we’re burning fossil fuels—but we can save on some of that with energy-efficiency measures, with demand response, with other intelligent investments and ways to organize ourselves as society, and AI can certainly help with that,” Wagner said. “It’s not the solution. It’s not going to get us to zero emissions on its own, but it can certainly help make us more productive, make us more efficient, and, yes, shave off some emissions as well.”