Saturday, December 13, 2025

EARTHTALK: AI ERASING ENERGY EFFICIENCY GAINS?

AI Data Centers Erasing Energy Efficiency Gains

Enhan Yuan 

December 10, 2025

Dear EarthTalk: What’s being done to make data centers more energy efficient today, given increased demand from the growth in Artificial Intelligence (AI)? —Paul B., Reston, VA

If we don’t make data centers for artificial intelligence more efficient soon, the environmental consequences could be dire… Credit: Pexels.com.

The modern world hums with data: over 2.5 quintillion bytes are generated every day, driving a growing network of data centers that consume one to two percent of global electricity. These centers operate around the clock, using tens of thousands of gigawatt-hours of power monthly. As AI expands, the energy demands of the digital economy will push tech companies to seek cleaner and more efficient solutions.

One advance comes from the University of Waterloo, where researchers developed ultra-efficient "code kernels" to speed internet traffic while cutting power. Data centers rely on the Linux operating system to route information, and computer scientist Martin Karsten and Fastly engineer Joe Damato wrote roughly 30 lines of code that could cut energy use for that task by as much as 30 percent. Because nearly every online service request passes through such code, Karsten estimates that widespread adoption could save considerable electricity worldwide.

Companies are also rethinking how data centers are powered. Wind and solar offer clean energy, but their intermittency complicates matters. Google and Microsoft are investing in nuclear energy as a steady, carbon-free power source; despite high upfront costs and public concern, nuclear's reliability and low emissions make it an increasingly attractive option.

Hardware efficiency is also improving. Nvidia's new "superchips" can train AI models more quickly while using less electricity per task. But with over 400 billion transistors packed onto a single chip, these processors generate enormous heat, creating another environmental burden: heat is a major byproduct of data centers, and removing it takes huge amounts of water. In 2022 alone, Google and Microsoft consumed a total of 32 billion liters, largely for cooling. To reduce water dependence, some companies are testing liquid cooling systems that circulate oil-based fluids directly over hot components, conserving water and using much less energy than air cooling.

AI amplifies all of these pressures. A single chatbot query can require up to 10 times the power of a standard web search, and some estimates suggest models like GPT-3 can consume a liter of water for every 40 to 100 responses. Large AI models are especially energy-intensive; by shifting to smaller, more efficient models, companies can reduce both costs and environmental impacts.

So, what can people do? For one, use less data. Efficiency alone often boosts consumption rather than reducing it, a rebound effect known as the Jevons paradox. Cutting screen time, limiting unnecessary AI queries, and prioritizing real-world interactions can all help slow demand. Instead of endless scrolling, the most powerful response may be stepping outside, before the energy sustaining our virtual habits overwhelms the landscapes we seek to protect.


EarthTalk® is produced by Roddy Scheer & Doug Moss for the 501(c)3 nonprofit EarthTalk. See more at https://emagazine.com. To donate, visit https://earthtalk.org. Send questions to: question@earthtalk.org
