As AI grows, so does its energy footprint. But could AI also be the key to a more sustainable energy future?
The age of artificial intelligence is no longer on the horizon — it’s here, embedded in our phones, our homes, our businesses, and even in the very systems that power our lives. But there’s a catch: the smarter our machines get, the more electricity they demand. As data centres expand and machine learning models become more complex, the energy appetite of AI systems is raising serious sustainability concerns.
Yet amid this high-voltage boom, a paradox is emerging: AI might also be one of our best tools for managing energy more efficiently. Here are five key figures to understand just how much energy AI consumes — and how it might help us use energy smarter.
1. Training a Single Large AI Model Can Emit Over 300 Tons of CO₂
In 2019, researchers from the University of Massachusetts Amherst estimated that training a single large NLP model could emit as much carbon dioxide as five cars over their entire lifetimes — roughly 284 metric tonnes (about 626,000 pounds, or just over 300 US tons) of CO₂ in the most extreme case they studied. This number is expected to grow as newer models become larger and more data-intensive.
Why does this matter? Because each training cycle involves thousands of GPU hours, often powered by fossil fuel-heavy grids.
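To see how GPU hours translate into emissions, here is a back-of-envelope sketch. Every figure in it (GPU count, per-GPU power draw, data-centre overhead, training time, grid carbon intensity) is an illustrative assumption, not a measured value from any particular model:

```python
# Back-of-envelope estimate of training emissions.
# All input figures below are illustrative assumptions.
gpu_count = 512            # number of accelerators (assumption)
gpu_power_kw = 0.4         # average draw per GPU in kW (assumption)
pue = 1.5                  # data-centre overhead factor, PUE (assumption)
hours = 720                # training wall-clock time: ~30 days (assumption)
grid_kg_co2_per_kwh = 0.4  # carbon intensity of a fossil-heavy grid (assumption)

energy_kwh = gpu_count * gpu_power_kw * pue * hours
emissions_tonnes = energy_kwh * grid_kg_co2_per_kwh / 1000
print(f"{energy_kwh:,.0f} kWh, ~{emissions_tonnes:.0f} t CO2")
# → 221,184 kWh, ~88 t CO2
```

Even this modest hypothetical run lands in the tens of tonnes; scaling up the cluster size or training time by an order of magnitude puts the hundreds-of-tonnes figures above within easy reach.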
2. Data Centres Use About 1-1.5% of Global Electricity — and Climbing
According to the International Energy Agency (IEA), data centres globally consume around 1 to 1.5% of the world’s electricity. As AI adoption accelerates, some estimates suggest this could double by 2030, especially with the rise of generative AI models like GPT, Claude, and Gemini.
These centres must power not only their processors but also significant cooling infrastructure, which adds to their overall consumption.
3. ChatGPT Uses Roughly 500,000 kWh Daily
With millions of users prompting it each day, ChatGPT’s energy usage is estimated at around half a million kilowatt-hours daily — enough to power roughly 17,000 U.S. homes for a day. This figure includes server operations, maintenance, and cooling.
As more services integrate LLMs into everyday apps, this number will only increase unless more energy-efficient models are adopted.
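The "17,000 homes" comparison is easy to sanity-check: divide the daily estimate by the average daily electricity use of a U.S. household, which is roughly 29 kWh per day according to EIA figures:

```python
# Sanity-check the "17,000 U.S. homes" comparison.
chatgpt_kwh_per_day = 500_000  # the article's daily estimate
home_kwh_per_day = 29          # approx. average U.S. household use per day (EIA)

homes_powered = chatgpt_kwh_per_day / home_kwh_per_day
print(f"~{homes_powered:,.0f} homes")
# → ~17,241 homes
```

The result lines up with the figure quoted above, though both inputs are estimates and the true number shifts with usage patterns and hardware efficiency.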
4. AI Optimisation Could Save Up to 15% of Global Energy Waste
Here’s the hopeful twist: AI can dramatically optimise energy use. From smart grids and predictive maintenance in power plants to AI-driven HVAC systems in buildings, machine learning is already helping reduce energy waste by up to 15% globally, according to McKinsey.
AI can balance renewable energy inputs like solar and wind, forecast usage spikes, and even predict outages — making grids smarter and greener.
5. NVIDIA Claims New AI Chips Are 25x More Energy Efficient
Hardware innovation plays a major role in mitigating AI’s energy impact. Companies like NVIDIA and Google are designing AI-specific chips (TPUs, GPUs) that consume significantly less power while delivering more performance.
NVIDIA’s Grace Hopper Superchip, for instance, is claimed to be 25 times more energy efficient than traditional processors when running certain large models. This could redefine the carbon footprint of AI training in the next decade.
So, Is AI an Energy Villain or a Green Hero in Disguise?
The truth is, AI is both. It is undeniably one of the most energy-hungry technologies we’ve ever created — but it’s also poised to become one of the most effective tools in our fight against energy waste and inefficiency.
The solution lies in responsible deployment, clean energy integration, efficient hardware, and using AI itself to manage the very systems it’s stressing. If we play it smart, the same algorithms that now gulp down megawatts might soon help us save them.