The Thirsty Algorithm & Green AI
Nov 21, 2025
If all we talk about is the promise of AI and its potential to surpass human knowledge, we risk missing its environmental costs. Training GPT-3 required roughly 700,000 liters of water for cooling alone, and a Greenpeace study estimates that data centers will consume 664 billion liters of water annually by 2030, up from 239 billion liters in 2024.
The International Energy Agency (IEA) projects that the sector's global electricity consumption will more than double between 2024 and 2030, reaching 945 terawatt-hours (TWh) by the end of the decade, roughly the current electricity demand of Japan. This is pushing Green AI to the forefront of the technology development agenda: data scientists now need to be as focused on sustainability as they are on the accuracy of their models.
Data Centers Are Guzzling Energy
Right now, data centers powering AI models are guzzling electricity at an alarming rate, with AI accounting for 5-15% of their total power use globally in recent years, a share that's projected to jump to 35-50% by 2030. In the US alone, data centers used about 200 terawatt-hours of electricity in 2024 (enough to power a country like Thailand), and AI-specific servers may have consumed up to 76 terawatt-hours of that, rivaling the annual energy needs of over 7 million American homes.
This surge isn't slowing: Goldman Sachs forecasts a 165% rise in global data center power demand by 2030 compared to 2023, driven largely by AI's need for high-powered chips that draw two to four times more power than traditional ones. And it's not just electricity. That 700,000-liter figure for GPT-3 covers direct cooling alone; indirect water use from power generation pushes the total even higher, and by 2028, AI data centers could drive water consumption up 11-fold to 1,068 billion liters annually worldwide.
Making AI Greener
The kicker? This boom is straining grids and environments, especially in water-scarce parts of the Global South, where data centers exacerbate droughts without creating proportional local benefits. But here's the good news: data scientists are stepping up with sustainable algorithms that slash energy use right at the source, making AI greener without sacrificing smarts. Techniques like model pruning trim away unnecessary parameters, cutting energy by 30-50% while keeping accuracy intact; think of it as decluttering your code for efficiency. Quantization dials down numerical precision (typically from 32-bit floats to 8-bit integers), shrinking models by 75-80% along with their power draw, which is perfect for running AI on edge devices instead of thirsty data centers. Knowledge distillation trains small "student" models to mimic big "teacher" ones, delivering up to 90% energy savings at runtime, and sparse models focus only on key computations, potentially reducing compute needs by 5-10 times over dense networks. The sketches below show what pruning, quantization, and distillation can look like in practice.
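As a concrete illustration, here's a minimal PyTorch sketch of the first two techniques: L1 magnitude pruning via torch.nn.utils.prune, followed by post-training dynamic quantization to 8-bit integers. The toy model and the 30% pruning amount are illustrative assumptions, not figures from any particular study.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy classifier standing in for a real model (illustrative only).
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Pruning: zero out the 30% of weights with the smallest magnitude
# in each Linear layer, then make the sparsity permanent.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the mask into the tensor

# Quantization: convert Linear layers to int8 for inference,
# shrinking the model and its per-inference power draw.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 784)
print(quantized(x).shape)  # torch.Size([1, 10])
```

Dynamic quantization like this needs no retraining, which is why it's a popular first step before heavier techniques.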
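And a sketch of the distillation idea: the student learns to match the teacher's softened output distribution alongside the true labels. The temperature and loss weighting below are common defaults, assumed here purely for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (match the teacher) with the usual
    hard-label cross-entropy. Hyperparameters are illustrative defaults."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The T^2 factor keeps soft-target gradients on the same scale
    # as the hard-label term.
    kd = F.kl_div(soft_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

Once training is done, only the small student model is served, which is where the runtime energy savings come from.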
Avoiding Energy Waste
Then there's transfer learning, which reuses pre-trained models to cut retraining energy by 20%, and early stopping, which halts training once results plateau, avoiding the wasteful extra GPU cycles that can make up 70% of the power bill (see the sketch below). Tools like the University of Michigan's Zeus framework dynamically tune GPU power limits and batch sizes, achieving 75% energy cuts for deep learning tasks with barely any added training time. Even ensembles of just two or three efficient models, like decision trees or Naive Bayes, can save 27-37% more energy than bloated neural nets. Adaptive learning scales model complexity to match energy availability, and edge computing shifts work to low-power devices, easing the load on central servers. These aren't pie-in-the-sky ideas; they're proven ways to optimize FLOPs and model complexity, directly lowering the water footprint from cooling and power plants.
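Early stopping is simple enough to show in full. Here's a minimal, self-contained PyTorch sketch: training halts once validation loss stops improving for a set number of epochs. The patience of 5, the 1e-4 improvement threshold, and the toy regression data are all assumed for illustration.

```python
import torch
import torch.nn as nn

# Toy regression setup (illustrative only).
torch.manual_seed(0)
X_train, y_train = torch.randn(512, 10), torch.randn(512, 1)
X_val, y_val = torch.randn(128, 10), torch.randn(128, 1)

model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

best_loss, patience, bad_epochs = float("inf"), 5, 0

for epoch in range(1000):
    # One training pass.
    model.train()
    optimizer.zero_grad()
    loss_fn(model(X_train), y_train).backward()
    optimizer.step()

    # Validation check.
    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val).item()

    if val_loss < best_loss - 1e-4:   # meaningful improvement
        best_loss, bad_epochs = val_loss, 0
        best_state = {k: v.clone() for k, v in model.state_dict().items()}
    else:
        bad_epochs += 1

    if bad_epochs >= patience:        # plateau: stop burning GPU cycles
        print(f"Stopped at epoch {epoch}; best val loss {best_loss:.4f}")
        break

model.load_state_dict(best_state)    # restore the best checkpoint
```

Every epoch skipped after the plateau is compute, and therefore energy and cooling water, that was never spent.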
Carbon Impact as a KPI
So, what can data scientists do today to push this forward? Start by baking energy audits into your metrics: treat carbon impact like another KPI alongside accuracy, using tools like CodeCarbon to track your code's footprint (see the sketch below). Advocate for Green AI within teams, collaborating with hardware folks on neuromorphic chips that mimic the brain's low-energy vibe. Real-world examples are already showing the way: Google's DeepMind slashed data center cooling energy by 40% through AI optimization, while BrainBox AI's systems cut HVAC emissions in buildings by up to 40% and energy costs by 25%. Pendulum's AI refines supply chains for agriculture, reducing excess inventory by 92% and cutting waste overall. KoBold Metals uses AI to pinpoint battery metals more sustainably, aiding the EV shift, and Amazon is piloting AI-designed carbon capture materials to offset its own data center emissions. MIT researchers are even flexing AI workloads to run when renewables are plentiful, minimizing fossil fuel reliance. By championing these practices, data scientists aren't just fixing AI's mess; they're steering it toward a future where innovation actually sustains us all.
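Turning carbon impact into a KPI takes only a few lines. Here's a minimal sketch using the codecarbon package's EmissionsTracker (installed via pip install codecarbon); the project name is illustrative, and the dummy loop stands in for your actual training job.

```python
from codecarbon import EmissionsTracker

# Wrap any workload; a placeholder compute loop stands in for training here.
tracker = EmissionsTracker(project_name="green-ai-demo")  # name is illustrative
tracker.start()
try:
    total = sum(i * i for i in range(10_000_000))  # placeholder workload
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent emitted

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

Logging this number next to accuracy in every experiment report is the simplest way to make energy a first-class metric on a team.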