Greener Machine Learning: Here's how AI models can shrink their carbon footprints.

Animation showing an AI's metaphorical transition to using green energy.

A new study suggests tactics for machine learning engineers to cut their carbon emissions.

What’s new: Led by David Patterson, researchers at Google and the University of California, Berkeley found that AI developers can shrink a model’s carbon footprint a thousand-fold by streamlining architecture, upgrading hardware, and using efficient data centers.

What they did: The authors examined the total energy used and carbon emitted by five NLP models: GPT-3, GShard, Meena, Switch Transformer, and T5. They reported separate figures for training and inference and found that, in general, inference consumes more energy than training.
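In simplified form, that accounting multiplies processor count, average power draw, and running hours to get energy, then scales by the data center's power usage effectiveness (PUE) and the local grid's carbon intensity. Here's a minimal Python sketch of the arithmetic; the function and all sample values are our own illustration, not the paper's code:

```python
def training_emissions_kgco2e(
    num_chips: int,
    avg_chip_power_watts: float,
    hours: float,
    pue: float = 1.1,                  # data-center overhead; assumed value
    grid_kgco2e_per_kwh: float = 0.4,  # grid carbon intensity; assumed value
) -> float:
    """Back-of-the-envelope CO2e estimate for a single training run."""
    energy_kwh = num_chips * avg_chip_power_watts * hours / 1000.0
    return energy_kwh * pue * grid_kgco2e_per_kwh

# Hypothetical run: 512 accelerators drawing 300 W each for two weeks
print(f"{training_emissions_kgco2e(512, 300.0, 14 * 24):,.0f} kg CO2e")  # ~22,700
```

Every factor in that product is a lever: fewer chip-hours (better architectures), less power per operation (better hardware), and lower PUE and carbon intensity (better data centers). That's how the individual savings below can multiply into orders of magnitude.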

  • The authors point to several model-design strategies that trim energy use. Transfer learning, for instance, eliminates the need to train new models from scratch. Shrinking networks through techniques such as pruning and distillation can increase energy efficiency by a factor of 3 to 7 (see the pruning sketch after this list).
  • Hardware makes a difference, too. Chips designed specifically for machine learning are both faster and more efficient than GPUs. For instance, a Google TPU v2 ran a transformer 4.3 times faster than an Nvidia P100 GPU while drawing 1.3 times less energy.
  • Cloud computing centers with servers optimized for machine learning are twice as efficient as traditional enterprise data centers. Data centers using renewable energy sources are greener, and centers built near their energy source bring further savings, as transmitting energy over long distances is relatively expensive and inefficient.
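To make pruning concrete, the PyTorch sketch below (our illustration, not the authors' code) zeroes out half the weights of a toy network by L1 magnitude. Realizing the energy savings the study describes also requires hardware or runtimes that exploit the resulting sparsity.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy model standing in for a larger network (hypothetical architecture).
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 50% of weights with the smallest L1 magnitude in each layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

# Report the fraction of parameters that are now zero.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.1%}")  # ~49.9% (biases are not pruned)
```

Distillation takes a different route to the same end: a small "student" network is trained to match a large "teacher" model's outputs, so the cheap model inherits much of the expensive one's accuracy.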

Behind the news: The authors joined the Allen Institute for AI and others in calling for greener AI. To this end, MLCommons, the organization behind the MLPerf benchmark, recently introduced new tools to measure a model’s energy consumption alongside traditional performance metrics.

Why it matters: Training and deploying a large model can emit five times as much carbon dioxide as a single car does over its entire lifetime. As AI becomes more widespread, energy efficiency becomes ever more important.
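That five-fold comparison traces to widely cited 2019 estimates by Strubell et al.: roughly 626,000 pounds of CO2-equivalent to train a large transformer with neural architecture search, versus about 126,000 pounds for an average American car over its lifetime, fuel included. The arithmetic:

```python
# Widely cited estimates from Strubell et al. (2019), in lbs of CO2e
car_lifetime_lbs = 126_000   # average American car, including fuel
nas_training_lbs = 626_155   # large transformer + neural architecture search

print(nas_training_lbs / car_lifetime_lbs)  # ~4.97, i.e., about five cars
```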

We’re thinking: There are bigger levers for reducing carbon emissions, such as transitioning the world away from coal power. Still, as a leading-edge industry, AI has an important role to play in building the green future.

Chart comparing U.S. vs. China AI language model performance, July 2024-July 2025, showing Elo ratings over time.

High Stakes for Nations in the Great AI Race: The U.S. leads in AI, but China is gaining momentum. Democratic countries should remove roadblocks to AI progress so they can build models that support human rights and the rule of law.

There is now a path for China to surpass the U.S. in AI. Even though the U.S. is still ahead, China has tremendous momentum with its vibrant open-weights model ecosystem and aggressive moves in semiconductor design and manufacturing.

By DeepLearning.AI