AI vs. The Environment

April 21, 2021

We tend to think of computers as small devices we can carry around in a briefcase or a back pocket. Because the machines in front of us are small and cheap to power, we forget the impact computing has on energy consumption and the environment. Supercomputers and computationally expensive algorithms consume vast amounts of energy and resources: training a single AI model can emit as much carbon as five cars do over their entire lifetimes.

Common carbon footprint benchmarks (in pounds of carbon dioxide):

  • A roundtrip flight between NYC and SF → 1,984
  • The average American in 1 year → 36,156
  • The average lifetime of a car in the US → 126,000
  • Training a neural network transformer (213M parameters) with neural architecture search → 626,155
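
As a quick sanity check, the “five cars” comparison falls straight out of the figures above. A few lines of Python (the variable names are purely illustrative) confirm the ratios:

    # Rough sanity check of the comparisons above.
    # All values are in pounds of CO2, taken from the benchmark list in this post.
    transformer_training_lbs = 626_155  # training the 213M-parameter transformer (with neural architecture search)
    car_lifetime_lbs = 126_000          # average lifetime emissions of a car in the US
    nyc_sf_roundtrip_lbs = 1_984        # one roundtrip flight between NYC and SF

    print(f"{transformer_training_lbs / car_lifetime_lbs:.1f} car lifetimes")          # ~5.0
    print(f"{transformer_training_lbs / nyc_sf_roundtrip_lbs:.0f} roundtrip flights")  # ~316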

Researchers often overlook the environmental impact of training AI models. Siva Reddy, a postdoc at Stanford working on NLP models, says, “What many of us did not comprehend is the scale of [our carbon footprint] until we saw these comparisons.”

Reddy continues: “Human brains can do amazing things with little power consumption. The bigger question is how we can build such machines.”