Tackling AI’s Climate Change Problem

The AI industry could soon be one of the largest contributors to carbon emissions, if current trends continue.

In an era defined by both the promise of technological innovation and the threat of climate change, artificial intelligence has emerged as both a valuable tool and a difficult challenge. As we use AI to tackle tough problems, we must also grapple with its hidden environmental costs and consider solutions that will allow us to harness its potential while mitigating its climate impact.

The success of OpenAI’s ChatGPT language model, which is backed by Microsoft, has sparked a technology arms race, with tech giants making enormous investments in building their own natural language processing systems. But the quest for more intelligent machines is quickly running into a web of sustainability challenges. AI has a fast-growing carbon footprint, stemming from its voracious appetite for energy and the carbon costs of manufacturing the hardware it uses. Since 2012, the most extensive AI training runs have been using exponentially more computing power, doubling every 3.4 months, on average.1
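The arithmetic behind that doubling rate is easy to underestimate. The short Python sketch below, included here as an illustration rather than as part of the cited analysis, shows how quickly a 3.4-month doubling time compounds:

```python
# Illustrative sketch (not from the cited OpenAI analysis): how a 3.4-month
# doubling time in training compute compounds over multiyear horizons.

DOUBLING_MONTHS = 3.4  # average doubling period reported for the largest training runs


def compute_growth(months: float, doubling_months: float = DOUBLING_MONTHS) -> float:
    """Return the multiplicative growth in training compute after `months` months."""
    return 2 ** (months / doubling_months)


for years in (1, 2, 5):
    factor = compute_growth(12 * years)
    print(f"After {years} year(s): roughly {factor:,.0f}x more compute")
```

At that pace, one year of progress means more than a tenfold increase in compute per training run, and five years implies growth by a factor of more than 100,000.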

AI’s Environmental Costs

The environmental impact of information technology is often overlooked, even though data centers and transmission networks account for 1% to 1.5% of global electricity use. They also account for 0.6% of global carbon emissions, emissions that must be cut in half to achieve a net-zero scenario by 2050, according to the International Energy Agency.2 A single average data center consumes as much energy in a year as it would take to heat 50,000 homes. Electronic waste is the fastest-growing waste stream in the world: a staggering 57 million tons are generated each year, roughly the weight of the Great Wall of China.3

Several factors contribute to the carbon footprint of AI systems throughout their life cycles:

Large and complex models: Large language models (LLMs) require tens of thousands of cutting-edge high-performance chips for training and for responding to queries, leading to high energy consumption and carbon emissions. The more complex the model, the longer its tasks take to run and the more energy it consumes.4 LLMs like ChatGPT are among the most complex and computationally expensive AI models. The capabilities of OpenAI’s GPT-3 LLM are made possible by its 175 billion-parameter model, one of the largest when it was launched. Its training alone is estimated to have used 1.3 gigawatt-hours of energy (equivalent to the yearly energy use of 120 average U.S. homes), as the back-of-the-envelope sketch below illustrates.
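To make a figure like that concrete, the minimal sketch below converts a training-energy estimate into everyday equivalents. The household-consumption and grid-intensity values are assumptions chosen for illustration, not figures from the article or its sources:

```python
# Back-of-the-envelope illustration only. The training-energy figure matches the
# GPT-3 estimate cited above; the household and grid-intensity values are
# assumptions, not figures from the article.

TRAINING_ENERGY_MWH = 1_300      # ~1.3 gigawatt-hours, the GPT-3 training estimate
HOME_MWH_PER_YEAR = 10.8         # assumed annual electricity use of an average U.S. home
GRID_KG_CO2_PER_MWH = 400.0      # assumed grid carbon intensity, kg CO2 per MWh

homes_equivalent = TRAINING_ENERGY_MWH / HOME_MWH_PER_YEAR
emissions_metric_tons = TRAINING_ENERGY_MWH * GRID_KG_CO2_PER_MWH / 1_000

print(f"Energy: about {homes_equivalent:.0f} U.S. homes' annual electricity use")
print(f"Emissions: about {emissions_metric_tons:.0f} metric tons of CO2 at the assumed intensity")
```

Under those assumptions, a single training run of this scale maps to the annual electricity use of roughly 120 homes and several hundred metric tons of CO2; the point is less the exact numbers than how quickly they grow as models scale.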

References

1. “AI and Compute,” OpenAI, May 16, 2018, https://openai.com.

2. “Data Centers and Data Transmission Networks,” International Energy Agency, accessed Oct. 16, 2023, www.iea.org; and D. Patterson, J. Gonzalez, Q. Le, et al., “Carbon Emissions and Large Neural Network Training,” arXiv, April 23, 2021, https://arxiv.org.
