From FLOPs to Footprints: The Resource Cost of Artificial Intelligence

Sophia Falk, Nicholas Kluge Corrêa, Sasha Luccioni, Lisa Biber-Freudenberger, Aimee van Wynsberghe

2025-12-08

Summary

This paper investigates the often-overlooked environmental impact of creating artificial intelligence, specifically focusing on the raw materials needed to build the specialized computer hardware that powers AI training.

What's the problem?

As AI models get more complex, they require more and more computing power, and therefore more specialized hardware such as GPUs. While we often think about the energy used to run these systems, this study points out that we also need to consider the environmental cost of *making* the hardware itself: the mining, the processing, and the eventual disposal of the materials. The problem is that we don't really know how much material AI training consumes, or whether increasing AI performance is worth the environmental cost.

What's the solution?

Researchers analyzed the materials that make up a specific GPU, the Nvidia A100, identifying all 32 elements it contains. They found it is mostly made of common heavy metals, not rare precious ones. They then combined this material breakdown with data on how much computing power the GPU provides and how long it lasts. By looking at the requirements for training a large language model like GPT-4, they estimated the total amount of materials needed, including up to 7 tons of toxic elements. They also explored how improving software efficiency (how fully the GPUs are utilized during training) and extending the lifespan of the hardware could reduce these material demands.
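The core scaling logic can be sketched as a short calculation: the number of GPUs needed is the total training compute divided by the effective compute one GPU delivers over its lifespan (peak throughput × utilization × lifetime). This is an illustrative sketch, not the paper's actual methodology: `TRAIN_FLOPS` is an assumed figure (GPT-4's training compute has not been officially disclosed), and the formula ignores real-world factors the study may account for.

```python
import math

# Illustrative sketch of the scaling relation: GPUs needed ~
# training compute / (peak throughput x MFU x lifespan).
# Both constants below are assumptions, not the paper's exact inputs.
TRAIN_FLOPS = 2.1e25   # assumed total training compute (FLOPs)
A100_PEAK = 312e12     # A100 peak dense BF16 tensor throughput (FLOP/s)

def gpus_required(mfu: float, lifespan_years: float) -> int:
    """GPUs needed to finish training within one hardware lifespan."""
    seconds = lifespan_years * 365 * 24 * 3600
    effective_flops_per_gpu = A100_PEAK * mfu * seconds
    return math.ceil(TRAIN_FLOPS / effective_flops_per_gpu)

baseline = gpus_required(mfu=0.20, lifespan_years=1)   # low utilization, short life
optimized = gpus_required(mfu=0.60, lifespan_years=3)  # both levers applied
reduction = 1 - optimized / baseline

print(baseline, optimized, f"reduction: {reduction:.0%}")
```

Under these toy numbers, applying both levers cuts GPU demand by roughly 89%, the same ballpark as the paper's reported reduction of up to 93% (the paper's scenario analysis includes factors this sketch omits).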

Why it matters?

This research is important because it shows that even small improvements in AI performance can require a surprisingly large amount of raw materials. It highlights that simply making AI faster isn't enough; we also need to make it more sustainable. The findings emphasize the need to consider resource efficiency when developing future AI technologies and to balance progress with environmental responsibility: building and using AI in ways that don't deplete resources or create excessive waste.

Abstract

As computational demands continue to rise, assessing the environmental footprint of AI requires moving beyond energy and water consumption to include the material demands of specialized hardware. This study quantifies the material footprint of AI training by linking computational workloads to physical hardware needs. The elemental composition of the Nvidia A100 SXM 40 GB graphics processing unit (GPU) was analyzed using inductively coupled plasma optical emission spectroscopy, which identified 32 elements. The results show that AI hardware consists of about 90% heavy metals and only trace amounts of precious metals. The elements copper, iron, tin, silicon, and nickel dominate the GPU composition by mass. In a multi-step methodology, we integrate these measurements with computational throughput per GPU across varying lifespans, accounting for the computational requirements of training specific AI models at different training efficiency regimes. Scenario-based analyses reveal that, depending on Model FLOPs Utilization (MFU) and hardware lifespan, training GPT-4 requires between 1,174 and 8,800 A100 GPUs, corresponding to the extraction and eventual disposal of up to 7 tons of toxic elements. Combined software and hardware optimization strategies can reduce material demands: increasing MFU from 20% to 60% lowers GPU requirements by 67%, while extending lifespan from 1 to 3 years yields comparable savings; implementing both measures together reduces GPU needs by up to 93%. Our findings highlight that incremental performance gains, such as those observed between GPT-3.5 and GPT-4, come at disproportionately high material costs. The study underscores the necessity of incorporating material resource considerations into discussions of AI scalability, emphasizing that future progress in AI must align with principles of resource efficiency and environmental responsibility.