Assessing the Energy Impact and Carbon Footprint of AI Model Training: A Case Study Using GPU Servers
Keywords: Carbon footprint, CO2eq emissions, power consumption, generative AI, large language models.
Abstract
The rapid evolution of Large Language Models (LLMs) and generative Artificial Intelligence (AI) technologies has significantly increased power consumption, raising concerns about their environmental impact. This study proposes a methodology to estimate local Carbon Dioxide Equivalent (CO2eq) emissions by recording GPU power consumption and utilizing carbon intensity data from Electricity Maps. Additionally, a case study involving a server equipped with two NVIDIA H100 GPUs is presented. The analysis reveals that users tend to train AI models during periods when energy is predominantly sourced from thermal power plants, potentially increasing the carbon footprint as AI adoption becomes more widespread.
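The methodology described above boils down to multiplying measured GPU energy by the grid's carbon intensity at the time of training. The sketch below illustrates that arithmetic only; the function names, the 600 W power figure, and the 350 gCO2eq/kWh intensity are illustrative assumptions, not values from the paper or the Electricity Maps API.

```python
# Minimal sketch of local CO2eq estimation from sampled GPU power.
# All numbers and names are illustrative assumptions, not the paper's data.

def energy_kwh(power_samples_w, interval_s):
    """Integrate GPU power samples (watts, taken every interval_s seconds) into kWh."""
    joules = sum(p * interval_s for p in power_samples_w)
    return joules / 3.6e6  # 1 kWh = 3.6e6 J

def co2eq_grams(energy_kwh_val, carbon_intensity_g_per_kwh):
    """CO2eq (g) = energy (kWh) * carbon intensity (gCO2eq/kWh)."""
    return energy_kwh_val * carbon_intensity_g_per_kwh

# Example: one hour of 1-second samples, two GPUs drawing ~600 W combined,
# with an assumed grid carbon intensity of 350 gCO2eq/kWh.
samples = [600.0] * 3600
energy = energy_kwh(samples, 1.0)        # 0.6 kWh
emissions = co2eq_grams(energy, 350.0)   # 210.0 g CO2eq
```

Because carbon intensity varies hour by hour with the generation mix, the same training run emits more when it coincides with thermal-plant-heavy periods, which is the effect the study observes.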
More information
| Publication date: | 2024 |
| Event dates: | 27–29 November 2024 |
| Language: | English |
| URL: | https://clagtee.fi.mdp.edu.ar/full-papers-search-engine/papers/ID071.pdf |