The hidden cost of AI: Unpacking its energy and water footprint


Summary

This blog post, co-authored by Arti Garg (Chair, IEEE P7100 Working Group), Irene Kitsara (European Standardization Initiatives Director, IEEE), and Sarah Bérubé (Policy Analyst, AI Policy, OECD), is cross-published with the OECD. It summarises key takeaways from The Hidden Cost of AI: Unpacking Its Energy and Water Footprint, an event co-organised by the OECD and IEEE. Insights from the discussion will contribute to the IEEE P7100 Working Group’s efforts to develop a technical standard for measuring AI’s environmental impact.

A recording of the event is available on the OECD.ai YouTube channel.

On 12 February 2025, the OECD and IEEE co-organised an event on the margins of the French AI Action Summit, bringing together a diverse group of experts to discuss AI’s growing environmental challenges. The event comprised three sessions on critical sustainability concerns: the environmental cost of inference, the impact of data centres on the electricity grid, and AI’s water footprint.

Session 1: The environmental cost of inference

While AI has revolutionised countless industries, its environmental impact is a pressing issue. The energy-intensive training of AI models receives a lot of attention, but it is the inference stage, when trained models are put into use, whose significant long-term environmental cost is becoming clearer. Discussions focused on how inference, which at scale could account for more energy consumption over time than training, presents a growing challenge for sustainability.

Growing energy consumption from AI inference

Panellists recognised AI inference’s substantial energy requirements, especially as models scale and grow more complex. One speaker pointed out that, for some models, the cumulative cost of inference now outweighs the training cost after about 50 million uses, and that current incentives do not encourage companies to optimise AI’s energy use. As a result, companies prioritise financial gain over sustainability, underscoring the need for stronger regulatory frameworks to balance technological progress with environmental responsibility. The moderator stressed that optimising the inference process, for instance by reducing unnecessary computations at the model level, could minimise energy consumption.
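The break-even logic behind the "50 million uses" figure can be sketched with a few lines of arithmetic. All numbers below are illustrative assumptions, not measured values from the event; the point is only that a one-off training cost divided by a per-query cost yields the number of uses after which inference dominates.

```python
# Hypothetical illustration of the training-vs-inference break-even point.
# The energy figures are assumptions for the sketch, not reported data.

def break_even_queries(training_energy_kwh: float, energy_per_query_kwh: float) -> int:
    """Number of inference queries after which cumulative inference
    energy exceeds the one-off training energy."""
    return int(training_energy_kwh / energy_per_query_kwh) + 1

# Example: a model assumed to take 1 GWh to train, at 0.02 kWh per query.
queries = break_even_queries(1_000_000, 0.02)
print(queries)  # 50_000_001, i.e. inference dominates after ~50 million uses
```

Under these assumed figures, every query beyond roughly 50 million adds energy cost on top of what training already consumed, which is why lifetime usage, not training alone, drives the footprint.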

The role of developers in optimizing AI inference to reduce the carbon footprint

Developers play a crucial role in minimising the environmental impact of AI. One speaker advocated critically assessing software development choices and prioritising smaller, more efficient AI models. One panellist noted that this ties into the growing demand for “green skills” in AI-related professions: skills such as energy management and environmental, social, and governance (ESG) policy are becoming increasingly valuable in reducing AI’s carbon footprint.

Evaluating energy efficiency claims

One speaker also noted that while some models, like DeepSeek, are more energy efficient during training, uncertainty remains about their efficiency during inference, especially when they spend longer reasoning over queries. Research on the environmental impact of these recent developments is still lacking, and transparent evaluations are needed to realistically gauge AI’s environmental cost.

Session 2: The impact of data centres on the electricity grid

As AI expands, data centres, the backbone of AI systems, are placing increasing pressure on the world’s electricity grids. These massive facilities consume vast amounts of energy, which raises concerns about sustainability, grid stability and reliability.

Data centres as major energy consumers

The moderator highlighted the difficulty of obtaining reliable estimates of data centres’ energy consumption, and of determining how much of that consumption can be attributed specifically to AI, which makes it challenging to fully understand the magnitude of the problem. Speakers nonetheless expressed the view that while AI contributes to growing energy demand, it is not the largest driver of global electricity consumption: the electrification of the transport sector, for example, outpaces AI in electricity use. One panellist mentioned the importance of real-time cloud energy tracking to reduce uncertainty in emissions calculations.

Renewable energy and carbon offsetting efforts

Some companies, like Google, have committed to running their data centres on clean electricity by 2030. However, one speaker warned that even with these efforts, renewable energy supplies may not be sufficient to meet demand by 2025. Given AI’s 24/7 energy needs, there is a risk that it will continue to rely on fossil fuels unless comprehensive policy changes are made. One participant noted that the role nuclear energy, including small modular reactors (SMRs), could play in AI sustainability is often overlooked. Another shared initiatives that reuse heat generated by data centres, such as AWS’s projects in Ireland. However, scaling these efforts responsibly remains a significant challenge.

Modulating energy demand

Other aspects were also mentioned, including the location of data centres and the use of mobile devices for inference, which could help offload energy demand, for instance by running inference while phones charge overnight.

Session 3: AI water usage

AI-driven data centres do not just consume massive amounts of energy; they also require vast quantities of water, both for cooling and in other parts of the AI lifecycle, including the production of AI-specific hardware and water-intensive electricity generation. In regions where water is scarce, this strains local water supplies and raises serious concerns. The moderator highlighted that water consumption has not received the same level of attention as energy consumption: calculations are complex, there are no commonly accepted metrics, and water use is not included in regulatory compliance requirements, which may explain why it remains under-measured.

AI’s hidden water footprint

One speaker explained that AI models, particularly large-scale ones, require significant amounts of water to cool servers. Data centres often use fresh water for cooling, which leads to substantial water loss through evaporation. This not only affects the environment but also poses risks to local communities that depend on this water for drinking and agriculture. To further exacerbate the issue, AI also has an indirect water footprint: semiconductor manufacturing consumes water, for example, as does generating the electricity that powers AI operations.

Measuring AI’s water footprint

A key challenge in addressing AI’s water usage is the lack of standardised tools for measuring water consumption. One speaker explained that data centres’ water use can be divided into embodied use and operational use. Embodied use, that is, the water used in manufacturing hardware, represents about 30% of AI-related water consumption; operational use, the water used to cool data centres during AI model processing, accounts for the remaining 70%. However, one speaker pointed out that many companies use energy consumption as a proxy for water use, and that these methods can have an error margin of up to 500%, often leaving companies unaware of the true scale of their water consumption.
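A minimal sketch can show why energy-based proxies are so uncertain. Operational water use is often estimated as energy multiplied by water usage effectiveness (WUE, litres per kWh), a standard data-centre metric; but WUE varies widely with site and climate. The WUE values and energy figure below are illustrative assumptions, not figures from the event.

```python
# Hypothetical sketch of the uncertainty in energy-based water proxies.
# Water is estimated as energy * WUE (water usage effectiveness, L/kWh);
# the WUE range and workload energy below are assumed for illustration.

def water_estimate_litres(energy_kwh: float, wue_l_per_kwh: float) -> float:
    """Operational water estimate from an energy figure and an assumed WUE."""
    return energy_kwh * wue_l_per_kwh

energy = 10_000  # kWh consumed by a workload (assumed)
low = water_estimate_litres(energy, 0.3)   # efficient, cool-climate site
high = water_estimate_litres(energy, 1.8)  # evaporative cooling in a hot climate
print(low, high)  # 3000.0 18000.0: a 6x spread from the same energy figure
```

The same energy figure yields estimates that differ by a factor of six depending on the assumed WUE, which is consistent with the large error margins speakers described when water is inferred from energy rather than measured directly.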

Geographical and ethical considerations

One speaker noted that current global water allocation is already unsustainable: the world is not on track to meet the Sustainable Development Goal 6 water targets, and AI is an exacerbating factor. AI water consumption is also a social justice issue: many data centres are located in water-scarce regions, which can worsen local water scarcity and create competition for resources. All speakers emphasised that water is a local issue.

The Pricing Paradox: Water as an undervalued resource

One participant raised an important question: should water be priced differently to incentivise conservation? Speakers argued that water is often treated as a “free” resource, a mindset that fails to account for the risks of water scarcity; there is thus a gap between the cost of water and the value attributed to it. Adjusting water pricing to reflect its true cost could incentivise companies and consumers to use it more efficiently.

Transparency and accountability

To address the issue of AI’s water usage, one speaker suggested that companies should disclose their water consumption as part of their sustainability efforts. A representative from industry noted that new data centres use non-potable water sources, such as canal or seawater, to reduce their reliance on freshwater sources, while others have on-premises wastewater treatment to reduce the use of freshwater in data centres.

The path forward: Collaboration is key

Throughout the event, there was a recurring call for collaboration across all sectors. The OECD emphasised the need for multi-stakeholder cooperation, involving (local) governments, industries, academia, citizens including indigenous populations, and standard-setting bodies like IEEE, to ensure that AI development aligns with sustainability goals. This sentiment was echoed by IEEE, which stressed the importance of standardised metrics to measure and manage AI’s environmental impact comprehensively and invited participants to join the IEEE P7100 standardisation effort.

As the discussion ended, participants recognised both the progress made and what remains to be done. The complexity of AI’s environmental, energy, and water challenges requires ongoing research, policy innovation, and technological advances. The good news is that as AI continues to evolve, so does our understanding of its environmental footprint. By working together, we can create a future where AI is both innovative and sustainable.

The environmental impact of AI is an urgent issue that requires collective effort. As AI continues to shape the future, we must prioritise sustainability at every stage of its lifecycle, from model development to deployment. By aligning our efforts and sharing knowledge, we can mitigate the environmental costs of AI and build a greener, more sustainable future for all.

Photo: OECD
