Artificial intelligence is rapidly reshaping global electricity demand, with recent estimates suggesting that AI systems alone could soon consume power on a scale comparable to that of the United Kingdom.
Yet as AI-driven workloads become a central growth engine for data centers, their environmental footprint remains poorly quantified. Most public assessments focus narrowly on the energy cost of individual AI models, while the broader carbon and water implications of operating AI infrastructure at scale are obscured by limited disclosure from data center operators.
According to the International Energy Agency, AI systems accounted for roughly 15 percent of global data center electricity demand in 2024, with other research placing the figure closer to 20 percent by year-end. With manufacturing capacity for AI hardware expanding, total AI-related power demand could reach around 23 gigawatts by the end of 2025, nearly half of total data center power demand. This trajectory has significant environmental implications, given that global data centers were associated with an estimated 182 million tons of CO₂ emissions in 2024 and consumed approximately 560 billion liters of water in 2023; neither figure distinguishes between AI and non-AI workloads.
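As a rough sanity check on figures like these, converting a continuous power draw into annual energy consumption is straightforward. A minimal sketch, treating the 23 gigawatt estimate as a constant draw over a full year (a simplifying assumption; actual consumption would be lower if capacity ramps up during the year):

```python
HOURS_PER_YEAR = 8760  # 365 days x 24 hours

def gw_to_twh(gigawatts: float, hours: float = HOURS_PER_YEAR) -> float:
    """Convert a constant power draw in GW to annual energy in TWh."""
    return gigawatts * hours / 1000  # GWh -> TWh

# 23 GW sustained for a year is roughly 201 TWh
print(f"{gw_to_twh(23):.0f} TWh")  # -> 201 TWh
```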
The core challenge lies in data availability. Data center operators and large technology firms rarely separate AI workloads from other computing activities in their environmental disclosures. As a result, researchers are forced to approximate AI impacts using aggregated performance metrics such as total electricity consumption, power usage effectiveness, and water usage effectiveness. While these metrics offer a starting point, they mask wide variation across locations, workloads, and power grids.
IEA data imply that the average carbon intensity of electricity consumed by data centers in 2024 was about 396 grams of CO₂ per kilowatt-hour, slightly below the global average because data centers are concentrated in the relatively cleaner grids of the United States and Europe. Applying this intensity to estimated AI power demand suggests that AI systems alone could emit between 32.6 and 79.7 million tons of CO₂ in 2025. These figures carry substantial uncertainty, however: regional grid carbon intensities vary widely and are not disclosed at the level needed to link specific AI deployments to actual emissions.
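The emissions arithmetic can be reproduced as a back-of-envelope calculation. A minimal sketch, assuming the 396 gCO₂/kWh intensity applies uniformly and treating the 23 gigawatt end-of-2025 figure as a constant full-year draw, which recovers the upper end of the range; the lower bound corresponds to a smaller average draw over the year:

```python
G_PER_KWH = 396        # implied average carbon intensity, gCO2/kWh, from IEA data
HOURS_PER_YEAR = 8760

def annual_emissions_mt(gigawatts: float, g_per_kwh: float = G_PER_KWH) -> float:
    """Estimate annual CO2 emissions in million tons from a constant power draw."""
    kwh = gigawatts * 1e6 * HOURS_PER_YEAR   # GW -> kW, times hours in a year
    grams = kwh * g_per_kwh
    return grams / 1e12                      # grams -> million metric tons

print(f"{annual_emissions_mt(23):.1f} Mt CO2")  # -> 79.8 Mt, near the upper bound
```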
Company sustainability reports offer partial insight but remain inconsistent. Firms such as Google, Microsoft, Meta, and Apple acknowledge that AI is driving rapid growth in electricity consumption, yet none report AI-specific environmental metrics. Meta provides the most granular disclosure, separating data center electricity use and emissions by location, while others report only company-wide figures or omit key data such as electricity consumption or indirect water use. Even among best-in-class reporters, environmental performance varies significantly depending on grid mix and cooling strategies.
Water use presents an even more opaque challenge. Direct water consumption for cooling is often disclosed, but indirect water use embedded in electricity generation is rarely reported. While the IEA estimated indirect water consumption of about 1 liter per kilowatt-hour for data centers, analysis of disclosed data from US-based data centers operated by Meta, Apple, and Google suggests indirect water intensities closer to 3 to 5 liters per kilowatt-hour. If these higher intensities are more representative, total indirect water consumption from global data centers could be significantly underestimated.
Applying these revised water intensity estimates to projected AI electricity demand implies that AI systems alone could account for between 312 and 765 billion liters of water consumption in 2025, approaching the scale of global bottled water consumption. The uncertainty is amplified by regional differences, as water intensity of electricity generation can range from less than one liter to more than ten liters per kilowatt-hour depending on technology and location.
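The water estimate follows the same pattern. A sketch assuming an illustrative indirect intensity of 3.8 liters per kilowatt-hour (a value within the 3-to-5 range above, chosen here only for illustration) applied to the same full-year 23 gigawatt draw, which lands near the upper end of the range:

```python
HOURS_PER_YEAR = 8760

def annual_water_bl(gigawatts: float, liters_per_kwh: float) -> float:
    """Estimate annual indirect water use in billions of liters.

    One TWh at 1 L/kWh is exactly one billion liters, so
    billions of liters = TWh x (L/kWh).
    """
    twh = gigawatts * HOURS_PER_YEAR / 1000
    return twh * liters_per_kwh

# 23 GW for a full year at an illustrative 3.8 L/kWh
print(f"{annual_water_bl(23, 3.8):.0f} billion liters")  # -> 766 billion liters
```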
The growing mismatch between AI’s rising environmental footprint and the limited transparency of data center operators has implications for policymakers and regulators. Without clearer disclosure requirements, it remains difficult to assess whether efficiency gains from improved hardware and cooling are being outpaced by sheer growth in demand. As AI becomes a structural driver of electricity and water use, the absence of standardized, workload-specific reporting risks undermining both climate policy and resource planning.
Mandating more granular disclosure of electricity consumption, emissions, and water use at the data center and workload level would not eliminate uncertainty, but it would significantly narrow it. As AI systems scale from experimental tools to core infrastructure, the environmental consequences of operating them can no longer be treated as a black box.