There is a growing recognition that the development of AI has an environmental dimension. AI is not just software, something immaterial: every chatbot response, image generation, or model update depends on chips mined from the earth, data centres gulping down water and electricity, and transmission grids stretching across landscapes.
This is where the paradox comes in. The AI industry increasingly frames efficiency gains as a step toward sustainability: energy-efficient chips, greener data centres, and smarter workloads. However, history tells us a different story: efficiency doesn’t necessarily reduce environmental impact, but often increases it. This is called Jevons paradox, and it’s already visible in AI. The more efficient and accessible AI becomes, the more it’s used, and the larger its environmental footprint grows.
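The rebound dynamic behind Jevons paradox can be made concrete with some back-of-the-envelope arithmetic. The numbers below are purely hypothetical, chosen only to illustrate how a doubling of efficiency can still leave total consumption higher than before:

```python
# Illustrative rebound-effect arithmetic (hypothetical numbers, not
# measurements): efficiency halves the energy per AI query, but cheaper
# queries triple total usage, so aggregate consumption still rises.

def total_energy(queries: int, energy_per_query_kwh: float) -> float:
    """Total energy consumed across all queries, in kWh."""
    return queries * energy_per_query_kwh

before = total_energy(queries=1_000_000, energy_per_query_kwh=0.004)
after = total_energy(queries=3_000_000, energy_per_query_kwh=0.002)

print(f"before: {before:.0f} kWh, after: {after:.0f} kWh")
# Per-query efficiency doubled, yet total energy use grew by 50%.
```

Whether usage actually grows faster than efficiency improves is an empirical question for each deployment, but this is the mechanism the paradox describes.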
So instead of asking whether AI can become environmentally clean, we should ask what AI’s real environmental impacts are, and how they can be governed so that AI growth doesn’t undermine our climate goals.
Mapping AI’s environmental dimensions
We need specific and contextual solutions to tackle AI’s environmental challenges. That’s why, instead of talking abstractly about “environmental impact”, we should talk more specifically about its different material dimensions. This blog post outlines some key dimensions of AI’s environmental impact. Still, it can only scratch the surface. AI operates in a broader ecosystem of emerging technologies, from the Internet of Things (IoT) and robotics to digital platforms and cloud services, each carrying its own material, energy, and waste implications.
Material extraction and production impacts of chips
Generative AI begins not with algorithms but with the ground beneath our feet. Chips are made from cobalt, lithium, rare earths, gallium, and silicon, materials pulled from mines with extraction practices that can cause severe ecological harm. The production process creates waste, and the growing demand for semiconductors raises questions about managing the end-of-life phase. Technology Review has highlighted how generative AI is contributing to rising volumes of e-waste, a problem also underscored in academic papers. The environmental burden deepens as chips are upgraded and discarded at an accelerating pace.
Energy supply and emissions of power plants
The AI surge is reshaping energy systems. In the U.S., states are now competing to attract new power plants to meet data centre demand. In Europe, grid operators warn of looming electricity crunches. Each new gigawatt of demand ties AI to the politics of energy, whether it’s fossil fuel plants locking in carbon emissions or renewable energy sources that still require vast mineral inputs to build. AI accelerates debates about who gets priority access to limited clean power.
Infrastructure and resource constraints of transmission grids
Even if all new energy came from renewables, the infrastructure needed to deliver it presents its own challenges. Transmission grids require copper, aluminium, steel, and land, all in short supply. Rising component prices and supply chain pressures have hindered development, delaying decarbonisation and adding to environmental strain. While grids do not directly emit pollutants, they embody environmental costs through their material demands and land use.
Energy and water consumption of data centres
At the centre of AI sit data centres: facilities equipped with the computing power needed to train and deploy AI models. They are energy-intensive, water-intensive and deeply embedded in global digital infrastructure. The Environmental and Energy Study Institute has documented how rising data centre energy needs are upending power grids and adding to climate risks. At the same time, data centres consume significant amounts of water for cooling. Metrics such as Water Usage Effectiveness (WUE), conceptualised by The Green Grid, have been developed to measure and improve efficiency. But they often hide rebound effects: as efficiency improves, more workloads are run, cancelling out gains.
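As a rough sketch, The Green Grid defines WUE as a facility’s annual water usage divided by the energy delivered to IT equipment, yielding litres per kilowatt-hour. The facility figures below are hypothetical, used only to show how the ratio works:

```python
# Minimal sketch of the Water Usage Effectiveness (WUE) metric:
# annual site water usage (litres) / IT equipment energy (kWh).
# The facility figures are hypothetical examples, not real data.

def wue(site_water_litres: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness in litres per kWh of IT energy.

    Lower is better; a WUE of 0 would mean no water consumed on site.
    """
    return site_water_litres / it_energy_kwh

# Hypothetical facility: 500 million litres of water per year,
# 300 GWh (3e8 kWh) of annual IT load.
print(f"WUE = {wue(500e6, 300e6):.2f} L/kWh")
```

Note that a falling WUE says nothing about absolute consumption: if workloads grow faster than the ratio improves, total water use still rises, which is exactly the rebound effect described above.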
The environmental accounting of data centres is contested. Google’s recent release of environmental impact data on its Gemini model was welcomed as a step forward. Yet critics quickly pointed out gaps, from uncounted training runs to reliance on renewable energy certificates that may not reflect real decarbonisation.
Moving towards environmentally successful AI
AI is inseparable from the physical world, from cobalt mines to chip foundries, from power plants to sprawling data centres. Every layer of its infrastructure extracts, consumes, and emits. Together, these impacts form a web that no amount of efficiency tweaking can untangle, as the rebound effects described by Jevons paradox remind us.
Reconciling AI development with environmental goals will require systemic approaches. The EU has already advanced strategies such as the European Critical Raw Materials Act, the Waste from Electrical and Electronic Equipment Directive, and its 2025 Water Resilience Strategy. Yet these frameworks must contend with AI’s accelerating trajectory, which is part of a wider wave of digitalisation alongside robotics, IoT, and other emerging technologies.
If AI is to be environmentally successful, it cannot be left to market forces or corporate PR. It requires political choices:
- Clear accounting of environmental costs across the whole value chain.
- Binding rules that prevent rebound effects from cancelling out efficiency gains.
- A willingness to set boundaries: not every use case justifies the resource burn.
The paradox is simple: efficiency alone won’t save us. AI will only contribute to environmental wellbeing if it’s developed within limits, guided by justice, and aligned with broader visions of sustainability. Otherwise, so-called “green AI” risks becoming just another accelerant of environmental breakdown.