The Hidden Cost of Artificial Intelligence: AI’s Growing Environmental Footprint
Few revolutions are bloodless. The AI upheaval may yet kill our planet.
19.7.2025
As new frontiers of social progress are floated alluringly on the horizon, the Artificial Intelligence boom increasingly evokes the imagery of a sci-fi novel. Industry leaders and governments alike are championing AI as the next great driver of human advancement, but this is already coming at a significant cost to our planet.
While the potential of AI may be exciting, running the data centres — huge, factory-like complexes full of servers and networking equipment — required to develop and support AI applications is incredibly resource intensive. In 2024, the electricity consumption of data centres worldwide was 415 terawatt-hours (TWh), equal to that of Italy and the Netherlands combined. Around 20% of this usage was attributable to centres supporting AI-driven applications.
It’s estimated that energy consumption by data centres will increase by 165% by the end of 2030, and that by 2034 data centres could be consuming over 1,500 TWh of electricity per year (equivalent to the expected annual consumption of India, to keep the comparison in terms of nation states). AI is forecast to drive around 70% of this demand.
The probable result is greater consumption of fossil fuels. In May, OpenAI CEO Sam Altman told a US Senate hearing that natural gas will likely power much of the short-term increase in electricity production. Elsewhere, Elon Musk’s xAI is embroiled in a dispute with Memphis residents over its use of high-polluting mobile gas turbines to power its Colossus supercomputer, the largest of its kind in the world. The turbines, which burn methane, emit high levels of formaldehyde, nitrogen oxides, and other smog-forming chemicals, and xAI has already faced legal challenges for installing and operating them without valid permits.
Concerns about the conduct of prominent AI executives are compounded by the actions of the government that ostensibly regulates them. Over half of the world’s data centres are currently located in the USA, but, rather than attempt to temper the resultant surge in demand for domestic energy, Donald Trump has sought to meet it by burning more coal. An April Executive Order to reinvigorate America’s “beautiful, clean coal industry” prolonged the operational lifespan of multiple coal plants that had been slated for closure, in part to service the growing energy needs of AI data centres.
In many developed economies, the impact of AI on wider trends in energy consumption is already visible. By the start of 2021, net electricity generation in the USA and the European Union had largely plateaued after decades of growth. Now demand is spiking again, primarily due to data centre expansion. In the US, this amounts to an annual increase in demand of around 2%, with some studies projecting a 50% rise in total US energy demand by 2050.
While governments scramble to bolster the capacities of their electrical grids, communities are left to battle tech behemoths for whatever resources can be guaranteed. The M4 corridor, for example, contains some of the most in-demand prospective housing land in the UK, yet in 2022 the Greater London Authority warned that acute pressure on the grid caused by nearby data centres could block new housing developments there for years.
Elsewhere, the issue is water. Many data centres cool their servers through a method called evaporative cooling — a process that requires freshwater and loses about 80% of the water it draws (in contrast, residential water usage loses around 10%). Applications like ChatGPT may have streamlined so many aspects of our day-to-day lives that even a simple Google search now feels cumbersome, but few users realise that a single ‘conversation’ with the large language model consumes around half a litre of water.
On average, a 100-megawatt data centre consumes roughly as much water as 6,500 households, while global water consumption by data centres is believed to be around 560 billion litres per year. Again, this figure is projected to rise steeply — to around 1,200 billion litres per year by 2030.
The need for freshwater also puts data centres in direct competition with residential users for access to limited public water resources. A recent Bloomberg investigation found that two-thirds of new data centres are built in areas already facing water scarcity, adding that a number of US data centres are “fully or partially powered by water-hungry power plants.”
The proliferation of bigger, more resource-hungry data centres worldwide shows little sign of abating. In the US, Senate hearings have been dominated by industry leaders like Altman and Microsoft President Brad Smith, with the need to beat China in an AI arms race presented as justification for resisting meaningful oversight of AI companies. In Britain, the Labour government has enthusiastically welcomed the data centres of foreign multinationals, despite minimal evidence of longer-term economic benefits, while the EU has stated its desire to turn Europe into an “AI continent”. Nations like India and China have also set out their stalls to become global leaders in AI development.
Progress needn’t come at such a high environmental cost. Prior to the public release of ChatGPT in 2022, there was a trend in the AI research space towards smaller, more efficient models that could deliver AI applications with relatively small datasets and considerably less processing power. Even now, as unwieldy, exorbitantly expensive AI models that require huge amounts of data are de rigueur, researchers outside the tech juggernauts are successfully making smaller models more effective.
Of course, companies like OpenAI and Google are incentivised to present an expensive ‘scale at all costs’ approach, and its concomitant hunger for resources, as the only viable path, because doing so obstructs potential rivals. If data-guzzling large language models are perceived to be the only means of creating AI products, it precludes the emergence of any competitor without the backing of massive venture capital or state-led investment.
But maybe we — and, more pertinently, legislators across the world — need to ask ourselves: should a handful of private tech giants shape the discourse on a technology that could profoundly impact so many aspects of our society, and perhaps even the trajectory of our species?
Without far greater regulation of how we develop and sustain AI, it seems inevitable that they will.