The AI of the storm

    In just a few years since OpenAI launched ChatGPT, a user-friendly, natural-language interface to its flagship model, “artificial intelligence” (AI) has moved from the realms of science fiction into apparent daily reality.

    AI appeared capable of feats unimaginable a few years earlier: writing whole new works of natural-language text; composing music; generating instant artworks. But behind this leap in computing power lies a troubling, and still poorly recognised, environmental impact. A new briefing from Opportunity Green summarises the case.

    Data centres, the hardware foundations of today’s digital economy, are hugely resource-intensive. That digital economy is built on the collection, storage and transmission of data on an unimaginable scale.

    Industrial

    Even before the artificial intelligence explosion that followed OpenAI’s launch of ChatGPT in late 2022, the volumes involved were mind-blowing: global internet traffic grew from 156Gb transferred every second in 2002 to 150,000Gb a second in 2022.
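
    That is a nearly thousandfold increase in two decades. As a back-of-the-envelope sketch, using only the two figures quoted above, the implied compound growth rate comes out at around 41 per cent a year:

        # Back-of-the-envelope: compound annual growth rate (CAGR) of global
        # internet traffic, using only the two figures quoted in the text.
        start, end, years = 156, 150_000, 20  # Gb per second, 2002 and 2022

        cagr = (end / start) ** (1 / years) - 1
        print(f"Implied growth: {cagr:.0%} per year")  # ~41% per year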

    That immense collection of ones and zeros, the digital core of the data economy, in turn provided the raw materials to train today’s machine learning (ML) models. 

    The key to their operations is scale – of vast troves of data; banks and banks of dedicated silicon chips in specialised data centres; and fast telecoms networks spread across the world. But the scale which makes AI’s semi-magical abilities possible is also what turns it into a growing environmental problem. 

    Whatever the bold claims made twenty years ago, at the dawn of the internet age, about the “weightless economy” to come or how we would all be “living on thin air”, data in reality has a hard physical presence. 

    It is stored and processed on banks of dedicated silicon chips, produced in extraordinary volumes by some of the most exacting and technically sophisticated industrial processes we have access to. 

    Explosive

    The chips run on electricity – in trivial amounts for a single integrated circuit, like the one in your phone or laptop, but in the vast numbers needed for large-scale data processing, those trivial amounts become extraordinary.

    The latest AI-dedicated server racks contain 72 specialised chips from the manufacturer Nvidia. The largest “hyperscale” data centres, used for AI tasks, can hold around 5,000 of these racks.
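
    Multiplying those two figures gives a sense of the scale involved – a rough sketch only, since real facilities vary in size and configuration:

        # Rough scale of a single hyperscale AI data centre, multiplying the
        # two figures quoted in the text.
        chips_per_rack = 72  # specialised Nvidia chips in one AI server rack
        racks = 5_000        # racks in the largest "hyperscale" facilities

        print(f"{chips_per_rack * racks:,} chips on one site")  # 360,000 chips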

    And as anyone using a laptop for any period of time knows, even a single chip warms up in operation. To cool the servers requires water – gallons of it. Put all this together, and a single hyperscale data centre will typically need as much water as a town of 30,000 people – and the equivalent amount of electricity. 

    The Financial Times reports that Microsoft is currently opening one of these behemoths somewhere in the world every three days.

    Even so, for years, the explosive growth of the digital economy had surprisingly little impact on global energy demand and carbon emissions. Efficiency gains in data centres – the backbone of the internet – kept electricity consumption in check.

    Consumption

    But the rise of generative AI, turbocharged by the launch of ChatGPT in late 2022, has shattered that equilibrium, sending the demand for data and processing power into the stratosphere.

    The latest version of OpenAI’s flagship GPT model, GPT-4, is built on 1.3 trillion parameters, with each parameter describing the strength of a connection between nodes in its artificial neural network – the model’s software “brain”.

    The more novel data that can be pushed into the model for training, the better – so much data that one research paper estimated machine learning models will have used up all the data on the internet by 2028.

    Today, the insatiable demand for computing power is reshaping national energy systems. Figures from the International Monetary Fund show that data centres worldwide already consume as much electricity as an entire country such as France or Germany. The IMF forecasts that by 2030, worldwide energy demand from data centres will match India’s total electricity consumption.

    In principle, that accelerating demand for electricity could be met by renewables, and many of the Big Tech companies, such as Google, have targets to match 100 per cent of their consumption with renewable generation.

    Mines

    But the reality is far dirtier. Continuing to expand at this breakneck pace requires the rapid installation of new supply. And data centre electricity demand is quite different from that of homes, offices and factories, which typically have peaks and troughs throughout the day as people arrive home and turn on the TV, or leave for the day and switch the lights off, or as machinery is powered up and down.

    Data centre demand is constant, running 24 hours a day with as little downtime as possible. That, in turn, requires the kind of continual, steady electricity supply that renewables alone are not yet always able to provide.

    The result is that data centre demand is increasingly being met by fossil fuels. In the US, 60 per cent of new power demand is being supplied by natural gas, while in Germany and Poland, coal plants are being kept online to feed the AI boom. 

    Microsoft is expanding data centre operations near Hambach, one of Germany’s last remaining opencast lignite mines. The company has refused to comment on the centre’s power source.

    Meanwhile, the climate-sceptical Trump administration in the US freely admits that expanding coal generation is critical to keeping the country’s edge in data centre operations.

    Power-plants

    The result is that an industry which currently accounts for around 0.5 per cent of global greenhouse gas emissions, on the International Energy Agency’s (IEA’s) conservative estimates, is set to become one of the fastest-growing contributors to climate change over the next decade.
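
    To put that share into absolute terms – a sketch that assumes total global emissions of roughly 55 billion tonnes of CO2-equivalent a year, a commonly cited ballpark that is not taken from the briefing itself:

        # Absolute emissions implied by the IEA's ~0.5 per cent share.
        # ASSUMPTION: global GHG emissions of ~55 GtCO2e a year (ballpark only).
        global_ghg_gt = 55.0  # billion tonnes CO2-equivalent per year
        share = 0.005         # the IEA's conservative 0.5 per cent estimate

        print(f"~{global_ghg_gt * share * 1_000:.0f} MtCO2e per year")  # ~275 Mt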

    Even as other, older industries scale back their emissions, and real progress is made in switching to renewable generation, data centre energy demand poses a significant risk to national decarbonisation strategies.

    There is, as yet, little awareness of this among national governments. Data centres do not even feature in the most recent carbon budget from the UK’s Committee on Climate Change – despite the UK government’s enthusiasm for their rapid expansion.

    Work by Beyond Fossil Fuels suggests that data centre growth in Europe on current trends would add the equivalent of Germany’s entire fossil fuel energy system to Europe’s greenhouse gas emissions by 2030. Switching this demand to renewables would, in turn, squeeze supplies to other industries and threaten decarbonisation plans in other sectors.

    The energy problem is that data centres, rocket-boosted by AI, represent a new source of rapidly growing energy demand that power plants, grid systems and even decarbonisation policies were not designed to cope with.

    Profits

    Already, the strains are starting to show: the world’s first “byte blackout” was narrowly avoided in Virginia in summer 2024, when a lightning strike knocked out a transformer near a cluster of 20 data centres. 

    Sensing a sudden drop in electricity from the main grid, the data centres switched to their on-site back-up generators, causing a surge of supply back onto the main grid that threatened blackouts across the state and sent grid operators scrambling for emergency procedures.

    As we’ve seen with the dramatic, country-wide power outage across Spain and Portugal, national grid systems, many built decades ago, are already creaking. 

    The new and very specific power demands from data centre growth will push them closer to the edge. Fossil fuels and, in some cases, nuclear power can offer a supposedly quick solution to a hard problem.

    Big Tech business models are locked into expansion – the source of profits for the data economy, over decades, has been to expand the amount of computing power available, and then wait for demand to fill it. 

    Strains

    For AI operators, dramatic growth in the scale of their operations means the possibility of creating new markets for new products, faster than their competitors. 

    But this creates a familiar problem, one economists call the rebound effect, or Jevons paradox: even if computing hardware efficiency improves rapidly – and ML models are, in general, becoming more efficient – greater efficiency tends only to mean more demand for the models’ output.

    In the same way that more efficient aeroplanes have created more demand for flying, so, too, have more efficient computers created more demand for compute – and, behind the humming servers, that means more energy demand and more greenhouse gas emissions. 
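
    A toy calculation shows how that rebound can swallow an efficiency gain – the numbers are purely illustrative, not measurements:

        # Toy rebound-effect (Jevons paradox) illustration: efficiency halves
        # the energy used per query, but cheaper output stimulates even more
        # demand, so total energy use still rises. ILLUSTRATIVE NUMBERS ONLY.
        energy_per_query = 1.0  # arbitrary units, before efficiency gains
        queries = 100

        energy_per_query /= 2   # hardware and model efficiency doubles
        queries *= 4            # demand grows faster than efficiency improves

        total = energy_per_query * queries
        print(f"Total energy: {total:.0f} units")  # 200 units, up from 100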

    Nor are the benefits of those emissions and all that processing immediately clear. It’s obvious who the winners from Big Tech are: the vast corporations, and their owners, chiefly based in the US, that exercise near-monopoly powers over the digital economy and that have generated such incredible returns. 

    But data centres themselves create few jobs locally, while their resource demands – not only for electricity but, across the world, for water – impose severe strains on local inhabitants. 

    Decarbonisation

    Big Tech’s unequal ecological exchange is still under-researched, but its operation creates a semi-colonial dynamic: the masters of the system at the core, engaged in extractive operations across the rest of the world, squeezing its scarce resources.

    Opportunity Green’s paper is just the start of our research. We think there is a pressing need for a better-informed public discussion about the role of compute, and its environmental impacts – with a proper accounting of the potential benefits set against a clearer understanding of the downsides. 

    The challenge raised by breakneck data centre expansion to energy systems and decarbonisation plans, in particular, requires urgent attention in three broad areas:

    • Make national and supra-national assessments of the likely paths of energy use and consequent emissions from data centre expansion, and test these scenarios against national and supra-national frameworks for emissions control. Current estimates of data centre emissions vary widely, and could usefully be refined.
    • Identify the legal, regulatory and policy changes needed to move data centre operations as close as possible to 100 per cent renewables, taking a whole-lifecycle approach from construction through operation and including support for battery storage. This could include consideration of the contribution that the users of data centres (Big Tech) make to the tax system, and the allocation of funds to grid expansion and decarbonisation.
    • Develop policy to minimise data collection, processing and storage, alongside a robust hierarchy of data centre applications, building on recommendations in the National Engineering Policy Centre’s report on minimising the footprint of data centre operations. The question of the demand for, and purpose of, data centre expansion is inseparable from the issue of its resource and emissions impact, and should be confronted directly by policy.

    This Author

    Dr James Meadway is the senior director of economics at Opportunity Green. He is also the presenter of the Macrodose podcast.
