“Carbon emissions from 20 AI systems were greater than those from 137 countries" - the true environmental cost of AI will blow your mind

Here's how much energy your next ChatGPT query will use.


Across the UK, but particularly in the south-east of England, giant and featureless buildings are springing up. Despite their size, they’re designed to be unobtrusive, and you won’t spot any signage telling you what they are.

Some 477 already stand, with another 100 to come in the next five years. In Blyth, near Newcastle, another 10 such structures covering 54 hectares will be constructed on the site of the former Blyth Power Station at a cost of £10 billion, with work due to start in 2031.

Though they look like warehouses, these aren’t for storing goods. Instead, they are “computers in sheds”, said Countryfile presenter Tom Heap in a recent edition of BBC Radio 4’s Rare Earth: “Millions of computers, just like the ones we use but formed as servers, in racks in huge barns.”

Owned by mega corporations such as Amazon, Google and Microsoft, these data centres process internet traffic ranging from simple web searches to streaming services, cloud storage, online banking systems and cryptocurrencies. 

The accelerating proliferation of data centres across the world is, though, primarily driven by the boom in the use of so-called artificial intelligence (AI). Modelling processes such as human language and behaviour is complex and demands an enormous amount of computing power. Hence the construction of huge barns full of servers devouring copious electricity.

Researchers at the Massachusetts Institute of Technology estimate that a single enquiry to ChatGPT (one of the most commonly used AI programs, created by OpenAI) consumes about five times more electricity than a simple web search. It’s also been estimated that training a large language model such as OpenAI’s GPT-3 uses electricity equivalent to the power consumed annually by 130 US homes.
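As a rough, back-of-the-envelope illustration of how those figures fit together (the numbers below are commonly cited ballpark assumptions, not data from the MIT study itself): if a simple web search draws around 0.3 watt-hours, five times that is about 1.5 watt-hours per ChatGPT query, and an estimated 1,300 megawatt-hours to train GPT-3 works out at roughly 125 times the annual electricity use of an average US household.

```python
# Back-of-the-envelope sketch using commonly cited ballpark figures.
# All four constants below are assumptions for illustration, not values
# taken from the MIT estimate or from OpenAI.

WEB_SEARCH_WH = 0.3            # assumed energy of a simple web search, in watt-hours
CHATGPT_MULTIPLIER = 5         # MIT estimate: a ChatGPT query uses ~5x a web search
GPT3_TRAINING_MWH = 1_300      # assumed total energy to train GPT-3, in megawatt-hours
US_HOME_KWH_PER_YEAR = 10_500  # assumed annual electricity use of an average US home

chatgpt_query_wh = WEB_SEARCH_WH * CHATGPT_MULTIPLIER
homes_equivalent = (GPT3_TRAINING_MWH * 1_000) / US_HOME_KWH_PER_YEAR

print(f"One ChatGPT query: ~{chatgpt_query_wh:.1f} Wh")
print(f"Training GPT-3: ~{homes_equivalent:.0f} US homes' annual electricity")
```

Those rough sums land in the same region as the estimates quoted above, which is all a back-of-the-envelope calculation can be expected to do.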

“Computers are in essence just incredibly efficient electric heaters,” explains Heap. “For every ChatGPT query you make, it takes one pint of water to cool the system.”

Why is this? Well, GPT-3.5, the model originally behind ChatGPT, has roughly 175 billion parameters – internal variables and values that the AI uses to make predictions or generate outputs. The more parameters a model has, the more complex and powerful it is – but the more computational power is required to train it. It’s estimated that GPT-4 has up to 1.8 trillion parameters, so training it produces emissions perhaps 12 times higher than training its predecessor.
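To get a feel for that scaling, here is a minimal sketch. It assumes, crudely, that training energy (and therefore emissions) grows in proportion to parameter count; in reality the total also depends on how much data the model is trained on, the efficiency of the hardware and the carbon intensity of the electricity used, and GPT-4’s parameter count has never been officially confirmed.

```python
# Rough illustration of how parameter count drives training cost.
# GPT-4's parameter count is an unconfirmed estimate, and the proportional
# scaling assumed here ignores training data volume, hardware efficiency
# and the carbon intensity of the electricity.

GPT35_PARAMS = 175e9   # ~175 billion parameters (GPT-3/GPT-3.5 family)
GPT4_PARAMS = 1.8e12   # ~1.8 trillion parameters (estimate)

scale_up = GPT4_PARAMS / GPT35_PARAMS
print(f"Parameter count grows ~{scale_up:.0f}x")

# If training energy rose in simple proportion to parameters, emissions would
# rise roughly tenfold too – the same order of magnitude as the 'perhaps
# 12 times' figure quoted above.
```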

Extrapolated to a global level, “Data centres are consuming anywhere up to 2–4% of the world’s electricity supply,” says Heap. As AI becomes more widely used by governments, companies and individuals, and its applications become more complex, that figure is rising at an incredible rate.

The lack of a standardised method for measurement makes it difficult to quantify the exact impact of AI’s energy demands, but researchers at Zhejiang University in China found that carbon emissions from 20 AI systems in 2022 were greater than those from 137 individual countries.

Could AI help environmental and societal problems?

Energy consumption isn’t the only concern: data centres also require vast areas of land that could otherwise be used for housing. Perhaps even more alarming, because so much of that electricity still comes from fossil fuels, energy-greedy AI is a major contributor to climate change. However, some believe that AI could also help solve environmental and societal problems.

Take organic farming, which aims to produce food as sustainably as possible. There’s vast potential for AI to help farmers model the impacts of weather and pests on crops, and to devise ways to improve soil health without the use of chemical fertilisers or pesticides.

The Soil Association’s Adrian Steele urges caution over the use of AI, though. “It is boosting the production and consumption of fossil fuels at a gigantic scale,” he says. “The question for us as farmers is how best to utilise this new technology: will it be a route to increasing the scale of monocultural production with fewer people, or will it facilitate the dream of empowering small-scale growers to regenerate local supply chains?”

The big data companies are aware of the environmental problem. Google, for example, uses AI algorithms to optimise data-centre cooling systems and reduce energy consumption, lowering carbon emissions. Microsoft has trialled underwater data centres, including a major project off Orkney between 2018 and 2020, which found that submerging the servers made it easier to cool the processors and keep the system running in a more stable environment. And, of course, electricity is increasingly generated from renewable sources.

“AI has the potential to significantly benefit the environment by enabling more efficient resource management, advancing renewable energy technologies, and improving climate change modelling,” one Zhejiang University researcher said. “But it is a double-edged sword.”

Fergus Collins
