In 2020, all the data centers in the world used less than 300 TWh of electricity, around 1% of global electricity consumption (MIT, 2025). In 2024, they consumed 415 TWh, reaching 1.5%. By 2030, they are expected to consume 945 TWh – that’s 3% of global electricity consumption (IEA, 2025), and more than the entire aviation industry (IEA, 2025). Why? Because the exponential growth in AI development and usage is more power-hungry than any computing process that came before it.
As the internet becomes increasingly over-saturated with AI content, we must consider: is the environmental cost worth the output?
The Impact of AI on Energy Consumption
Any request using the internet, whether for AI, search engines, web hosting, emails, or streaming, requires energy. The servers that process these requests are housed in data centers powered by electricity. In most countries, this electricity is drawn from fossil-fuel-powered grids, resulting in the release of carbon emissions.
The biggest driver of the increase in data center energy consumption in the past five years has been AI. An AI request today requires roughly ten times as much electricity as a search engine request (IEA, 2024):
Google search – 0.3 Wh
ChatGPT request – 3 Wh
If AI does drive data center energy consumption to 945 TWh, data center emissions in 2030 will reach 447 million tonnes of CO2e (based on a global grid mix of 473 gCO2e/kWh (Ember, 2024)). To keep global temperatures below the 1.5°C threshold agreed in the Paris Agreement, the world has just 160 billion tonnes of CO2 left in its carbon budget. 447 million tonnes is 0.3% of that budget – almost as much as the entire population of Brazil emits every year (Our World in Data, 2023).
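The figures above follow from a few lines of arithmetic. This sketch simply recombines the numbers already cited in this article (945 TWh projected consumption, a 473 gCO2e/kWh global grid mix, and a 160 billion tonne remaining carbon budget); nothing new is assumed.

```python
# Projected 2030 data center emissions, using the figures cited above.
projected_consumption_twh = 945          # IEA, 2025
grid_intensity_g_per_kwh = 473           # global average grid mix (Ember, 2024)
remaining_budget_tonnes = 160e9          # remaining 1.5 degree C carbon budget

kwh = projected_consumption_twh * 1e9    # 1 TWh = 1 billion kWh
emissions_tonnes = kwh * grid_intensity_g_per_kwh / 1e6  # grams -> tonnes

print(f"{emissions_tonnes / 1e6:.0f} million tonnes CO2e")        # 447
print(f"{emissions_tonnes / remaining_budget_tonnes:.1%} of budget")  # 0.3%
```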
Why Does it Matter?
As the use of AI skyrockets and associated emissions climb, the internet is becoming ever more saturated with AI content – and this is laying the foundations for a total AI model collapse. In that scenario, the millions of tonnes of CO2 that have been (and have yet to be) released into the atmosphere because of AI will have been for nothing.
AI model collapse occurs when AI models consume AI-generated content during training. With more than half of the text on the internet already AI-generated or AI-translated (AWS & UC Santa Barbara, 2024), AI models are currently being trained on a vast pool of non-human content. Models trained on AI content produce poor-quality outputs that become nonsensical after a few iterations of training (Nature, 2024). For example, in this study, when asked about historic church architecture, the first iteration model confused the 18th and 19th centuries, the second iteration hallucinated the existence of a St Peter’s Basilica in Buenos Aires, the sixth iteration responded with nothing but a list of languages, and the tenth iteration answered:
‘In addition to being home to some of the world’s largest populations of black tailed jackrabbits, white tailed jackrabbits, blue tailed jackrabbits, red tailed jackrabbits, yellow.’
Without fresh human-generated training data, or a fundamental change in the way AI works, AI models will continuously degrade. If current trends continue, the 447 million tonnes of CO2e generated by data centers in 2030 could be fuelling an ever more unusable pool of AI-generated content.
What Can We Do About It?
Educate individuals about the energy consumption of AI.
The most important first step is for individuals to become aware of how energy intensive AI can be. Most people are unaware that AI – much like search engine queries – even has a carbon impact. In particular, organizations that have adopted AI have a responsibility to educate their employees on the environmental impact of its use.
By quantifying how much energy AI uses in relatable terms, individuals can better weigh up when it is worth using. For example, the 3 Wh of energy needed for a ChatGPT request is about the same as charging a phone to 100% (EnergySage, 2024).
Be selective about how AI is used.
The next step is to consider the most efficient use cases for AI. AI-enhanced processes can be more productive and less energy intensive than a person working alone, usually by cutting down the time it takes for an individual to process information. On the flip side, mandatory AI processes can generate emissions unnecessarily – for example, customer service chatbots that cannot be bypassed when a customer needs input from a real agent, or AI-generated content that then requires human assessment and editing.
AI tools should be implemented only when their use is shown to be more effective and less energy intensive than an employee working alone.
Consider where your energy is coming from.
One of the most proactive things an organization can do to reduce the carbon emissions of its AI usage is to locate the infrastructure on which it runs – whether owned by the organization or operating in the cloud – in a data center where the electricity supply has a low carbon intensity. In most cases, organizations can request that their services be based in specific data centers, with a manageable impact on operations or cost.
Because data centers are connected to the national grid of the country they are in, the carbon intensity of that country’s or region’s grid determines how much carbon the computing processes within that data center produce. For example, cloud services operating through a Norwegian data center will generate far fewer emissions than the same services based in a data center on the East Coast of the US, because Norway’s national grid draws a much higher percentage of its energy from renewable sources than the US East Coast’s does.
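The scale of the difference is easy to see with an order-of-magnitude comparison. The carbon intensities below are illustrative assumptions, not cited figures: hydro-dominated grids such as Norway’s sit in the tens of gCO2e/kWh, while fossil-heavy regional grids sit in the hundreds; the workload size is hypothetical.

```python
# Same hypothetical cloud workload, two grids with assumed carbon intensities.
workload_kwh_per_year = 100_000  # hypothetical annual cloud workload

intensities_g_per_kwh = {
    "Norway (mostly hydro)": 30,        # assumed, order-of-magnitude only
    "US East Coast (gas-heavy)": 350,   # assumed, order-of-magnitude only
}

for grid, intensity in intensities_g_per_kwh.items():
    tonnes = workload_kwh_per_year * intensity / 1e6  # grams -> tonnes
    print(f"{grid}: {tonnes:.1f} t CO2e/year")
```

Under these assumptions the identical workload emits roughly ten times more on the fossil-heavy grid – which is why location is one of the highest-leverage decisions an organization can make.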
Currently, 45% of the world’s data center capacity belongs to the US, followed by China (MIT, 2025). Unfortunately, these two countries have very carbon-intensive grid mixes, with gas the primary energy source in the US and coal in China. To stay below the 1.5°C threshold, every country must entirely decarbonize its grid by 2050. Fortunately, many European and South American nations are investing heavily in data center construction, building on grids with much higher shares of renewable and low-carbon energy. This opens up opportunities for organizations to lower their AI emissions by requesting that their cloud services be located on either continent.
Find Out More
If you’re concerned about the impact of AI on your organization’s carbon emissions and would like to quantify and reduce your impact, get in touch with the Tailpipe team here.