Global demand for cloud computing, driven in large part by the expansion of AI, is on an ever-increasing upward trajectory. This requires a constant expansion of data center capacity, both at existing sites and from new builds. Attempting to mitigate this expansion is the pursuit of the sustainable data center: facilities that can meet the world’s need for on-demand digital services whilst also being highly energy and water efficient, compatible with local environments, and emitting zero carbon.
The first challenge: Power Usage Effectiveness
Data centers have historically faced one issue after another when it comes to sustainability. Early hyperscale data centers, built to accommodate the rollout of cloud technology in the late 2000s, were designed primarily with output in mind: approximately 1.5 times as much energy was consumed by auxiliary infrastructure like cooling, lighting, and power supply (including wastage) as by the servers themselves (Statista, 2025). The metric that captures this, Power Usage Effectiveness (PUE), was coined by The Green Grid in 2007 and has been a benchmark for sustainable data center design for the last two decades. PUE expresses the ratio of a data center’s total energy consumption to the energy used by its computing equipment alone, so a PUE of 1 would mean 100% of a data center’s energy is being used purely on computing equipment. Improvements in the energy efficiency of hardware in the following years brought the average global PUE to around 2 (meaning that as much energy was being consumed by overheads as by computing equipment).
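The PUE calculation is simple enough to sketch in a few lines. The figures below are illustrative only, not drawn from any real facility:

```python
# Minimal sketch: computing Power Usage Effectiveness (PUE).
# All energy figures here are hypothetical, for illustration only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (always >= 1.0)."""
    return total_facility_kwh / it_equipment_kwh

# A facility whose overheads equal its IT load has a PUE of 2.0:
print(pue(total_facility_kwh=2_000_000, it_equipment_kwh=1_000_000))  # 2.0

# An efficient hyperscale facility: 15% overhead on top of the IT load.
print(round(pue(total_facility_kwh=1_150_000, it_equipment_kwh=1_000_000), 2))  # 1.15
```

A lower PUE means proportionally less energy is lost to cooling, lighting, and power distribution.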
The next step: Cooling
The next big issue to solve was the enormous energy cost of cooling data center hardware. By the mid-2010s, air-based cooling was the second largest consumer of data center power, after the computing equipment itself. The density of high-power hardware that data centers require generates vast amounts of heat: even with cooling, server rack temperatures can reach the top of the safe operating range of around 27°C. The advent of liquid cooling systems provided an alternative to the more energy-intensive air-based methods. Liquid cooling lowers a data center’s PUE because liquid transfers heat away from the servers more effectively than air alone, so less energy is needed overall. With air cooling, up to 20% of a server’s power can be consumed by its fans, whereas liquid cooling replaces server-specific fans with pumps that are shared by dozens of server racks (Data Center Frontier, 2024). Liquid cooling is a major contributor to the record-low PUEs of hyperscale cloud service providers like AWS (1.15), Microsoft (1.16), and Google (1.09), in contrast to the global average of 1.56 – a gap largely explained by air cooling still being the predominant cooling method across the 10,823 data centers around the world.
The inevitable side effect: Water scarcity
Liquid cooling may have lowered power consumption, and therefore emissions, but it created another sustainability issue: water scarcity. An open-loop liquid cooling system in a hyperscale data center can consume 19 million liters of water a day, the same as a town of 50,000 people (EESI, 2025). This puts stress on public water supplies and contributes to the depletion of freshwater habitats, an issue compounded by the popularity of water-scarce areas as data center construction zones, due to the sensitivity of data center hardware to humidity. Public backlash to headlines about the impact of water scarcity on local residents has led to the large cloud service providers investing in recycling and wastewater cooling systems; a notable example being Microsoft’s new Fairwater datacenter in Wisconsin, which (in theory) will recycle the water in its cooling system so efficiently that its yearly water consumption will be no more than a typical restaurant.
The unavoidable issue: Exponential capacity
With Microsoft’s energy-efficient PUE ratio and minimal water consumption, the Fairwater data center seems like a huge accomplishment in the pursuit of the sustainable data center. But even this cannot escape the fundamental crisis in sustainability facing the current data center industry: capacity. Fairwater might perform well on efficiency and water metrics, but it is just one of a vast crop of new data centers that will consume more power than ever before. As a specialist AI data center, Fairwater boasts hundreds of thousands of NVIDIA Blackwell GPUs, the newest development in AI processing hardware. At a bare minimum, this data center will consume as much power every day as a large US town:
100,000 NVIDIA Blackwell B200 GPUs at 50% utilization for 24 hours will consume 1,383,216 kWh of power (see the Tailpipe methodology for details).
Adding in Microsoft’s average PUE of 1.15 results in an overall power consumption figure of 1,590,698 kWh.
The average US household consumes 30 kWh per day, and the average US town comprises between 10,000 and 50,000 homes.
1,590,698 kWh is enough to power 53,023 homes for a day.
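The arithmetic above can be verified directly. The GPU energy figure comes from the Tailpipe methodology as quoted; the other inputs are as stated in the text:

```python
# Back-of-envelope check of the Fairwater figures quoted above.
# The 1,383,216 kWh GPU figure is taken from the Tailpipe methodology;
# all other numbers are as stated in the text.

gpu_kwh_per_day = 1_383_216   # 100,000 B200 GPUs at 50% utilization for 24 h
pue = 1.15                    # Microsoft's average PUE

facility_kwh_per_day = gpu_kwh_per_day * pue
print(round(facility_kwh_per_day))  # 1590698

household_kwh_per_day = 30    # average US household consumption per day
homes_equivalent = facility_kwh_per_day / household_kwh_per_day
print(round(homes_equivalent))      # 53023
```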
This trend is not specific to Fairwater: in 2025, data centers saw an average capacity increase of 17.7% (Visual Capitalist, 2025), and the number of hyperscale data centers worldwide has more than doubled since 2020 (SRG, 2025). The efficiency gained over the last two decades may well be outpaced by the sheer increase in compute capacity in the coming years: AWS, Microsoft, and Google have publicly released plans to increase the number of data centers they own by 78% (The Guardian, 2025). Greater data center capacity means greater demands on power grids, which leads to the final, and perhaps greatest, issue impeding the sustainable data center: energy matching.
The false equivalence: Energy matching
Energy matching is a strategy used by organizations to offset the carbon emissions that they generate through their operations. The vast majority of data centers are powered by the national grid of the country where they are located (rather than generating their own power), which means that their carbon emissions are directly proportional to the quantity of fossil fuels burned in powering the local grid. AWS, Microsoft, and Google all meet their self-imposed carbon reduction targets by matching the quantity of electricity that they purchase from fossil fuel-fed grids with the purchase of an equivalent quantity of renewable energy. This is beneficial in increasing the overall supply of renewable energy available to consumers and supporting the renewable economy, but it does nothing to reduce the carbon emissions actually generated by power-hungry data centers. As capacity increases, cloud service providers run the real risk of hiding their increasing carbon emission figures behind energy matching:
Google’s 2025 Sustainability Report:
For the last eight years, since 2017, we’ve matched 100% of our global electricity use with renewable energy purchases.
Google’s actual emissions: 51% increase from 2019 baseline year.
Amazon’s 2024 Sustainability Report:
100% of electricity consumed by Amazon was matched with renewable energy sources in 2024, for the second consecutive year.
Amazon’s actual emissions: 33.4% increase from 2019 baseline year.
Microsoft’s 2025 Sustainability Report:
We will reduce our Scope 1 and 2 emissions to near zero against a 2020 baseline by increasing energy efficiency, decarbonization of our operations, and reaching 100% renewable energy by 2025.
Microsoft’s actual emissions: 23.4% increase from 2020 baseline year.
Please note: the actual emissions figures above are taken from Google, Amazon, and Microsoft’s annual sustainability reports and cover all of their business operations; they do not publish emissions figures for their cloud provision alone.
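The gap between "100% matched" claims and rising actual emissions reflects two different accounting methods: market-based figures subtract renewable purchases, while location-based figures count the emissions actually caused on the local grid. A simplified sketch, using entirely hypothetical numbers, shows how matching zeroes out one figure without touching the other:

```python
# Simplified sketch of market-based vs location-based carbon accounting.
# All numbers are hypothetical, for illustration only.

consumption_mwh = 1_000_000           # electricity drawn from the local grid
grid_intensity = 0.4                  # tCO2e emitted per MWh of grid power
renewables_purchased_mwh = 1_000_000  # matched renewable purchases (100%)

# Location-based: emissions actually caused by drawing grid power.
location_based_tco2e = consumption_mwh * grid_intensity

# Market-based: grid consumption net of renewable purchases.
unmatched_mwh = max(consumption_mwh - renewables_purchased_mwh, 0)
market_based_tco2e = unmatched_mwh * grid_intensity

print(location_based_tco2e)  # 400000.0 – unchanged by the purchases
print(market_based_tco2e)    # 0 – the "100% matched" reported figure
```

The renewable purchases are real and valuable, but only the market-based figure reaches zero; the fossil fuels burned to power the facility are unaffected.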
Finally, a good example
So, what’s the solution? One example of a truly sustainable data center can be found in the EcoDataCenter facility in Falun, Sweden, which boasts the impressive combination of a PUE of 1.2 alongside a Water Usage Effectiveness (WUE) of 0.7 liters/kWh (EcoDataCenter Sustainability Report, 2025) – less than half of the industry average of 1.8 liters/kWh (Data Center Knowledge, 2025).
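WUE follows the same pattern as PUE: it divides a facility’s annual water consumption by the energy used by its computing equipment. A minimal sketch, with illustrative figures:

```python
# Minimal sketch: Water Usage Effectiveness (WUE), in liters of water
# consumed per kWh of IT equipment energy. Figures are hypothetical.

def wue(annual_water_liters: float, it_equipment_kwh: float) -> float:
    """WUE = annual site water usage / IT equipment energy."""
    return annual_water_liters / it_equipment_kwh

# A facility consuming 7M liters against 10M IT kWh scores 0.7 L/kWh,
# well under the 1.8 L/kWh industry average:
print(wue(annual_water_liters=7_000_000, it_equipment_kwh=10_000_000))  # 0.7
```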
It achieves this via a sustainable liquid cooling system that is balanced with natural air cooling, as well as efficiency-by-design principles in its hardware configurations. Designed for AI workloads with a total capacity of 80 MW, the Falun EcoDataCenter has the same compute capacity as many of the major cloud service providers’ data centers, and yet relies on 100% on-site renewable energy, rather than being powered by the local grid. On top of this, the facility transfers its excess server heat to a local thermal power plant, increasing the local supply of renewable energy.
This facility demonstrates the potential that data centers have to not only meet the computing needs of the digital age, but also to contribute to reducing global carbon emissions and help in the fight against climate change.
To build a truly sustainable data center, cloud service providers need to prioritize low PUE, implement a closed-loop water cooling system, draw power directly from 100% renewable energy sources, and find ways of harnessing the inevitable heat they generate. This is an ambitious project that is beyond the reach of many existing data centers because of the constraints of their infrastructure, but it must be prioritized for new builds as the rollout of new data centers continues.
Find Out More
Tailpipe reduces organizations’ cloud carbon emissions by recommending data centers that are based in low-carbon regions. For a detailed explanation of how Tailpipe can reduce an organization’s cloud spend and carbon emissions, see Tailpipe’s Recommendations.
To discuss what Tailpipe can do to measure and reduce your cloud computing spend and emissions, get in touch with us here.