Are Data Centers Bad for the Environment?
It Depends. Done Properly, They Could Be a Net Positive.
Since the early 2000s, a significant and ever-expanding fraction of human interaction — from banking and retail to entertainment and dating — has been mediated by vast arrays of computer servers siloed into electricity-devouring warehouses called data centers. The recent surge in artificial intelligence (AI) has supercharged this trend, as AI requires more processing power than conventional computing and thus larger, more voracious colonies of cerebrating silicon.
The largest investors in artificial intelligence — the so-called ‘hyperscalers’ like Amazon, Google, OpenAI, Microsoft, and Anthropic — are building gargantuan data centers around the world to service demand for their AI products. Meta Platforms, the owner of Facebook and Instagram, is currently constructing a 5-GW data center in Louisiana.1 On its own, it will consume three times as much electricity as the entire city of New Orleans and sprawl across a campus five times larger than its famed French Quarter.
There are currently just over 4,000 data centers in the United States, with a combined 75 GW of total facility power (TFP) capacity.234 Additionally, there are 875 data centers planned or under construction, representing a further 241 GW of TFP capacity.5
The speed and scale of this build-out have provoked a crescendoing backlash. Senator Bernie Sanders of Vermont has called for a nationwide moratorium on new data centers, motivated at least in part by their supposed environmental harms.67 The idea is rapidly gaining traction in the Democratic Party’s progressive wing, with Rep. Alexandria Ocasio-Cortez’s companion legislation in the House attracting 12 co-sponsors.
Progressive politicians are cresting a wave of public sentiment. A recent poll from the Pew Research Center showed 39% of Americans think data centers are bad for the environment, compared to 4% who think they are good, and 14% who claim they are neither.8 A Quinnipiac University poll has found that 65% of Americans oppose the construction of data centers in their communities, with only 24% in favor.9
The notion that AI data centers constitute a uniquely grave planetary menace has become firmly embedded in the popular imagination. Social media posts lambasting AI’s environmental effects regularly go viral — including the one above, which (earnestly) implies AI usage was directly responsible for last month’s heatwave in California. Demonstrations against AI data centers have erupted across the globe, where protest signs charmingly remind the assembled that “you can’t drink data.”10
But are data centers truly harmful to the environment?
The science paints a much more complex picture than the public debate. The worst data centers are powered by coal or other fossil fuels and use cooling water inefficiently; they do indeed cause environmental harm. The best data centers are powered by low-carbon energy like wind, solar, or geothermal and use very little water in their cooling operations, or even none at all. These data centers are at worst neutral for the environment and can actually be beneficial: if properly tariffed and taxed, they provide revenues to finance the electrical grid upgrades needed for the energy transition away from fossil fuels.
This more nuanced portrait suggests banning data centers on environmental grounds is not necessary. Rather, data centers should be regulated to ensure they both disclose their impacts and comply with strict emissions and water-use standards.
Gardez votre sang-froid
The environmental concerns surrounding data centers can be broken down into three basic categories: their land use, their water use, and the emissions associated with their energy use.
Of the three issues, land use is clearly the most trivial and easily dismissed. According to the most recent estimates, data centers in the United States occupy about 291 square kilometers of land, which is only 0.0032% of land in the United States.11 For comparison, this is less than the amount currently used for growing blueberries.12 Almond production takes up 17 times more land than data centers, and golf courses use 31 times more land (both also use enormous amounts of water and cause pesticide runoff pollution).1314
Even if data centers multiplied 100-fold over the next few decades, they would still occupy only a fraction of 1% of the United States’ land area. This compares to soybean production at around 3.7%, grain production at around 6%, and beef production at nearly 30%.1516 On a list of economic activities that consume land, data centers don’t even crack the top 100 and likely never will.
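The land-use arithmetic above is easy to verify. A back-of-envelope sketch, using the article's ~291 km² footprint estimate and an approximate US land area of ~9,147,000 km² (an assumption; the exact denominator varies slightly by source):

```python
# Back-of-envelope check of the land-use figures cited above.
# Assumed inputs: ~291 km^2 of US data center footprint (see footnote)
# and ~9,147,000 km^2 of US land area (approximate, land only).

DATA_CENTER_KM2 = 291
US_LAND_KM2 = 9_147_000

share_pct = DATA_CENTER_KM2 / US_LAND_KM2 * 100
print(f"Data center share of US land: {share_pct:.4f}%")  # ~0.0032%

# Even a hypothetical 100-fold build-out stays well under 1% of US land.
share_100x_pct = share_pct * 100
print(f"Share after a 100x build-out: {share_100x_pct:.2f}%")  # ~0.32%
```

Even under the most aggressive growth assumptions, the share stays a rounding error next to agriculture.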
Concerns about data center water usage, however, are more serious. All computers – from a laptop singeing your leg to the servers housed in data centers – generate waste heat. For data centers to function properly, this heat must be removed from the building. The most commonly employed cooling method for large data centers is known as evaporative cooling — which cools indoor air by transferring its heat to water and then evaporating it into the ambient atmosphere.
Evaporative cooling is a water-consumptive process. Its total water consumption depends on the exact method employed and the cooling needs of the data center (which can vary with climate). The industry-standard estimate for Water Use Efficiency (WUE) in legacy (mostly non-AI) data centers is 1.9 L/kWh.17 The newer, even larger data centers used for AI are much more efficient — dragging the weighted national average down to only 0.36 L/kWh in 2024.18
In addition to the water used directly in cooling, data centers also consume water indirectly via their demand for electricity. Most electricity in the world is still made by boiling water to generate steam, which in turn spins the turbine of a generator. While the steam itself is recycled, copious amounts of water are consumed to cool the system and condense the steam back into liquid. Coal generation plants consume between 1.5–2.5 L/kWh of water for this purpose, and fossil gas plants between 0.4–1 L/kWh. Nuclear is the most wasteful thermal generation technology at between 2.2–3.2 L/kWh. Water usage for solar, on the other hand, is less than 0.03 L/kWh (for cleaning the panels), and the figure for wind is essentially zero.19 On average, 75% of data center water use comes via the electric power plants (coal, fossil gas, hydro, and nuclear) that provide their electricity.2021
Yet even when counting both direct usage for cooling and indirect usage from electricity — data centers represent only 0.3% of all consumptive water use in the United States.22 Golf courses use 50% more water, and the almond-industrial complex swallows more than four times as much.232425 Agriculture in aggregate accounts for ~90% of all water consumption in the United States.26 The amount of water wasted by the agriculture industry through inefficient irrigation practices alone is 61 times greater than the entire consumption of Big Tech and its data centers.2728
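The direct-plus-indirect accounting above can be sketched as a simple per-kWh sum. The inputs here are illustrative, taken from figures cited in this article and its footnotes (legacy direct WUE of ~1.9 L/kWh; a grid-average water intensity of ~7.0 L/kWh, which includes hydro reservoir evaporation):

```python
# Sketch of total water footprint per kWh of data center load:
# direct cooling use plus indirect use at the power plants supplying
# the electricity. Figures are illustrative assumptions from the text.

def total_water_per_kwh(direct_wue_l, grid_intensity_l):
    """Direct (cooling) plus indirect (generation) water use, in L/kWh."""
    return direct_wue_l + grid_intensity_l

direct = 1.9    # legacy evaporative cooling, L/kWh
indirect = 7.0  # assumed grid-average water intensity, L/kWh

total = total_water_per_kwh(direct, indirect)
indirect_share = indirect / total
print(f"Total: {total:.1f} L/kWh; indirect share: {indirect_share:.0%}")
# The ~79% indirect share is in the same ballpark as the article's claim
# that ~75% of data center water use comes via the power plants.
```

The takeaway: for a typical facility, what the grid drinks matters more than what the cooling towers do.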
It is true that in places with high data center concentration, like Northern Virginia, AI water usage share can approach 3%. But this number is still not sufficiently large to meaningfully threaten local resources. The widely propagated narrative that water consumption from AI data centers is important enough to impair agricultural production or drinking water supplies is not supported by evidence.
There is one wrinkle in the conclusion that AI water usage is entirely a chimeric threat born of moral panic, however. The leaders of artificial intelligence companies like OpenAI and Anthropic promote AI as the most transformative technology ever invented and promise its economic returns will eclipse those of every other industry in human history. If this is true, a 20-fold increase in the data center fleet over the next half-century is not entirely outside the realm of possibility. Were data centers to indeed vigintuple while still predominantly using evaporative cooling and thermal power plants, their water usage would cause serious, localized ecological impacts in areas with high data center concentration.
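A first-order estimate of that 20x scenario: scale the current ~0.3% national consumptive-water share by 20, holding all other water uses constant (a simplifying assumption, since the denominator would also shift):

```python
# First-order estimate of the 20x ("vigintuple") scenario: scale the
# current ~0.3% national consumptive-water share by 20, holding all
# other water uses constant (a simplifying assumption).

current_share_pct = 0.3  # data centers' share of US consumptive water use
growth_factor = 20

projected_share_pct = current_share_pct * growth_factor
print(f"Projected national share: {projected_share_pct:.0f}%")  # ~6%

# Nationally still modest -- but in hubs like Northern Virginia, where
# the local share already approaches 3%, the same multiplier implies
# serious localized strain.
```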
This worst-case scenario, however, is vanishingly unlikely given improvements in cooling technology and a rapidly changing electricity generation mix. Hyperscalers are increasingly using closed-loop systems for cooling, cutting water use drastically. Remarkably advanced liquid-to-chip technology — in which cooling fluid circuits are built directly into each chipset — has already debuted and will likely bring AI cooling WUEs close to zero in the next few years.
On the power generation side, the increasing use of wind and solar energy to generate electricity will also reduce data center water consumption. Some AI hyperscalers are foolishly planning to build off-grid data center campuses powered by on-site fossil gas generators — walling themselves off from a greening electric grid. Most data centers, however, will benefit from a grid that is gradually decreasing both its carbon emissions and its water footprint.
The AI data center water issue isn’t “fake,” to cite the designation of one famous essay on the subject. Yet, at present, it isn’t an especially serious problem either. It is best thought of as an embryonic threat — one that is best nipped in the bud by forcing data centers to be water efficient and to power themselves with low-carbon, low-water sources of energy like wind and solar.
Methanum non incendendum est
The other serious environmental concern over data centers is their effect on climate change. Globally, data centers account for just 0.5% of carbon emissions.29 That is less than half the carbon footprint of a single American oil company, Exxon (1.3% of global CO2 emissions), less than one-eighth the footprint of the world’s largest oil company, Saudi Aramco (4.3%), and around one-sixteenth the footprint of the global cement industry (~8%).
Today’s absolute numbers are relatively small. But the trend line for data center carbon emissions is more worrisome than that for water use. The International Energy Agency (IEA) projects data center greenhouse gas (GHG) emissions will hit 1% of the global total by 2030. If AI usage growth goes hyperbolic (and AI boosters keep promising us it will), a 10-fold increase in data center capacity at current carbon intensity levels would be catastrophic for climate change. Under those conditions, data centers could account for between 10% and 15% of global GHG emissions by 2040 and push emissions higher than baseline projections by about 10%.30
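The projection method described in the footnote — multiply today's data center emissions by 10 and compare against a declining baseline — can be replicated in a few lines. All absolute figures below are hypothetical round numbers chosen for illustration, not IEA data:

```python
# Rough replication of the projection method described in the footnote:
# multiply today's data center emissions by 10 and compare against an
# assumed 2040 baseline. The absolute Gt figures are hypothetical.

current_global_ghg_gt = 50.0  # assumed total global GHG today, Gt CO2e
dc_share_now = 0.005          # data centers at ~0.5% of the total
dc_now_gt = current_global_ghg_gt * dc_share_now  # 0.25 Gt

dc_2040_gt = dc_now_gt * 10   # the 10x growth scenario
baseline_2040_gt = 20.0       # hypothetical declining 2040 baseline

share_2040 = dc_2040_gt / baseline_2040_gt
print(f"Data center share of 2040 emissions: {share_2040:.1%}")  # 12.5%

overshoot = (dc_2040_gt - dc_now_gt) / baseline_2040_gt
print(f"Emissions above baseline: {overshoot:.1%}")  # ~11%
```

Under these assumed inputs, the result lands inside the article's 10–15% range; the sensitivity is almost entirely to how fast the baseline declines.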
There is perhaps some scope for reducing data center emissions by increasing the energy efficiency of AI chips.31 But a much more promising and far simpler strategy exists — don’t power them with fossil fuels.
Historically, Big Tech companies procured as much of their electricity as possible from clean sources. The economic imperatives of the AI race, however, have turned once-responsible corporate citizens into ferocious, fossil-gulping atmospheric arsonists. Companies like Amazon have faced criticism at several sites nationwide for overuse of emissions-intensive diesel generators, which are supposed to be deployed strictly during emergencies but are increasingly being relied upon as a primary source of power.3233
Elon Musk’s xAI, maker of the Grok LLM chatbot, has taken this concept a step further — assembling 27 gas turbines intended for temporary use at remote oil and gas fields into a makeshift off-grid power plant at a data center in Southaven, MS. The project has attracted significant opposition from local residents who cite the noise and air pollution.34 Undeterred, the company plans to expand its fleet to 41 turbines.35
Microsoft — a sustainability leader that pioneered carbon removal and waterless cooling technologies — is planning to build a massive 1.4-GW fossil gas plant in West Virginia.36 Google — which is building a state-of-the-art low-carbon data center in Minnesota — is also building a dirty, gas-powered data center in northern Texas that will produce more carbon emissions than the entire city of San Francisco.37
Notably, the dirtiest projects are located in states with notoriously weak environmental standards, suggesting the hyperscalers have already provoked a nationwide race-to-the-bottom that imperils decades of work to limit air pollution and carbon emissions.
In some cases (notably xAI), the return to fossil generation reflects a genuine IDGAF attitude toward climate change. In others (Google and Microsoft), it is probably more of a scattershot redundancy strategy designed to ensure access to generation if renewables and battery storage fail to scale to meet demand. Regardless of the motivation, this hodgepodge, ‘all-of-the-above’ approach simply will not work if our goal is to prevent data center carbon emissions from exploding.
Yet there are two reasons why a blanket moratorium specifically would be counterproductive. First, data centers have historically played a vital role in the development of new low-carbon green technologies. Around 2010, data centers began signing power purchase agreements (PPAs) that helped scale wind and solar energy in the United States. Although the AI boom has Big Tech flirting with fossil gas again, it is still voluntarily subsidizing GreenTech to a substantial degree.
Google, for example, is heavily invested in geothermal and iron-air batteries via Fervo Energy and Form Energy, respectively. These companies are expected to make significant contributions to the global fight against climate change — and neither might have survived to the present day without Google and its data centers. The next such companies might never exist if data center construction were halted.
Second — and far more importantly — the AI data center boom can be harnessed to directly combat climate change. Burning fossil fuels for home heating accounts for around 5% of US carbon emissions. Burning gasoline for road transport accounts for an additional 16% of emissions. The only way to eliminate these emissions is by electrifying home heating with heat pumps and replacing gas-powered cars with electric vehicles (EVs).
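The electrification opportunity in numbers, using the shares cited above (~5% of US emissions from home heating, ~16% from road transport):

```python
# Share of US emissions addressable by electrifying home heating and
# road transport, using the percentages cited in the text.

heating_share_pct = 5
road_transport_share_pct = 16

electrifiable_pct = heating_share_pct + road_transport_share_pct
print(f"Emissions addressable by electrification: {electrifiable_pct}%")  # 21%
# Roughly one-fifth of US carbon emissions.
```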
The problem, however, is that the current electric grid does not have enough capacity to handle the additional load. Upgrades to electrical transmission and distribution infrastructure are necessary, and they will be extremely expensive, with the cost normally passed on to consumers. Thankfully, these same upgrades will also be necessary for the AI data center build-out, and — as it happens — AI companies have a lot of money.38
Compelling AI data center builders to fund the infrastructure necessary for their own use and for heat pumps and EVs turns a negative environmental impact into an enormously climate-positive 3-for-1 deal — one policymakers around the world would be foolish not to make.
In delay there lies no plenty
Done poorly, the AI data center build-out could increase carbon emissions by around a tenth, significantly setting back the fight against climate change. Done well, it could be used to decarbonize industries representing one-fifth of carbon emissions, accelerating progress toward net-zero.
The data center boom’s profoundly contingent climate trajectories make the Sanders–Ocasio-Cortez moratorium an especially suboptimal policy from an environmental perspective.39 Lawmakers should not be standing still — they should be actively steering data centers away from a climate-negative impact toward a climate-positive one.
Moreover, the restrictionist campaign is arguably an error of political logic. AI data center construction is an opportunity to force Big Tech companies to subsidize the fight against climate change and lower electricity bills for everyone else. The progressive and populist move (or even, more simply, the rational move) would be to take advantage of this moment — not to let it pass by.
To that end, lawmakers should ditch plans for moratoriums and instead enact a policy framework that requires new data centers to pay for their own new, low-carbon generation. This concept has been baptized BYONCE (Bring Your Own New Clean Energy) by energy policy experts, a winking nod to an obscure Texan recording artist once famous for fronting the ’90s girl group Destiny’s Child.40
In the United States, 100% BYONCE mandates should be adopted by states now and expanded to the federal level when the political winds allow. Regulators should also mandate data centers have best-in-class Water Use Efficiency scores (say, under 2.5 L/kWh – including both direct and indirect use).41 And the worst practices of the data center industry — such as running dirty diesel generators or building massive off-grid fossil gas plants — should be proscribed outright. Additionally, hyperscalers should be coaxed into paying for the transmission and distribution upgrades necessary for both their own needs and the wider needs of the energy transition.
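The proposed WUE standard lends itself to a simple compliance test: total water use per kWh, direct plus indirect, must come in under 2.5 L/kWh. A minimal sketch, with two hypothetical facilities built from intensities cited earlier in the article:

```python
# Minimal sketch of the proposed WUE standard: combined direct (cooling)
# and indirect (generation) water use must be under 2.5 L/kWh.
# The two example facilities and their figures are hypothetical.

WUE_LIMIT_L_PER_KWH = 2.5

def complies(direct_l_per_kwh, grid_water_l_per_kwh):
    """True if combined direct + indirect WUE is under the standard."""
    return (direct_l_per_kwh + grid_water_l_per_kwh) < WUE_LIMIT_L_PER_KWH

# Solar-powered site with closed-loop cooling: ~0.1 direct + ~0.03 indirect.
print(complies(0.1, 0.03))  # True -- passes comfortably

# Legacy evaporative cooling on a gas-heavy grid: ~1.9 direct + ~0.7 indirect.
print(complies(1.9, 0.7))   # False -- 2.6 L/kWh exceeds the standard
```

The design choice of counting indirect use is what gives the standard teeth: it rewards facilities for choosing low-water generation, not just efficient cooling hardware.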
The toolkit to green data centers exists and includes mature renewables like wind and solar, new renewables like enhanced geothermal (EGS), closed-loop and liquid-to-chip cooling, and wonkish policy concepts like demand response, virtual power plants (VPPs), large load tariffs, speed-to-power incentives, and 24/7 carbon-free energy matching.
It is indisputably possible to power data centers with 100% low-carbon energy, reduce their water use by 90%, and use their construction to underwrite electrical grid upgrades. Any specific data center regulation, law, or policy mixture should be in service of these three very laudable and attainable goals that collectively deliver critical environmental and economic benefits to the public. Eventually, the data center industry’s Scope 3 emissions — the carbon impacts embedded in cement, steel, and chip manufacture — must also be addressed with similar mandates.
There are many reasons why humanity might ultimately decide to halt the development of AI — from the existential risks it poses to our species’ survival to concerns about mass labor displacement and the exacerbation of income inequality. The environment, however, need not be one of them. AI data centers are not inordinately harmful to the environment. There is a clear pathway to dramatically reduce their impact and transform them into engines of the energy transition away from fossil fuels.
Should society decide to continue developing artificial intelligence, the consequent environmental issues are completely solvable. Rather than bemoaning their existence and exaggerating their scale — we should simply employ our natural intelligence and solve them.
Total Facility Power (TFP) is a measure of data center power capacity that takes into account cooling and other ancillary loads in addition to the load drawn by the IT hardware itself. It is a more accurate measure of a facility’s impact on the grid than nameplate IT capacity.
The total figure for US data center land footprint is estimated at 72,000 acres (291 square km). This is a bespoke calculation done by breaking the total number of data centers in the United States into 7 tiers and multiplying by average site size across each tier.
In the United States, golf courses use about 10,117 square km of land — an area almost four times the size of Rhode Island.
https://monarchsintherough.org/monarchs-in-the-rough-experiencing-fast-start/
In the United States, almonds use about 5,463 square km of land — an area about the size of Delaware.
https://esmis.nal.usda.gov/sites/default/release-files/zs25x846c/mc87rn20c/w37656321/ncit0525.pdf
In 2021, researchers estimated the average WUE for the US electric grid at ~7.0 L/kWh. This figure is higher than the estimates for the individual thermal power components due to the outsized water usage of hydroelectric dams, transmission losses, and scope 3 consumption of utilities.
The latest estimate of United States golf course industry water usage is 1.63 million acre-feet annually (531.1 billion gallons).
https://journals.ashs.org/view/journals/horttech/35/5/article-p848.xml
The latest estimates of United States almond industry water usage range between 4.7 and 5.5 million acre-feet annually (1.53–1.79 trillion gallons).
https://www.c-win.org/cwin-water-blog/2024/9/23/california-almond-water-usage-updated
These figures are calculated using the widely cited figure of 20% irrigation losses. This only includes water wasted through leaky pipes, uncovered canals, poorly targeted delivery methods like flood or sprinkler irrigation, etc. An equal amount of water is wasted via food waste (i.e. the water left in uneaten food).
These are my own calculations using a simple 10x multiplication of current data center emissions and comparing to the IEA’s STEPS moderate transition path.
Any efficiency gains risk being eaten away by Jevons paradox, the observation that improvements in resource efficiency often lead to increasing consumption as falling costs stimulate more demand.
The value of AI compute can be up to 30 times higher than the cost of electricity — giving Big Tech companies an economic rationale to pay above-market rates for electricity that no other sector can match.
The Sanders–Ocasio-Cortez moratorium is — additionally — a suboptimal policy for mitigating any of the other risks of Artificial Intelligence. See Nat Purser’s Pausing isn’t policy:
BYONCE is most closely associated with Jesse Jenkins of the Princeton ZERO Lab and colleagues. See:
24/7 carbon-free electricity matching accelerates adoption of advanced clean energy technologies
Flexible Data Centers: A Faster, More Affordable Path to Power
And the following podcast:
A 2.5 L/kWh standard would have the additional benefit of forcing nuclear to use co-generation to improve its WUE.