Much of the public discussion about artificial intelligence today revolves around jobs, creativity and productivity. Analysts argue that AI will automate routine work, augment human ingenuity and boost economic growth, while critics warn of job displacement and inequality. These debates are important and grounded in real economic shifts. Yet an equally consequential issue receives far less sustained attention: the environmental cost of AI. The infrastructure that powers AI — sprawling hyperscale data centres — consumes enormous amounts of electricity and water, generates intense heat, and exerts mounting pressure on local ecosystems and communities.
Growing electricity demand
The rapid growth of AI applications — from generative assistants to predictive analytics — has made data centres one of the fastest-growing consumers of electricity globally. According to the International Energy Agency’s Energy and AI report, global data centres consumed about 415 terawatt-hours (TWh) of electricity in 2024, roughly 1.5% of all global electricity demand. This figure could nearly double to about 945 TWh by 2030, a level comparable to the total annual electricity consumption of a major industrial economy such as Japan. AI workloads — both training and inference — are the fastest-growing drivers of this demand.
In the United States, the IEA estimates that data centres accounted for about 4.4% of national electricity consumption in 2023, up from around 1.9% in 2018, with projections to rise further by the end of the decade. In practical terms, planned AI-focused facilities can require massive amounts of electricity — some needing over 1 gigawatt (GW) of power, enough to supply around 850,000 average US homes. This strain on grids has even led some companies to plan their own gas-fired power generation to meet demand, raising concerns over fossil fuel dependence.
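The "1 GW for roughly 850,000 homes" comparison can be sanity-checked with simple arithmetic. The sketch below (not from the source) divides 1 GW across that many homes and converts the result to annual kilowatt-hours, which lands near the roughly 10,000–11,000 kWh a typical US household uses per year.

```python
# Back-of-envelope check of the "1 GW ~ 850,000 US homes" comparison.
# The ~10,000-11,000 kWh/year typical-household figure is an assumption
# drawn from published US averages, not from the article itself.

HOMES = 850_000
GIGAWATTS = 1.0
HOURS_PER_YEAR = 8760

avg_kw_per_home = GIGAWATTS * 1e6 / HOMES            # GW -> kW, per home
annual_kwh_per_home = avg_kw_per_home * HOURS_PER_YEAR

print(f"Implied average draw: {avg_kw_per_home:.2f} kW per home")
print(f"Implied annual use:   {annual_kwh_per_home:,.0f} kWh per home")
```

The implied figure of roughly 10,300 kWh per home per year is consistent with average US residential consumption, so the comparison holds up.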
The size of this electricity footprint also dwarfs traditional computing. Generative AI queries are significantly more energy-intensive than conventional internet searches. Engineering estimates have placed the energy consumption of a standard Google search at approximately 0.3 watt-hours (Wh), whereas a single generative AI query can use around 2.9 Wh — nearly 10 times more energy per query.
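The gap between the two per-query figures, and what it implies at scale, can be made concrete. The sketch below uses the 0.3 Wh and 2.9 Wh estimates cited above; the one-billion-queries-per-day volume is a hypothetical illustration, not a sourced figure.

```python
# Per-query energy comparison using the estimates cited in the text.
# Query volume below is hypothetical, chosen only to illustrate scale.

SEARCH_WH = 0.3    # Wh per conventional web search (engineering estimate)
AI_QUERY_WH = 2.9  # Wh per generative AI query (engineering estimate)

ratio = AI_QUERY_WH / SEARCH_WH
print(f"Energy ratio: {ratio:.1f}x per query")  # nearly 10x

# Illustrative scale-up: 1 billion AI queries per day for a year
queries_per_day = 1_000_000_000
annual_twh = AI_QUERY_WH * queries_per_day * 365 / 1e12  # Wh -> TWh
print(f"Annual energy at 1B AI queries/day: {annual_twh:.2f} TWh")
```

Even at a hypothetical one billion queries per day, generative AI queries alone would draw on the order of a terawatt-hour per year, before counting training runs.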
Heat production and cooling
Every watt of electricity used by a server ultimately converts into heat. In traditional data centres, heat generation was manageable with existing cooling systems. But AI clusters — with dense racks of high-performance GPUs — create much more heat per square foot of server space. Cooling systems, including air conditioning, evaporative chillers and liquid cooling, are themselves major energy consumers, often accounting for 30% or more of a data centre’s total power use.
Unlike typical commercial buildings, data centres operate 24/7 at high intensity. The heat generated must be continuously removed to prevent hardware failure, meaning that electricity demand for cooling scales directly with AI workload density. As AI adoption grows, so does the energy “overhead” needed for thermal management, multiplying total grid demand beyond computational loads.
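The way a fixed cooling share inflates total grid demand can be shown with the 30% figure cited above. The sketch below (an illustration, not a sourced calculation) assumes a hypothetical facility and, for simplicity, treats all non-cooling load as compute.

```python
# Sketch of cooling "overhead" on grid demand, using the ~30%-of-total
# cooling share cited in the text. The IT load value is hypothetical,
# and non-cooling load is treated as entirely IT for simplicity.

it_load_mw = 70.0      # hypothetical compute (IT) load, in MW
cooling_share = 0.30   # cooling as a fraction of total facility power

# If cooling is 30% of the total, the IT load is the remaining 70%:
total_mw = it_load_mw / (1 - cooling_share)
cooling_mw = total_mw * cooling_share

print(f"Total facility demand: {total_mw:.0f} MW")
print(f"Of which cooling:      {cooling_mw:.0f} MW")
```

In this illustration, 70 MW of compute becomes 100 MW of grid demand, so every additional megawatt of AI workload pulls extra cooling power along with it.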
Water consumption
Electricity and heat are only part of the story. Cooling systems often require large volumes of freshwater. According to the World Economic Forum, US data centres consumed approximately 17.4 billion gallons of water for cooling in 2023, and this total is projected to quadruple by 2028 if current growth patterns persist.
It is estimated that average AI data centres use “billions of gallons of water” annually, and larger AI facilities under construction may consume as much electricity as 2 million households while using similarly vast quantities of water. A comprehensive study by researchers at Cornell University paints a stark picture: if AI data centre expansion continues at current rates, US facilities could collectively drain 731 to 1,125 million cubic metres of water per year by 2030, enough to support the annual residential water use of 6 to 10 million Americans. Locating data centres in water-scarce regions such as Nevada and Arizona amplifies the strain.
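The Cornell projection can be cross-checked by converting the annual volumes into per-person daily use. The sketch below (my arithmetic, not the study's) compares the implied figures with typical US residential consumption of roughly 300–380 litres per person per day, an assumption drawn from published US averages.

```python
# Sanity check on the projected 731-1,125 million m3/year range: convert
# each endpoint to litres per person per day for the stated 6-10 million
# people, and compare against typical US residential use (~300-380 L/day,
# an assumed benchmark not taken from the article).

LITRES_PER_M3 = 1000
DAYS_PER_YEAR = 365

for total_m3, people in [(731e6, 6e6), (1_125e6, 10e6)]:
    litres_per_person_day = total_m3 * LITRES_PER_M3 / people / DAYS_PER_YEAR
    print(f"{total_m3 / 1e6:,.0f}M m3 across {people / 1e6:.0f}M people: "
          f"{litres_per_person_day:.0f} L/person/day")
```

Both endpoints land in the 300–340 L/person/day range, consistent with typical residential use, which supports the study's "6 to 10 million Americans" framing.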
In drought-prone regions such as parts of India, similar concerns have been raised. An Outlook India report predicts that 60–80% of India’s data centres will face high water stress this decade, with total water use in India’s growing data centre network projected to rise from about 150 billion litres in 2025 to about 358 billion litres by 2030.
Community impacts
Beyond abstract resource graphs, communities are already experiencing AI infrastructure’s environmental toll. In the US, public opposition to data centre expansion is growing sharply amid concerns about energy, water and quality of life.
In Loudoun County, Virginia, often called the “data centre capital of the world”, residents have protested the rapid arrival of hyperscale facilities, citing not only electricity and water strain but increased noise, industrial traffic and reduced quality of life.
Carbon emissions and grid strain
Electricity demand from AI data centres would be relatively benign for the climate if it were met entirely by renewable energy. In many regions, however, significant portions of power still come from fossil fuels. Studies estimate that US data centre emissions could rise by 24–44 million metric tons of CO2 annually by 2030 if AI growth continues without major decarbonisation — equivalent to adding millions of cars to American roads. Global analyses suggest that even if parts of the data centre sector adopt renewables, the sector’s total emissions could still increase substantially by 2030 as AI workloads scale faster than the grid decarbonises. The OECD and other research bodies warn that this trend could prolong reliance on fossil fuels and delay coal plant retirements in some regions.
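The "millions of cars" comparison follows directly from a standard per-vehicle emissions figure. The sketch below assumes a typical US passenger car emits about 4.6 metric tons of CO2 per year (a widely used EPA estimate, not a figure from the article).

```python
# Translating the projected 24-44 Mt CO2/year range into the "millions of
# cars" comparison, assuming ~4.6 t CO2 per car per year (EPA estimate;
# this per-vehicle figure is an assumption, not from the article).

CAR_T_CO2_PER_YEAR = 4.6

for extra_mt in (24, 44):
    cars_millions = extra_mt * 1e6 / CAR_T_CO2_PER_YEAR / 1e6
    print(f"{extra_mt} Mt CO2/year ~ {cars_millions:.1f} million cars")
```

The range works out to roughly 5 to 10 million cars, matching the article's "millions of cars" framing.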
E-waste and supply chain pressures
AI hardware — high-performance GPUs and AI accelerators — often has a replacement cycle of just 2–3 years, far shorter than typical enterprise hardware. This rapid obsolescence contributes to growing electronic waste (e-waste), which is already a major global issue with tens of millions of tons generated annually. Without strong recycling programmes and supply-chain reforms, this waste threatens both environmental health and resource sustainability in producing and consuming regions alike.
Sustainable development
Addressing AI’s environmental footprint requires multi-layered action. One important lever is clean electricity sourcing. Tech companies such as Microsoft and Google have signed long-term renewable power purchase agreements to match rising energy demand with clean supply, and newer facilities such as Google’s Texas data centre are being designed to minimise water use and prioritise clean energy.
Cooling innovation can reduce both energy and water consumption. Closed-loop liquid cooling, air-cooled designs and heat reuse systems (such as channeling waste heat into district heating networks) can cut environmental costs compared to conventional evaporative chillers. Improved hardware efficiency and software optimisation — such as smaller model footprints, dynamic power scaling and better thermal management — are essential to reduce electricity per unit of computation.
In addition, siting new data centres in regions with abundant renewable energy and low water stress, along with careful grid and water planning, can reduce environmental impacts. Finally, robust policy frameworks — including sustainability reporting, water withdrawal limits, emissions standards and community consultation — are vital to ensure that AI growth does not undermine ecological resilience or public health.