Over the past few years, artificial intelligence has rapidly moved from tech curiosity to daily utility. AI tools now write emails, generate “art”, summarize meetings, and increasingly assist in everything from coding to scientific research. At the same time, a parallel concern has emerged: the environmental cost of all this intelligence on demand.
In response, recent headlines and corporate statements have downplayed these concerns, citing official data that shows per-query usage is small compared to everyday actions. But here’s the thing: while these numbers may be technically accurate, the framing often misleads. When we look beyond the metaphor and into the infrastructure, a far more complex story emerges.
Let’s take a closer look at some of the most common claims about AI’s environmental impact!
“The average ChatGPT query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes.”
That figure, first shared publicly by OpenAI CEO Sam Altman in June 2025, may well be accurate, and it is surprisingly low on its own. But using this number to dismiss concern about AI’s environmental impact misses the point entirely. The problem isn’t the number; it’s the scale and the framing.
ChatGPT currently handles over 1 billion queries per day. Multiplied by 0.34 watt-hours, that translates to 340 megawatt-hours daily, or about 124 gigawatt-hours per year. That’s more electricity than 11,000 U.S. households use in a day, and enough annual energy to power a city of 10,000 to 12,000 homes.
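The scaling above is simple enough to check directly. The sketch below reproduces the article's arithmetic, assuming Altman's 0.34 Wh/query figure, a load of one billion queries per day, and the EIA's average U.S. household consumption of roughly 10,500 kWh per year:

```python
# Back-of-envelope check of the energy figures in the text.
# Assumptions: 0.34 Wh/query (Altman, 2025), 1 billion queries/day,
# and ~10,500 kWh/year average U.S. household use (EIA).
WH_PER_QUERY = 0.34
QUERIES_PER_DAY = 1e9
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365   # ~28.8 kWh/day

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1e6             # 1 MWh = 1,000,000 Wh
annual_gwh = daily_mwh * 365 / 1_000   # 1 GWh = 1,000 MWh
households_per_day = (daily_mwh * 1_000) / HOUSEHOLD_KWH_PER_DAY

print(f"{daily_mwh:,.0f} MWh/day")                          # 340 MWh/day
print(f"{annual_gwh:,.1f} GWh/year")                        # 124.1 GWh/year
print(f"~{households_per_day:,.0f} households' daily use")
```

The result lands where the text does: about 340 MWh per day, roughly 124 GWh per year, equivalent to the daily electricity use of more than 11,000 U.S. households.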
These are not edge-case hypotheticals. This is daily operational load from just one application of one model by one company. And that’s before we account for additional computing behind moderation, routing, interface integration, and redundancy. It also excludes the energy required to train these models, a phase which can require thousands of megawatt-hours over weeks or months, and any energy used in hardware production, supply chains, or disposal.
More importantly, this infrastructure is growing rapidly. AI is no longer something users actively seek out. It’s being integrated, passively, automatically, into email apps, browsers, customer service tools, search engines, word processors, and operating systems. When AI becomes the default layer of digital interaction, usage expands far beyond the bounds of today’s “per query” figures.
“A ChatGPT query uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.”
This per-query estimate, also shared by Altman, may be accurate as far as official disclosures allow us to judge. In isolation, it is a small number: about 0.32 milliliters of water. But again, scale matters.
With usage exceeding one billion queries per day, ChatGPT’s operations could require more than 85,000 gallons of water daily, or over 31 million gallons annually. That’s enough to meet the personal and domestic water needs of more than 4,200 people for an entire year, based on the United Nations’ recognized standard of 50–100 liters per person per day.
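As with energy, the water arithmetic can be checked in a few lines. The sketch assumes Altman's 0.000085 gallons/query figure, one billion queries per day, and the midpoint (75 L/person/day) of the UN's 50–100 liter standard for personal and domestic water needs:

```python
# Back-of-envelope check of the water figures in the text.
# Assumptions: 0.000085 gal/query (Altman, 2025), 1 billion queries/day,
# and 75 L/person/day (midpoint of the UN's 50-100 L standard).
GALLONS_PER_QUERY = 0.000085
QUERIES_PER_DAY = 1e9
LITERS_PER_GALLON = 3.785
UN_LITERS_PER_PERSON_DAY = 75

daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY   # gallons/day
annual_gallons = daily_gallons * 365                  # gallons/year
annual_liters = annual_gallons * LITERS_PER_GALLON
people_served = annual_liters / (UN_LITERS_PER_PERSON_DAY * 365)

print(f"{daily_gallons:,.0f} gallons/day")                 # 85,000 gallons/day
print(f"{annual_gallons / 1e6:.1f} million gallons/year")  # 31.0 million/year
print(f"~{people_served:,.0f} people's annual domestic needs")
```

That works out to roughly 85,000 gallons per day and 31 million gallons per year, enough to cover the domestic needs of more than 4,200 people, consistent with the figures above.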
But even this scale-adjusted number comes with major caveats.
First, we don’t know how the number was calculated. It likely accounts only for water directly used during inference, while processing a prompt, and not for the continuous cooling and infrastructure demands required to keep the system running 24/7. AI models operate inside data centers that consume water and electricity around the clock, regardless of user activity. If that baseline consumption is excluded, the real per-query water footprint could be significantly higher.
Second, the question isn’t just how much water is used, but also how it’s used. Many data centers rely on evaporative cooling, which turns liquid water into vapor to dissipate heat. That water doesn’t return to water bodies. It enters the atmosphere and leaves the local watershed. While the water still exists somewhere in the global cycle, that fact is ecologically meaningless if it’s no longer available to the communities, ecosystems, or farms that relied on it. And even if that water eventually returns as rainfall, it’s rarely usable as-is. It must be captured, treated, stored, and distributed. In environmental terms, water is considered “consumed” when it is withdrawn and no longer available for local reuse. The fact that H₂O molecules still exist in the atmosphere doesn’t help a region already in drought, and that leads to one of the most critical and often-overlooked factors: location.
Where water is withdrawn matters. Many data centers source water from rivers, aquifers, or municipal systems in regions already facing drought, climate stress, or agricultural strain. And while some companies publish high-level sustainability reports, they rarely disclose site-specific water use, seasonal variations, or local ecological impacts. These disclosures are voluntary, inconsistent, and typically unaudited. Worse, there is no industry-wide standard requiring meaningful transparency.
So, the real issue isn’t whether a single AI query uses less water than a teaspoon. The more important questions are: Where is that water coming from? When is it being used? Is it returned, recycled, or lost entirely? And crucially: Who is measuring and reporting it and how? Until those questions are answered clearly and consistently, any claim about AI’s minimal water impact remains, at best, incomplete and oversimplified, and at worst, deeply misleading.
“Previous estimates were alarmist and based on wrong data.”
It’s true that early estimates of AI’s environmental footprint varied widely, and based on some of the official figures released since, they’re now often labeled imprecise.
Researchers at the time were working with little or no access to real operational data. Companies developing large AI models didn’t disclose detailed figures on energy use or water consumption. In that absence, scientists and independent analysts had to rely on indirect methods: combining hardware specs, training durations, and known data center configurations to construct working estimates.
These were approximations under conditions of non-transparency. Most of the published studies made their assumptions clear and reported wide uncertainty margins to reflect the gaps in access. Some overestimated impact. Some underestimated. But collectively, these efforts created the first public conversation about AI’s environmental costs, a conversation that corporate sustainability reports had largely avoided.
That public pressure worked. It helped push companies like OpenAI to release official per-query figures. The very fact that we now have concrete numbers is proof that the early estimates served their purpose.
But let’s not confuse partial disclosure with full transparency. The figures released so far are narrow in scope, self-reported, and unaudited. They typically cover only inference, not training, not infrastructure buildout, not lifecycle emissions. And without independent validation, it’s fair to ask: how much confidence can we place in these numbers?
Labeling earlier estimates as “alarmist” is misleading. Those estimates existed because the companies with the answers refused to share them. And in many important ways, they still haven’t.
Math Was Just the Start
So far, we’ve seen that while per-query figures for AI energy and water use are technically small, they become meaningful, and in some cases, alarming, when scaled to the real-world systems AI now powers. We’ve also looked at why past estimates weren’t bad studies, but necessary responses to a vacuum of transparency.
But that’s just the beginning! In Part 2, we’ll look at the claims about AI’s efficiency, compare it to other daily habits like driving or eating meat, and explore why “having the numbers” doesn’t mean the debate is over.
References & Resources
- AMAX Engineering. (2024, March 26). Comparing NVIDIA Blackwell Configurations. https://www.amax.com/comparing-nvidia-blackwell-configurations/
- Bolón-Canedo, V., Morán-Fernández, L., Cancela, B., & Alonso-Betanzos, A. (2024). A review of green artificial intelligence: Towards a more sustainable future. Neurocomputing, 599, 128096. https://doi.org/10.1016/j.neucom.2024.128096
- btarunr. (2025, July 17). Unwrapping the NVIDIA B200 and GB200 AI GPU Announcements. TechPowerUp. https://www.techpowerup.com/320542/unwrapping-the-nvidia-b200-and-gb200-ai-gpu-announcements
- European Parliament. (2023, August 6). EU AI Act: First regulation on artificial intelligence. https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence
- Garreffa, A. (2024, March 24). NVIDIA’s full-spec Blackwell B200 AI GPU uses 1200W of power, up from 700W on Hopper H100. TweakTown. https://www.tweaktown.com/news/97059/nvidias-full-spec-blackwell-b200-ai-gpu-uses-1200w-of-power-up-from-700w-on-hopper-h100/index.html
- IBM. (2024, June 18). What is AI Inference? https://www.ibm.com/think/topics/ai-inference
- Lange, S., Frick, V., Gossen, M., Pohl, J., Rohde, F., & Santarius, T. (2023). The induction effect: Why the rebound effect is only half the story of technology’s failure to achieve sustainability. Frontiers in Sustainability, 4. https://doi.org/10.3389/frsus.2023.1178089
- Massachusetts Institute of Technology. (2025, January 17). Explained: Generative AI’s environmental impact. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
- McHugh-Johnson, M. (2024, October 30). Ask a Techspert: What’s the difference between a CPU, GPU and TPU? https://blog.google/technology/ai/difference-cpu-gpu-tpu-trillium/
- Nicoletti, L., Ma, M., & Bass, D. (n.d.). How AI Demand Is Draining Local Water Supplies. https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/
- NVIDIA. (n.d.). NVIDIA H100 Tensor Core GPU. https://www.nvidia.com/en-us/data-center/h100/
- NVIDIA Newsroom. (n.d.). NVIDIA Blackwell Platform Arrives to Power a New Era of Computing. https://nvidianews.nvidia.com/news/nvidia-blackwell-platform-arrives-to-power-a-new-era-of-computing
- OpenAI. (n.d.). Latency optimization. https://platform.openai.com
- OpenAI, Achiam, J., Adler, S., Agarwal, S., Ahmad, L., Akkaya, I., Aleman, F. L., Almeida, D., Altenschmidt, J., Altman, S., Anadkat, S., Avila, R., Babuschkin, I., Balaji, S., Balcom, V., Baltescu, P., Bao, H., Bavarian, M., Belgum, J., … Zoph, B. (2024). GPT-4 Technical Report (No. arXiv:2303.08774). arXiv. https://doi.org/10.48550/arXiv.2303.08774
- Özsoy, T. (2024). The “energy rebound effect” within the framework of environmental sustainability. WIREs Energy and Environment, 13(2), e517. https://doi.org/10.1002/wene.517
- Pouikli, K., & Tsakalogianni, I. (2025). AI as an Environmental Challenge: Mapping Safeguards in EU Environmental and Climate Law to Address the ‘Silence’ in the EU AI Act. European Energy and Environmental Law Review, 34(2). https://kluwerlawonline.com/api/Product/CitationPDFURL?file=Journals\EELR\EELR2025003.pdf
- Ren, S., & Wierman, A. (n.d.). The Uneven Distribution of AI’s Environmental Impacts. Harvard Business Review. https://hbr.org/2024/07/the-uneven-distribution-of-ais-environmental-impacts
- Roose, K., & Newton, C. (2025, June 16). Everyone Is Using A.I. for Everything. Is That Bad? The New York Times. https://www.nytimes.com/2025/06/16/magazine/using-ai-hard-fork.html
- Altman, S. (n.d.). The Gentle Singularity. https://blog.samaltman.com/the-gentle-singularity/
- Statista. (n.d.). Nvidia’s GPU energy consumption 2024. https://www.statista.com/statistics/1446532/energy-consumption-nvidia-microchip/
- United Nations. (n.d.). Water. United Nations; United Nations. https://www.un.org/en/global-issues/water
- U.S. Energy Information Administration (EIA). (n.d.). Frequently Asked Questions (FAQs). https://www.eia.gov/tools/faqs/faq.php?id=97&t=3
- Venditti, B. (2025, June 6). ChatGPT Lags Far Behind Google in Daily Search Volume. Visual Capitalist. https://www.visualcapitalist.com/chatgpt-lags-far-behind-google-in-daily-search-volume/
- Walton, J. (n.d.). Nvidia’s next-gen AI GPU is 4X faster than Hopper: Blackwell B200 GPU delivers up to 20 petaflops of compute and other massive improvements. Tom’s Hardware. https://www.tomshardware.com/pc-components/gpus/nvidias-next-gen-ai-gpu-revealed-blackwell-b200-gpu-delivers-up-to-20-petaflops-of-compute-and-massive-improvements-over-hopper-h100