The Curious Companion: ChatGPT and the Environment

Curious Reader!

Welcome to this week’s ChatGPT Curious companion newsletter.

What you came for is below, and you can CLICK HERE to listen to the episode if you decide you prefer earbuds to eyeballs.

Happy reading!

Quick heads up: This first Curious Companion email is a little different. I’m launching the podcast with three full episodes, and rather than cram everything into a single email, I’ve featured the full breakdown only for Episode 2 here, largely because y’all told me it’s the episode you’re most curious about.

If you want to check out Episodes 1 and 3 (and their write-ups), I’ve got those linked for you below. Future emails will stick to the usual one-episode format, but for launch week… we going big.

👉🏾 Episode 1: What You Actually Need to Know About ChatGPT
👉🏾 Episode 3: Is ChatGPT Killing Creativity?

I very intentionally made this topic episode number two, as environmental concerns around AI get a lot of press, and understandably so. But are these headlines more of a distraction than anything else?

After about a week of researching (y’all, I was in the computer), here’s where I landed: Using ChatGPT isn’t an environmental free pass, but neither is it the moral failing that uninformed (or misinformed) folks make it out to be. To borrow one from Ramit’s book, if we truly care about the environment and want to protect it, let’s ask the $30,000 questions, not the $3 ones.

Of note, this episode has A LOT of references, and they can be found here.

Let’s Talk Energy

We’re starting here because it gets the most attention. The episode focuses heavily on energy, with water and carbon covered later. Why? Because energy sets the tone, and the other two tend to follow it. Disclaimer: I’m gonna ask you to put on your engineer hat for this episode and bear with me as we discuss units of energy, specifically Wh (watt-hours), but I promise to keep the math simple.

Lastly, for any of you who have done your own research, it’s worth noting that in June of this year, Sam Altman (CEO of OpenAI) published a blog post with specific figures for ChatGPT’s energy and water use. I didn’t base this episode solely on those numbers because of the very obvious conflict of interest #duh. It’s in his best interest to downplay impact.

So… how much energy does a ChatGPT query actually use?

The most common stat cited is 0.3 Wh per query (that’s for the input and the response). But how that number came to be, and whether it’s trustworthy, deserves a closer look.

  • Most reporting on this issue traces back to an article by Alex de Vries, published in Joule (Nov 2023).
  • That article leaned on a Reuters piece from February 2023, quoting Alphabet’s chairman (Alphabet is the parent company of Google. Conflict of interest much?) saying LLMs use 10x more energy than a standard Google search.
  • This 10x figure was based on an estimate published in…wait for it…a 2009 Google blog post stating that a standard Google search used ~0.3 Wh.
  • To Alex’s credit, he did look to back this 10x claim up by referencing research from SemiAnalysis, an independent research group.

Ignoring any conflicts of interest or outdated numbers, if we consider all of this information to be accurate, then a ChatGPT query would use 3 Wh…but here’s where it breaks down:

The 10x claim from SemiAnalysis was based on 2,000-token outputs. The problem here is that the average ChatGPT response is closer to 200 tokens, not 2,000. (Psst, if you don’t remember what tokens are, refer back to Episode 1.)

To put this in normal speak, 2,000 tokens = ~3 pages of single-spaced text. The average ChatGPT response? Closer to 200 tokens = ~150 words = a solid single-spaced paragraph = about half a page double-spaced.

After all that math, here’s the thing: if we reduce the average output size by a factor of 10, and account for Google’s likely gains in energy efficiency since 2009 (it’s very reasonable to say it has improved by 10x), the numbers mostly cancel each other out. Which lands us right back at ~0.3 Wh per query, a number we long accepted for Google without batting an eye! Not to mention that ChatGPT is objectively doing more per query for the user (providing an answer as opposed to a list of links).
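If you want to see that cancellation spelled out, here’s a quick back-of-envelope sketch. The 0.3 Wh Google baseline, the 10x multiplier, and the 2,000 vs. 200 token figures all come from the sources above; the rest is just arithmetic, not a measurement of anything.

```python
# Back-of-envelope sketch of the "it cancels out" argument.
# All inputs are rough estimates taken from the sources cited above.

google_search_2009_wh = 0.3    # Google's 2009 blog post: ~0.3 Wh per search
llm_multiplier = 10            # "LLMs use 10x a Google search" (Reuters / SemiAnalysis)
assumed_output_tokens = 2000   # what the 10x estimate assumed per response
typical_output_tokens = 200    # what an average ChatGPT response actually looks like

naive_estimate_wh = google_search_2009_wh * llm_multiplier   # 3.0 Wh, the headline number
adjusted_wh = naive_estimate_wh * (typical_output_tokens / assumed_output_tokens)

print(naive_estimate_wh)   # 3.0 Wh
print(adjusted_wh)         # 0.3 Wh -- right back where we started
```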

Worth noting: Google now uses AI in its search responses, AND there are different models of ChatGPT that require different amounts of energy. All that to say, we need more transparency if we want truly accurate numbers.

So what does 0.3 Wh look like in real life?

Let’s say you’re a high-volume ChatGPT user: 10 sessions a day, 10 prompts per session = 100 interactions/queries. (Of note, the average user has about 8-14 interactions/queries per day.)

  • 100 queries x 0.3 Wh = 30 Wh/day
    • That’s equivalent to:
      • Watching a streaming show, aka Netflix and chill, for 16 minutes
      • Driving an electric vehicle ⅛ of a mile
      • Driving a regular car (20-30mpg) about 1/3 of a football field
      • Driving my Jeep that I love so much but that only gets 12mpg? Maybe 50 feet
      • Heating your house with forced air for 21 seconds

Let’s change the numbers and give ourselves some wiggle room, given the lack of transparency and the fact that we don’t really know with 100% certainty how much energy a single ChatGPT query uses. Let’s multiply the per-query cost by 10 and calculate usage based on 3 Wh/query.

  • 3 Wh/query x 100 queries = 300 Wh/day
    • That’s:
      • Netflix and chill for 2 hours and 40 minutes
      • Driving an EV 1.2 miles
      • Driving a regular car just over 3 football fields
      • Driving my Jeep 500 feet (a little over 1 ½ football fields)
      • Heating a home for 3 minutes and 36 seconds

Allow me to channel my inner Missy Elliott for a second and put my thing down, flip it, and reverse it:

Let’s compare the energy cost of an entire day of ChatGPT usage for a high-volume user (30-300 Wh based on the above scenarios) with the energy cost of just a single hour of running common household appliances (there’s a quick sketch after this list if you want to run the numbers yourself):

  • Energy cost for 1 hour of:
    • Heating your home with forced air: 5000 Wh
    • Heating your home with a heat pump: 1000 Wh
    • Running the dishwasher: 1800 Wh
    • Running the clothes dryer: 3000 Wh
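Here’s that sketch, using the same ballpark figures from this section (per-query energy, queries per day, and appliance wattages are all rough estimates, not measurements):

```python
# Rough daily-energy comparison using the estimates from this section.
# All figures are ballpark assumptions, not measured values.

queries_per_day = 100       # high-volume user from the scenario above
low_estimate_wh = 0.3       # commonly cited per-query figure
high_estimate_wh = 3.0      # the 10x wiggle-room figure

chatgpt_low = queries_per_day * low_estimate_wh     # 30 Wh/day
chatgpt_high = queries_per_day * high_estimate_wh   # 300 Wh/day

appliance_wh_per_hour = {
    "forced-air heating": 5000,
    "heat pump": 1000,
    "dishwasher": 1800,
    "clothes dryer": 3000,
}

for name, wh in appliance_wh_per_hour.items():
    # How many full days of heavy ChatGPT use fit into one hour of the appliance?
    print(f"1 hour of {name} ≈ {wh / chatgpt_high:.0f}–{wh / chatgpt_low:.0f} days of heavy ChatGPT use")
```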

I don’t want this to be a whataboutism episode or imply that ChatGPT has no impact. The point here is that we can be mindful of ChatGPT usage AND, if we are truly concerned about the environment, we can and should look to cut back on or change the things that will have a much more significant impact.

At Scale? Sure, It Adds Up, But…

ChatGPT now handles an estimated 2.5 billion prompts per day. And the bulk of energy use in AI is shifting toward inference (usage), not just training.

Still, ChatGPT is not the biggest AI energy hog in these data centers you hear so much about:

  • 80% of AI energy today goes to recommender systems! Think: Netflix suggestions, Instagram ads, YouTube feeds, search ranking
  • Other major uses of AI energy: computer vision + speech (content moderation, auto captions, real-time translation), autonomous vehicle training (Waymo), fraud scoring, search ranking
  • Other things “on the shelf”: Streaming video libraries, app data, object storage (videos, podcasts, photos, backups, software updates)

AI’s overall environmental footprint is absolutely growing. ChatGPT gets all the smoke because it was the one that kicked off the AI arms race and showed that AI could be useful, impressive, and marketable.

Now we have Microsoft, Google, Meta, and Amazon reprioritizing AI at EVERY level.

Again, we need more transparency from these companies.

What About Water?

So I went into the weeds quite a bit for energy usage because it gets all the press, but what I found when digging into water usage for ChatGPT largely mirrored that same trend:

  • The majority of us are blissfully unaware of how much energy and water we actually use on a daily basis and in our day-to-day tasks
  • We need more transparency from these companies

An important concept to understand about a water footprint is its three scopes:

  • Scope 1: On-site server and facility cooling
  • Scope 2: Thermoelectric power generation (off-site cooling during electricity production)
  • Scope 3: Hardware manufacturing and transport

85% of water use is typically off-site (Scope 2), and it’s much harder to estimate and often under-reported or simply not reported at all.

A 2025 article, “Making AI Less Thirsty,” reported data showing it took anywhere from 10-50 requests for GPT-3 to consume a 500 ml bottle of water (an average of 33 requests).

Now I’m absolutely not saying this is nothing, but 1 bottle of water per 33 requests seems massive because we never think of anything else in terms of water bottles consumed.

Most users? 8–14 prompts/day. That’s less than half a bottle per day. High-volume users averaging 100 queries per day? That’s about 3 bottles a day.

That feels high, but again, at the risk of this becoming a whataboutism episode, here’s what else happens daily (per person, US averages; see the quick sketch after this list if you want to check the math):

  • Electricity usage = 11.4 kWh/day
    • Requires ~2 gallons of water per kWh
    • That’s 22.8 gallons/day = 173 water bottles
  • “At the faucet” water usage/person = 80 gallons/day
    • Showers, toilet, teeth brushing = 606 bottles
  • Food production (esp. beef) = 1800–2200 gallons/day
    • That’s ~15,000 bottles of water per person per day
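Same idea as the energy sketch: the 33-requests-per-bottle figure comes from the study above, and the gallon figures are the rough US per-person averages just cited, so treat the outputs as ballpark conversions, nothing more.

```python
# Ballpark water comparison using the figures cited above.

BOTTLE_ML = 500        # one standard water bottle
GALLON_ML = 3785.4     # milliliters per US gallon

requests_per_bottle = 33    # "Making AI Less Thirsty" average for GPT-3

def bottles_for_queries(queries_per_day):
    return queries_per_day / requests_per_bottle

print(bottles_for_queries(11))    # ~0.3 bottles/day for a typical user (8-14 prompts)
print(bottles_for_queries(100))   # ~3 bottles/day for a heavy user

def gallons_to_bottles(gallons):
    return gallons * GALLON_ML / BOTTLE_ML

print(gallons_to_bottles(22.8))   # ~173 bottles -- water behind your daily electricity
print(gallons_to_bottles(80))     # ~606 bottles -- "at the faucet" use per person
print(gallons_to_bottles(2000))   # ~15,000 bottles -- food production per person
```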

Perspective matters!!

Again, I don’t want this to become a whataboutism episode, but mainstream articles are out there saying that by 2028, AI in the US could require as much as 720 billion gallons of water annually just to cool AI servers. Yes, that’s a lot of water, BUT they fail to mention that irrigation uses 27 trillion gallons per year (that’s 75.7 billion gallons per DAY), and thermoelectric power uses 1 trillion gallons per year.

So…Is Carbon a Concern?

Yes. But the logic is the same.

Carbon footprint =

  1. How much electricity is used
  2. How that electricity is produced

AI’s carbon impact depends entirely on the source of the electricity. That’s the variable that matters.

If ChatGPT were powered by renewables, the impact would drop dramatically. And while usage is increasing, so is model efficiency.
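To make that concrete, carbon impact is basically electricity used times the carbon intensity of the grid that supplied it. Here’s a minimal sketch; the grid-intensity values are illustrative ballpark assumptions, not sourced figures.

```python
# Minimal sketch: carbon footprint = electricity used x carbon intensity of the grid.
# Grid intensities below are rough, illustrative values (g CO2 per kWh), not sourced figures.

daily_chatgpt_kwh = 0.03    # 100 queries x 0.3 Wh, from the energy section above

grid_intensity_g_per_kwh = {
    "coal-heavy grid": 1000,     # rough ballpark
    "average US grid": 400,      # rough ballpark
    "mostly renewables": 30,     # rough ballpark
}

for grid, intensity in grid_intensity_g_per_kwh.items():
    grams_co2 = daily_chatgpt_kwh * intensity
    print(f"{grid}: ~{grams_co2:.0f} g CO2 per day of heavy ChatGPT use")
```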

The bigger conversation here isn’t “is ChatGPT bad?”, it’s “what’s powering our tools?” Spoiler: coal ain’t the way!

Big Takeaways

  • We need better studies and more transparency.
    Academic papers, mainstream media, and Substack posts all tend to cite each other and lean on largely hypothesized data, making it hard to get a clear or current picture.
  • Clickbait headlines overstate the impact.
    If it bleeds, it leads, but again, we need more transparency, especially as models, infrastructure, and hardware continue to change.
  • ChatGPT’s footprint is relatively minor.
    Its energy and water use pale in comparison to what’s already baked into our daily lives.
  • AI’s environmental impact isn’t about the tech, it’s about the power.
    What matters most is the energy source behind the models. Renewable energy drastically reduces impact.
  • Efficiency is improving, but usage is scaling.
    Which means the biggest dial mover is still how these systems are powered.
  • Want to reduce environmental impact?
    Focus on the big dial movers: clean energy, a lower-impact diet, and efficient transport.
  • Also: ChatGPT is useful.
    It objectively does more than a Google search.
  • Agentic models are a growing concern.
    AI using AI to complete tasks means significantly more compute, and it’s something to keep an eye on. (Yes, I will do a future episode on this.)

This isn’t a permission slip to go wild. But it’s also not a reason to disengage. If you’re worried about the environment:

  • Use ChatGPT mindfully (fewer, more intentional prompts)
  • Choose energy-efficient ChatGPT models when you can
  • Take real-world action: walk, get an e-bike, use LEDs, turn off the lights, take shorter showers, turn off the AC, less Netflix in the background, advocate, vote, look at your energy bill, change your diet, follow and learn from an environmentalist you like – PICK ONE!

If we want the rewards and the benefits, we have to lean into responsibility and participate in shaping things.

Curiosity and informed action beat performative guilt every time.

How I Used ChatGPT This Week

Each episode I include a section where I briefly discuss how I used ChatGPT that day/week.

This time I used ChatGPT extensively to help me gather sources, run numbers, and pressure-test claims for this episode. Of note, I did use the more power-hungry o3 model, as it was a better fit for helping me work through ideas as I attempted to become an engineer and an environmentalist.

This was another dense episode. Big thank you for sticking with it.

Questions, comments, concerns, additions, subtractions, requests? Hit reply or head to the website (chatgptcurious.com) and use that contact form. I’d love to hear from you.

Catch you next Thursday.

Maestro out.

Feeling curious AND generous? Click here to support the podcast.

AI Disclaimer: In the spirit of transparency (if only we could get that from these tech companies), this email was generated with a very solid alley-oop from ChatGPT. I write super detailed outlines for every podcast episode (proof here), and then use ChatGPT to turn those into succinct, readable recaps that I lightly edit to produce these Curious Companions. Could I “write” it all by hand? Sure. Do I want to? Absolutely not. So instead, I let the robot do the work, so I can focus on the stuff that I actually enjoy doing and you get the content delivered to your digital doorstep, no AirPods required. High fives all around.

Did someone forward you this email?
Click here to join the ChatGPT Curious newsletter family.

Stay curious.
