The Environmental Price of AI
Vocab level: B2
The data centers answering your AI questions are pulling a whole lot of electricity from local grids
and a ton of water to stay cool.
And the computers inside are made from rare earth metals that have to be mined out of the ground.
There's limited data out there, but some big AI companies have given us at least some indication of the resources their AI systems use.
In its latest environmental report, Google said the electricity consumed by its data centers grew 27% in 2024 compared to the year prior,
although it said emissions were falling thanks to investments in clean energy and making its technology more efficient.
OpenAI CEO Sam Altman has written that one ChatGPT query uses just about 0.34 watt-hours,
about what a high-efficiency light bulb uses in a couple of minutes.
But OpenAI also says that more than 2.5 billion messages are sent to ChatGPT daily.
So if you do the math, every day ChatGPT is using enough energy to keep that light bulb running for more than 9,500 years.
Google estimates the average text question to its chatbot Gemini uses slightly less, about 0.24 watt-hours.
Part of the challenge is that AI systems demand a lot more resources than earlier computing models.