The Environmental Toll of a Single ChatGPT Query Is Absolutely Wild

Aqua Cola

Just how many resources get eaten up when you ask OpenAI’s ChatGPT to write a simple 100-word email? The answer may alarm you: roughly the equivalent of a full bottle of water and enough electricity to light 14 LED bulbs for an hour, according to The Washington Post’s consultation with UC Riverside researcher Shaolei Ren. That’s an appreciable environmental toll on its own, but a staggering one once you multiply it across the number of users worldwide.

Say one out of every ten working Americans used ChatGPT just once a week to write an email. By Ren’s estimate, over a single year ChatGPT would guzzle 435 million liters of water and burn through 121,517 megawatt-hours of electricity. That’s as much water as every household in Rhode Island drinks in a day and a half, and enough electricity to light every household in Washington, DC for 20 days (a rough back-of-envelope check appears at the end of this section).

And that’s just today’s usage. With big tech so confident in the explosive potential of AI that Microsoft is looking to bring an entire nuclear plant back online to fuel its AI data centers, those figures could come to look laughably low.

Thirst Traps

ChatGPT consumes so much water because AI data centers give off enormous amounts of heat when running calculations. Cooling those facilities requires a tremendous amount of water to carry the heat away from the servers.
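For the curious, the yearly totals above roughly check out with a quick back-of-envelope calculation. The sketch below is not Ren’s actual methodology, just an assumed reconstruction from the figures in this article: it takes about half a liter of water and 0.14 kilowatt-hours of electricity per email (14 LED bulbs at an assumed 10 watts each for one hour), an assumed US workforce of about 167 million, and one in ten of those workers sending a single AI-written email per week.

```python
# Back-of-envelope reproduction of the scaling estimate (assumed inputs,
# not Ren's published methodology).
WATER_PER_EMAIL_L = 0.5           # liters per 100-word email ("a bottle of water", assumed)
ENERGY_PER_EMAIL_KWH = 0.14       # 14 LED bulbs x ~10 W x 1 hour (assumed bulb wattage)
WEEKLY_USERS = 167_000_000 // 10  # one in ten of ~167M working Americans (assumed workforce size)
WEEKS_PER_YEAR = 52

emails_per_year = WEEKLY_USERS * WEEKS_PER_YEAR
water_liters = emails_per_year * WATER_PER_EMAIL_L
energy_mwh = emails_per_year * ENERGY_PER_EMAIL_KWH / 1_000

print(f"{water_liters / 1e6:.0f} million liters of water per year")  # ~434 million liters
print(f"{energy_mwh:,.0f} MWh of electricity per year")              # ~121,576 MWh
```

Run with those assumptions, the sketch lands within about one percent of the figures cited above, which suggests the reported totals are simply the per-email costs multiplied out over a year of weekly use.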
