AI's Electricity Use Is Spiking So Fast It'll Soon Use as Much Power as an Entire Country

All That Power

AI chatbots like OpenAI’s ChatGPT and Google’s Bard consume an astronomical amount of electricity and water — or, more precisely, the massive data centers that power them do. And according to the latest estimates, those energy demands are rapidly ballooning to epic proportions.

In a recent analysis published in the journal Joule, data scientist Alex de Vries at Vrije Universiteit Amsterdam in the Netherlands found that by 2027, these server farms could use anywhere from 85 to 134 terawatt-hours of energy per year. That’s roughly on par with the annual electricity use of Argentina, the Netherlands, or Sweden, as the New York Times points out, or 0.5 percent of the entire globe’s energy demands.

Sound familiar? The much-lampooned crypto industry spiked past similar power consumption thresholds in recent years.

It’s a massive carbon footprint that experts say should force us to reconsider the huge investments being made in the AI space — not to mention the eye-wateringly resource-intensive way that tech giants like OpenAI and Google operate.

We Hunger

Coming to an exact figure is difficult, since AI companies like OpenAI are secretive about their energy usage. De Vries settled on estimating their consumption by examining the sales of Nvidia A100 servers, which make up an estimated 95 percent of the AI industry’s underlying infrastructure.

“Each of these Nvidia servers, they are power-hungry beasts,” de Vries told the NYT.

It’s a worrying trend that’s leading some experts to argue that we should take a step back and reevaluate the…
