ChatGPT reportedly consumes approximately 3 watt-hours of electricity to respond to a single query, roughly 10 times the energy of an average Google search. A new report by Epoch AI challenges these figures as an overestimate, indicating that the OpenAI chatbot uses less power than previously assumed.
According to the report, ChatGPT running GPT-4o consumes only about 0.3 watt-hours per query. Speaking to TechCrunch, Joshua You, a data analyst at Epoch AI, said:
“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car.”
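To put the competing estimates in perspective, here is a hedged back-of-envelope sketch. Only the 0.3 Wh and 3 Wh per-query figures come from the article; the query count and appliance wattage are illustrative assumptions.

```python
# Back-of-envelope comparison of ChatGPT per-query energy estimates.
# Only the two per-query figures come from the reporting; everything
# else (queries per day, LED bulb wattage) is assumed for illustration.

WH_PER_QUERY_EPOCH = 0.3  # Epoch AI's GPT-4o estimate (Wh per query)
WH_PER_QUERY_OLD = 3.0    # widely cited older estimate (Wh per query)

queries_per_day = 15      # hypothetical heavy user
daily_wh = queries_per_day * WH_PER_QUERY_EPOCH

# A typical LED bulb draws about 10 W, so one hour of light is ~10 Wh.
led_bulb_wh_per_hour = 10
minutes_of_led_light = daily_wh / led_bulb_wh_per_hour * 60

print(f"Daily ChatGPT energy: {daily_wh:.1f} Wh")                 # 4.5 Wh
print(f"Equivalent LED-bulb runtime: {minutes_of_led_light:.0f} min")  # 27 min
print(f"Older estimate is {WH_PER_QUERY_OLD / WH_PER_QUERY_EPOCH:.0f}x higher")  # 10x
```

Under these assumptions, even a heavy user's daily chatbot energy is comparable to running a single LED bulb for under half an hour, which is the scale of comparison You is making.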
The data analyst said his research into ChatGPT’s power consumption was prompted by dated research and inflated estimates. The supposedly “universal” figures, he noted, rested on the assumption that OpenAI was running its AI models on older, less efficient chips.
According to You:
“Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”
That said, Epoch AI’s estimate isn’t cast in stone either: it’s also an approximation, and it excludes key AI capabilities such as the chatbot’s image generation.
ChatGPT will get power-hungry as OpenAI leans more on reasoning models
Perhaps more interestingly, the data analyst indicated that he does expect ChatGPT’s power consumption to rise: as the models become more advanced, they’ll require more energy.
This is especially true as top AI labs, including OpenAI, lean more heavily on reasoning models, which spend extra compute “thinking” through the hardest problems and, in turn, consume more energy.
As generative AI rapidly advances and gains broad adoption, it’s becoming more apparent that it demands an exorbitant amount of electricity, money, and water to run.
Over the past few years, multiple reports have indicated that Microsoft Copilot and ChatGPT consume roughly a bottle of water for cooling per generated response. This follows an earlier report indicating that Microsoft and Google’s combined power consumption surpasses the electricity usage of over 100 countries.
More recently, a separate report detailed that OpenAI’s GPT-3 model consumes four times more water than previously thought, while GPT-4 uses up to three bottles of water to generate a mere 100 words. AI models seemingly become more power- and resource-hungry as they grow more advanced. As it now stands, however, ChatGPT might not be as power-hungry as previously thought.