Generative AI could soon consume enough energy to power a small country


What you need to know

  • A new study projects that by 2027, generative AI could consume as much electricity in a year as a small country, roughly 85-134 terawatt-hours (TWh) annually (see the rough calculation after this list).
  • There’s also rising concern about the large volumes of water used to cool the data centers that run AI-powered chatbots like ChatGPT and Bing Chat each time they answer queries.
  • However, this projection assumes several factors hold steady, such as sustained interest in AI and the continued availability of AI chips.
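To put the projected range in perspective, here is a minimal back-of-the-envelope sketch that converts an annual energy total into an average continuous power draw. The country comparison figure (roughly 110 TWh per year for the Netherlands) is our own approximation for illustration, not a number taken from the study.

```python
# Rough sanity check of the 85-134 TWh/year projection.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def average_power_gw(annual_twh: float) -> float:
    """Convert an annual energy total (TWh) into average continuous power (GW)."""
    return annual_twh * 1000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh

low, high = 85, 134  # TWh per year, the study's projected range for 2027
print(f"Average draw: {average_power_gw(low):.1f}-{average_power_gw(high):.1f} GW")
# -> roughly 9.7-15.3 GW running around the clock

# Assumed annual electricity consumption of a small country (Netherlands, approx.)
netherlands_twh = 110
print(f"Share of that country's demand: {low / netherlands_twh:.0%}-{high / netherlands_twh:.0%}")
```

Run as written, the sketch shows the projected range works out to roughly 10-15 GW of continuous power, on the order of an entire small country's electricity demand.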

It’s no secret that many organizations and companies have plunged into generative AI since it burst onto the scene last year. Undoubtedly, the technology has unlocked remarkable feats and untapped opportunities. For instance, it’s now easier for students to solve complex math problems, and it has helped accelerate medical advancements, among other things.

But all these advances come at a cost, and an expensive one. We already know that OpenAI spends up to $700,000 a day to run its AI-powered chatbot, ChatGPT. Running the chatbot has become such an expensive venture that the company is reportedly on the verge of bankruptcy, amid user complaints that the chatbot is getting dumber.
