A cutting-edge prototype chip could address AI’s high power consumption issue


What you need to know

  • Researchers have developed a prototype chip, dubbed computational random-access memory (CRAM), that could cut AI's power-hungry demands by a factor of more than 1,000.
  • In some scenarios, the chip could achieve energy savings of up to 2,500 times compared with traditional methods.
  • CRAM could help ease Microsoft's AI energy woes, as the company's power usage now surpasses that of more than 100 countries.

Generative AI is a resource-hungry technology. While it has been leveraged to achieve impressive feats across medicine, education, computing, and more, its power demands are alarmingly high. According to a recent report, Microsoft's and Google's electricity consumption surpasses that of more than 100 countries.

The high power demand is holding the technology back from realizing its full potential. Even billionaire Elon Musk has said we might be on the precipice of the most significant technological breakthrough with AI, but that there won't be enough electricity to power its advances by 2025.





