What you need to know
- A new report claims ChatGPT costs an estimated $700,000 a day to run.
- Microsoft has recently been rumored to be developing its own AI chips.
- Microsoft’s own chip could reduce costs by a third or more.
AI is all the rage right now, with millions of users on ChatGPT daily. With demand that high, it's no surprise that ChatGPT costs a substantial amount of money to run. Dylan Patel, chief analyst at research firm SemiAnalysis, estimates that ChatGPT currently costs OpenAI $700,000 per day, or 36 cents per query.
That’s a lot of money, and it doesn’t help that AI companies, including Microsoft, must buy GPUs in bulk from the likes of NVIDIA to keep their AI services running as demand continues to grow. Industry analysts estimate that OpenAI may need to order 30,000 additional NVIDIA GPUs this year to support its commercial efforts.
Building your own AI chip can help cut costs, and that looks to be exactly what Microsoft is doing. Microsoft’s chip, codenamed Athena, is reportedly far enough along that select groups inside the company are already testing it. Microsoft plans to roll the chip out across its Azure AI services next year.
Industry insiders say they don’t expect Microsoft to replace NVIDIA outright, but rather to lessen its dependence on the GPU maker. Since ChatGPT runs on Azure, lower costs would help OpenAI run ChatGPT more cheaply, along with any other AI services that choose Azure as a host.
Microsoft has been on an aggressive AI push over the past few months, after Google fumbled the launch of its own AI product, Bard. Just recently, reports indicated that some Google employees have no confidence in Bard, feeling that Google rushed it to market in response to Microsoft’s Bing AI efforts.