Earlier this year, OpenAI unveiled its shiny new AI-powered image generator built into GPT-4o. Compared to predecessors like DALL-E 3, the new model generates photorealistic images with a remarkably high level of accuracy. Hey, it can even generate a glass of wine filled to the brim.
But perhaps the biggest takeaway (at least for most users) was the Studio Ghibli meme trend. While the viral trend raised copyright concerns, everyone was quick to hop onto the bandwagon, driven by FOMO, and transformed their photos into Ghibli-style animations.
However, as we’ve since learned, the obsession with Ghibli memes doesn’t come cheap. OpenAI CEO Sam Altman took to his X (formerly Twitter) account to announce a delayed rollout of the new image generator to free users.
The executive attributed the troubles to a shortage of GPUs. “Our GPUs are melting,” added Altman. As a result, the ChatGPT maker temporarily introduced rate limits while it worked on enhancing efficiency.
Despite these restrictions, OpenAI’s ChatGPT gained over one million new users in under an hour, predominantly due to the “biblical demand” for GPT-4o’s Ghibli memes.
While the measures OpenAI took to keep everything running smoothly weren’t entirely unreasonable, CEO Sam Altman recently revealed that the company had to “do a lot of unnatural things” (via vitrupo on X).
Sam Altman says the Ghibli moment forced OpenAI to “do a lot of unnatural things.” That surge exposed the limits and the need for Stargate to exist. “More compute means we can give you more AI.” pic.twitter.com/TisSzXY8tU (May 20, 2025)
According to Sam Altman:
“I’ve seen viral moments, but I’ve never seen anyone have to deal with an influx of usage like this. It was unprecedented, wild. And also, making an image is not exactly a low-compute task, the way we do it with the new image gen.”
The executive indicated that OpenAI was forced to do a lot of unnatural things, including borrowing compute capacity from research and slowing down some features. He attributed the issue to GPU shortages.
He added that if the company had enough resources, it would be able to handle such surges in demand for its services. “More compute means we can give you more AI,” Altman concluded.