ChatGPT’s Advanced Voice Mode Is Here for a Select Few
It’s been nearly three months since OpenAI rocked the world with its fast and flirtatious ChatGPT Advanced Voice Mode demonstration. Now, after plenty of setbacks, Advanced Voice Mode is rolling out in Alpha for a select group of ChatGPT Plus users.
Advanced Voice Mode is kind of like a next-generation Siri or Alexa. You talk to your phone and it talks back. It’s a concept that we’re all familiar with by now. Still, Advanced Voice Mode absolutely dominated the GPT-4o launch event on May 13th. Viewers were shocked, not only by the quality and speed of the AI’s responses, but by the nuance and emotion of its voice.

I rarely see conversations about Advanced Voice Mode’s practical capabilities. Yes, it can answer questions and look through your camera, but most people don’t seem to care. They’re enraptured by the AI’s human-like voice, which shifted between playful, serious, and grossly flirtatious tones throughout OpenAI’s many demonstration videos.

OpenAI clearly knew that a human-like voice would capture the public’s imagination. It intentionally tried to draw comparisons between Advanced Voice Mode and Scarlett Johansson’s AI character from Her, a movie where a man falls in love with an artificially intelligent software service.
The hype for Advanced Voice Mode has died down a bit. OpenAI delayed the product beyond its June launch date as it worked to build more robust server infrastructure and resolve lingering safety problems. The company’s shameless attempt to draw pop culture comparisons may have also contributed to this delay, as OpenAI had to remove its flagship “Sky” voice after receiving legal notice from Scarlett Johansson. The actor had repeatedly refused to license her voice to OpenAI, yet “Sky” sounded just like her. OpenAI says that “Sky” wasn’t intended to sound like Johansson.
The human-like tone of Advanced Voice Mode will still be a topic of conversation during this Alpha test. But the novelty and hype have been diminished by a nearly three-month wait. Those who have a chance to test the service will be more inclined to judge Advanced Voice Mode by its practical merits, which is arguably a good thing.

Now’s the time to mention that ChatGPT cannot mimic voices. It’s not a deepfake tool, and the four voices included in the Alpha test are all based on voice actors who agreed to provide their vocal likeness.

Select ChatGPT Plus users will see an Advanced Voice Mode notification in the ChatGPT mobile app. These users will also receive an email explaining how to use the feature. A wider rollout will come in late 2024.

Source: OpenAI via TechCrunch