OpenAI fears humans will become 'emotionally reliant' on ChatGPT's human voice


OpenAI, the maker of ChatGPT, has revealed concerns users may develop emotional dependency on the chatbot’s forthcoming voice mode.

The ChatGPT-4o voice mode is currently being analysed for safety ahead of a rollout to the community. It enables users to converse naturally with the assistant as if it were a real person.

With that comes the risk of emotional reliance, and “increasingly miscalibrated trust” of an AI model that would be exacerbated by interactions with an uncannily human-like voice that can take account of the user’s emotions through tone of voice.


The findings of the safety review (via Wired), published this week, expressed concern over users employing language that suggested shared bonds between the human and the AI.

“While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time,” the review reads. It also says the dependence on the AI might affect relationships with other humans.

“Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships. Extended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the document adds.

Furthermore, the review pointed out the possibility of over-reliance and dependence.

“The ability to complete tasks for the user, while also storing and ‘remembering’ key details and using those in the conversation, creates both a compelling product experience and the potential for over-reliance and dependence.”

The team said there'll be further study on the potential for emotional reliance on the voice-based version of ChatGPT. The feature drew mainstream attention earlier this summer due to the voice's startling resemblance to the actor Scarlett Johansson. The actor, who played an AI that its user falls in love with in the film Her, had declined the offer to voice OpenAI's assistant.

However, the final voice sounded suspiciously like her anyway, despite CEO Sam Altman's insistence that it wasn't cloned.
