ChatGPT keeps hallucinating—and that’s bad for your privacy



About a year after a temporary ban in Italy triggered a spike in VPN service downloads, OpenAI once again faces trouble in the European Union. The culprit this time? ChatGPT's hallucination problem.

The popular AI chatbot is infamous for making up false information about individuals, something that OpenAI has admitted it is unable to fix or control. That's why the Austria-based digital rights group Noyb (stylized as noyb, short for "none of your business") filed a complaint with the country's data protection authority on April 29, 2024, alleging that OpenAI is breaking GDPR rules.




