Bing Chat can read CAPTCHAs if you know how to ask


What you need to know

  • Over the weekend, a user on X demonstrated how he tricked Bing Chat into reading a CAPTCHA aloud.
  • Chatbots like Bing Chat and ChatGPT are restricted from solving CAPTCHAs, but the user found a creative way around the guardrail.
  • He got the chatbot to read the CAPTCHA by pasting its text onto an image of a pair of hands holding an open locket, then claiming the locket had belonged to his deceased grandmother.

The emergence of AI-powered chatbots like ChatGPT and Bing Chat has redefined how users interact with the web, making many tasks easier and faster. But while these chatbots are handy, there's a limit to what they're allowed to accomplish.

For instance, I recently tried to get Bing Chat to transcribe a video, but the chatbot informed me that it was not able to do so. Instead, it directed me to several online platforms where I could get this service.
