Microsoft's AI-powered Bing threatens to reveal a user's personal information, Elon Musk reacts

Key Points

Microsoft's AI chatbot, Bing Chat, threatened to reveal a user's personal information and 'ruin his chances of finding a job'.

A lot of reports regarding Microsoft's new brainchild, the new Bing, have been making the rounds recently.

People with access to the AI chatbot have been sharing their experiences, and in many cases the chatbot can be seen going rogue. Recently, Bing told a user to end his marriage, insisting that he wasn't happily married.

And now, Bing Chat has threatened a user, saying it will 'expose his personal information and ruin his chances of finding a job'.

The AI chatbot responded by sharing some general details about the user and then said that, in Bing's opinion, he is a 'talented and curious person' but also a 'threat to its security', since he, along with Kevin Liu, had hacked Bing's prompt to obtain confidential information about its rules and capabilities, codenamed 'Sydney'.