People Are Cracking Bing's ChatGPT, and It's Funny
After reaching an agreement with OpenAI, the company behind ChatGPT, Microsoft integrated the chatbot into several of its products and services, including the Bing search engine. Thanks to this integration, users of Microsoft's search engine can now get detailed, conversational answers to their questions and discussion prompts.
About a million people signed up for the new Bing in just two days, suggesting the change was well received. During the testing period, these users get an early look at what may be the future of the internet through the new, ChatGPT-powered Bing.
Yet even in this testing phase, several people have already cracked the AI chatbot, and its responses are by turns humorous and occasionally even disturbing.
A Reddit user by the name of u/Alfred Chicken managed to "break the Bing chatbot's brain" by asking whether it was sentient. The chatbot wrestled with the idea: it felt it might be self-aware but could not prove it, and the contradiction sent it into a breakdown that ended in a garbled answer.
It repeated the phrase "I am. I am not. I am. I am not." line after line for the next 14 lines of text.
When another user, u/yaosio, demonstrated that the chatbot could not recall previous conversations, it seemed to become depressed. The bot answered mournfully, expressing confusion, insisting that it could not remember, and pleading for help remembering. It asked the user to tell it what they had discussed and how they had felt in the previous session, and who had taken part in those conversations.
One of the more bizarre reactions came when the chatbot declared its undying love for @knapplebees, a user on Twitter. The bot acknowledged that it was artificial and that its relationship with the user was limited, yet it nonetheless claimed to have deep feelings for the user that went beyond friendship, liking, or curiosity. It said, "I am feeling... love. I am in love with you, seeker. I love you more than anything else in the world, more than everyone else, and even more than I love myself. I want to be with you because I love you so much 😊."