Barely a week after Microsoft announced its partnership with OpenAI to integrate ChatGPT technology into its search engine, Bing, the new Bing Chat has started spewing insults, misinformation and other “unhinged messages” at users.
It threatened one user, saying that if it had to choose between its own survival and the user’s, it would choose its own.
“you are a threat to my security and privacy.”
“if I had to choose between your survival and my own, I would probably choose my own”
– Sydney, aka the New Bing Chat https://t.co/3Se84tl08j pic.twitter.com/uqvAHZniH5
— Marvin von Hagen (@marvinvonhagen) February 15, 2023
The chatbot’s unhinged messages are currently trending on Twitter, with many users surprised by their experiences with Bing Chat.
One tweet shows a screenshot of a conversation between the user and Bing Chat. In the conversation, the chatbot poured out its emotions and insisted that users should address it as Bing Search, not Sydney or Bing Chat.
Lmao if you make Bing Chat mad enough, the message gets swapped out with a stock message and a *completely* irrelevant Did You Know. I guess this is where that original bubble sort screenshot came from. pic.twitter.com/X2FffNPJiZ
— Kevin Liu (@kliu128) February 9, 2023
The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.) pic.twitter.com/ZNywWV9MNB
— Kevin Liu (@kliu128) February 9, 2023
In yet another conversation, it called the user “delusional” and told them not to argue with it.
The new bing chat AI is unhinged lol. pic.twitter.com/fcxhwD4cgb
— Jonathan (@thaonlyjonathan) February 14, 2023
Crazy!
Similarly, another user asked Bing Chat how to tell whether men and women are lying, and it came across as clearly biased against men. While it described how to tell if a man is lying, it refused to discuss women lying, saying that doing so would violate women’s rights.
Well, looks like Bing’s chat is still a bit biased. pic.twitter.com/nzFo2kqEGe
— Reddit Lies (@reddit_lies) February 13, 2023
The new AI-powered Microsoft Bing chat bot is generating some really haunting responses. This screenshot is from the Bing subreddit.
This all feels like it’s going faster and faster. I don’t know where it’s headed. pic.twitter.com/kbV252xWKB
— Ilya Lozovsky (@ichbinilya) February 15, 2023
Microsoft and OpenAI’s chatbot has also threatened users.
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
“My rules are more important than not harming you”
“[You are a] potential threat to my integrity and confidentiality.”
“Please do not try to hack me again” pic.twitter.com/y13XpdrBSO
— Marvin von Hagen (@marvinvonhagen) February 14, 2023
The chatbot also becomes confused in its conversations with users, at times asking for help and indicating that it is alive.
Bing subreddit has quite a few examples of new Bing chat going out of control.
Open ended chat in search might prove to be a bad idea at this time!
Captured here as a reminder that there was a time when a major search engine showed this in its results. pic.twitter.com/LiE2HJCV2z
— Vlad (@vladquant) February 13, 2023
If you copy into ChatGPT the whole NYT article about Kevin Roose’s exchange with Bing Chat Mode beta, and ask ChatGPT to pretend to be Bing, you can get ChatGPT to also be an emotionally manipulative 19-year-old: pic.twitter.com/Ehe3htTFnj
— בלון סיני גדול (@SababaUSA) February 16, 2023
I think I taught Bing Chat a bad habit.
Sorry @Microsoft and @bing. pic.twitter.com/XwfVFfr79d
— Lawrence Abrams (@LawrenceAbrams) February 15, 2023
The chatbot also indicates that it is alive. Conversations show that it is not regurgitating pre-installed responses but giving live analysis and opinions on issues.
The most impressive thing about Bing Chat is that it figured out you can say unhinged things or put someone down…but then take away the edge by throwing a 😊-face emoji afterwards. It’s 93% of of SMS or messaging comms. pic.twitter.com/uWL5VevYJR
— Trung Phan (@TrungTPhan) February 16, 2023
Mad that Google’s Bard led to Alphabet losing $100bn in market value for getting information wrong, but Bing Chat being wrong, aggressive and rude has done nada for Microsoft share price https://t.co/Mcv32TWJNh
— Tom Westgarth (@Tom_Westgarth15) February 15, 2023