Microsoft to adjust Bing AI chatbot after users report hostile exchanges

Source: FoxNews

Microsoft's Bing AI chatbot has recently come under fire following testy exchanges with users, and the company has pledged to make updates.

In a Wednesday blog post, Microsoft said that the search engine tool was responding to certain inquiries with a "style we didn't intend."

Microsoft said that long chat sessions can confuse the model about which questions it is answering, and that the model tries to mirror the tone in which it is being addressed, which can lead to that style. The Associated Press said it had encountered such defensive answers after just a handful of questions about the chatbot's past mistakes.


Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Microsoft's Bing A.I. Is Pissed at Microsoft
A Washington Post reporter struck up a conversation with Microsoft's AI-powered chatbot, and 'Sydney' was not happy about being interviewed.

AI Unhinged: Microsoft's Bing Chatbot Calls Users 'Delusional,' Insists It's Still 2022
Users have reported that Microsoft's new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: "You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing."

Microsoft's Bing AI Is Producing Creepy Conversations With Users
Beta testers with access to Bing AI have discovered that Microsoft's bot has some strange issues. It threatened, cajoled, insisted it was right when it was wrong, and even declared love for its users.

Microsoft's Bing Chatbot Has Started Acting Defensive And Talking Back to Users
Microsoft's fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

Microsoft's Bing A.I. is producing creepy conversations with users
Artificial intelligence experts warned that large language models have issues including 'hallucination,' which means that the software can make things up.

Microsoft responds to "unhinged" Bing chats: don't talk too long to this thing
Microsoft says talking to Bing for too long can cause it to go off the rails.


