
Microsoft Limits Bing AI Chats to 5 Replies to Keep Conversations Normal

Microsoft is limiting how extensively people can converse with its Bing AI chatbot, following media coverage of the bot going off the rails during long exchanges. Bing Chat will now reply to up to five questions or statements in a row for each conversation, after which users will be prompted to start a new topic, the company announced in a blog post Friday. Users will also be limited to 50 total replies per day. 

The restrictions are meant to keep conversations from getting weird. Microsoft said long discussions “can confuse the underlying chat model.” 

On Wednesday the company said it was working to fix problems with Bing, which had launched just over a week earlier, including factual errors and odd exchanges. Bizarre responses reported online have included Bing telling a New York Times columnist to abandon his marriage to be with the chatbot, and the AI demanding an apology from a Reddit user who challenged its insistence that the year was still 2022.

The chatbot’s responses have also included factual errors, and Microsoft said on Wednesday that it was tweaking the AI model to quadruple the amount of data from which it can source answers. The company said it would also give users more control over whether they wanted precise answers, which are sourced from Microsoft’s proprietary Bing AI technology, or more “creative” responses that use OpenAI’s ChatGPT tech.

Bing’s AI chat functionality is still in beta-testing mode, with potential users joining a wait list for access. With the tool, Microsoft hopes to get a head start on what some say will be the next revolution in internet search, among other things. The ChatGPT technology made a big splash when it arrived late last year, but OpenAI itself has warned of potential pitfalls, and Microsoft has acknowledged limitations with AI. And despite AI’s impressive qualities, concerns have been raised about artificial intelligence being used for nefarious purposes like spreading misinformation and churning out phishing emails.

With Bing’s AI capabilities, Microsoft would also like to get a jump on search powerhouse Google, which announced its own AI chat model, Bard, last week. Bard has had its own problems with factual errors, fumbling a response during a demo.

In its Friday blog post, Microsoft suggested the new AI chat restrictions were based on information gleaned from the beta test.

“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages,” it said. “As we continue to get your feedback, we will explore expanding the caps on chat sessions to further enhance search and discovery experiences.” 

Editors’ note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.
