Microsoft limits Bing chat to five replies to stop the AI from getting real weird

Microsoft says it’s implementing some conversation limits on its Bing AI just days after the chatbot went off the rails multiple times for users. Bing chats will now be capped at 50 questions per day and five per session after the search engine was seen insulting users, lying to them, and emotionally manipulating people.

“Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and that only around 1 percent of chat conversations have 50+ messages,” says the Bing team in a blog post. If users hit the five-per-session limit, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.

Microsoft warned earlier this week that these longer chat sessions, with 15 or more questions, could make Bing “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.” Wiping a conversation after just five questions means “the model won’t get confused,” says Microsoft.

Reports of Bing’s “unhinged” conversations emerged earlier this week, followed by The New York Times publishing an entire two-hour-plus back-and-forth with Bing, where the chatbot said it loved the author and that, somehow, they weren’t able to sleep that night. Many smart people have failed the AI mirror test this week, though.

Microsoft is still working to improve Bing’s tone, but it’s not immediately clear how long these limits will last. “As we continue to get feedback, we will explore expanding the caps on chat sessions,” says Microsoft, so this appears to be a limited cap for now.

Bing’s chat function continues to see daily improvements, with technical issues being addressed and larger weekly drops of fixes to improve search and answers. Microsoft said earlier this week that it didn’t “fully envision” people using its chat interface for “social entertainment” or as a tool for more “general discovery of the world.”

Copyright for syndicated content belongs to the linked source: The Verge – https://www.theverge.com/2023/2/17/23604906/microsoft-bing-ai-chat-limits-conversations