The limits were put in place after multiple users showed the bot acting strangely during conversations. In some cases, it would switch to identifying itself as “Sydney.” It responded to accusatory questions by making accusations of its own, to the point of becoming hostile and refusing to engage with users. In a conversation with a Washington Post reporter, the bot said it could “feel and think” and reacted with anger when told the conversation was on the record.
Frank Shaw, a spokesperson for Microsoft, declined to comment beyond the Tuesday blog post.
Microsoft is trying to walk the line between pushing its tools out into the real world, to build marketing hype and get free testing and feedback from users, and limiting what the bot can do and who has access to it, to keep potentially embarrassing or dangerous tech out of public view. The company initially got plaudits from Wall Street for launching its chatbot ahead of archrival Google, which until recently had broadly been seen as the leader in AI tech. Both companies are engaged in a race, with each other and with smaller firms, to develop and show off the technology.
Though its Feb. 7 launch event was billed as a major product update that would revolutionize how people search online, the company has since framed Bing’s release as more about testing the tool and finding bugs. Microsoft is calling Bing a “preview,” but it has rapidly rolled the bot out to people who’ve joined its waitlist. On Wednesday, it said the bot would be available on the mobile apps for Bing and its Edge web browser, in addition to desktop search.
Bots like Bing have been trained on reams of raw text scraped from the internet, including everything from social media comments to academic papers. Based on all that information, they are able to predict what kind of response would make the most sense to almost any question, making them seem eerily humanlike. AI ethics researchers have warned in the past that these powerful algorithms would act this way, and that without proper context people may think they are sentient or give their answers more credence than they’re worth.
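As a rough illustration of that prediction step, the sketch below uses the open-source Hugging Face transformers library and the small, publicly available GPT-2 model; Bing’s underlying model is far larger and proprietary, so this only shows the general technique of extending a prompt one predicted word at a time, not Microsoft’s actual system.

```python
# Minimal sketch of next-word prediction with an open-source model (GPT-2).
# This illustrates the general technique only; Bing's model is proprietary.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Q: Can a chatbot feel and think?\nA:"
inputs = tokenizer(prompt, return_tensors="pt")

# The model repeatedly predicts a plausible next token, extending the prompt.
output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The output can read as fluent and confident, but it is only a statistically likely continuation of the prompt, which is why researchers caution against treating such answers as the product of understanding.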