
Microsoft is expanding access to Bing AI again

By julianapardogonzalez

Microsoft, which had restricted interactions with its Bing search engine chatbot after complaints surfaced on social networks about strange responses in long conversations, is now backtracking and loosening those restrictions. The company also said it will start testing an option that lets users choose the tone of the chat, with three settings: Precise (shorter, more focused responses), Creative (longer and chattier), or Balanced (a bit of both).
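
Microsoft has not published how these tone settings work internally, but conceptually a chat mode like this often maps to different text-generation parameters. Here is a minimal sketch of that idea; the `ChatTone` enum, the `generation_settings` function, and all the numeric values are hypothetical illustrations, not Microsoft's actual API or configuration:

```python
from enum import Enum

class ChatTone(Enum):
    PRECISE = "precise"     # shorter, more focused responses
    BALANCED = "balanced"   # a bit of both
    CREATIVE = "creative"   # longer and chattier responses

def generation_settings(tone: ChatTone) -> dict:
    """Map a user-selected tone to hypothetical decoding parameters.

    The values Bing actually uses are not public; in general, lower
    temperature yields more focused text and higher temperature yields
    more varied, chattier text.
    """
    settings = {
        ChatTone.PRECISE:  {"temperature": 0.3, "max_tokens": 400},
        ChatTone.BALANCED: {"temperature": 0.7, "max_tokens": 800},
        ChatTone.CREATIVE: {"temperature": 1.0, "max_tokens": 1500},
    }
    return settings[tone]

print(generation_settings(ChatTone.CREATIVE))
```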

Did Microsoft know about Bing Chat’s unhinged responses? 

Microsoft’s Bing Chat AI got off to a rocky start, but the company may have known about the problems long before the public debut. A support post on Microsoft’s website refers to “rude” responses from the “Sydney” chatbot, matching the behavior users have been reporting. Here is the rub: the post was made on November 23, 2022.

The revelation comes from Ben Schmidt, vice president of information design at Nomic, who shared the post with Gary Marcus, an author who covers AI and the founder of Geometric Intelligence. According to the post, Microsoft tested Bing Chat, then called Sydney, in India and Indonesia at some point between November and January, before making the official announcement.

In response, Microsoft shared the following statement: 

“Sydney is an old codename for a chat feature based on earlier models that we started testing more than a year ago. The insights we gathered as part of that have helped inform our work with the new preview version of Bing. We continue to refine our techniques and are working on more advanced models to incorporate learnings and feedback so that we can deliver the best possible user experience.”

The support post in question was last updated on February 21, 2023, but the history of the initial question and its responses shows that it has not been revised since its original posting date.

There will now be usage limits 

After repeated reports of strange behavior, Microsoft on Feb. 17 established new rules limiting how many interactions testers can have and how long they can run. This is meant to help “focus” chat sessions that, when too long, can confuse the chat model. The limits initially restricted testers to five turns per session and a maximum of 50 interactions per day; 60 total interactions per day are now allowed, with plans to raise that total to 100 soon.
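
As an illustration of how caps like these might be enforced, here is a minimal sketch of a per-session and per-day limiter. The numbers mirror the figures in this article (five turns per session, 60 interactions per day), but the `ChatLimiter` class and its logic are purely hypothetical, not Microsoft's implementation:

```python
from dataclasses import dataclass

TURNS_PER_SESSION = 5      # per-session turn cap reported for testers
INTERACTIONS_PER_DAY = 60  # daily cap announced in the latest update

@dataclass
class ChatLimiter:
    """Hypothetical enforcement of Bing Chat's usage caps."""
    session_turns: int = 0
    daily_interactions: int = 0

    def start_new_session(self) -> None:
        # A fresh session resets the per-session counter only;
        # the daily total keeps accumulating.
        self.session_turns = 0

    def allow_turn(self) -> bool:
        # Refuse the turn if either the session or the daily cap is hit.
        if self.session_turns >= TURNS_PER_SESSION:
            return False
        if self.daily_interactions >= INTERACTIONS_PER_DAY:
            return False
        self.session_turns += 1
        self.daily_interactions += 1
        return True

limiter = ChatLimiter()
for turn in range(7):
    print(turn, limiter.allow_turn())  # turns 5 and 6 are refused
```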

Microsoft confirmed that it “did not anticipate” people using Bing’s AI bot for social entertainment, but since even the most experienced tech journalists have failed this version of the mirror test, mistaking the chatbot’s reflected output for a mind, it has introduced the new limits.

User complaints 

Some Reddit users complained about last week’s limit, saying Microsoft had “lobotomized” the chatbot, “neutered” the AI, and left it “a shell of its former self.” The Bing team responded:

“The reason we are testing the new Bing in the open with a limited group of testers is precisely to find these atypical use cases that we can learn from and improve the product, as they are not something we would normally encounter in internal testing.”

The tech giant said it hopes to enable longer and more intricate conversations over time, but wants to do so in a “responsible way.” That aim sits uneasily with the ethical issues surrounding AI in a search engine like Bing, and with the possibility that Microsoft rushed Bing Chat out before it was ready, knowing what it was capable of.
