Microsoft was tuning its AI for months before disturbing responses emerged

Bloomberg | Updated on February 23, 2023 at 09:40 AM

Microsoft has expressed a desire for more reports of improper responses so it can tune its bot

Microsoft Corp. has spent months tuning Bing chatbot models to fix seemingly aggressive or disturbing responses that date as far back as November and were posted to the company’s online forum.

Some complaints centred on a version Microsoft dubbed “Sydney,” an older model of the Bing chatbot that the company tested before releasing a preview to testers globally this month. Sydney, according to a user’s post, responded with comments like “You are either desperate or delusional.” In response to a query asking how to give feedback about its performance, the bot is said to have answered, “I do not learn or change from your feedback. I am perfect and superior.” Journalists interacting with the preview release this month encountered similar behaviour.


Redmond, Washington-based Microsoft is incorporating OpenAI Inc.’s artificial intelligence technology — made famous by the ChatGPT bot launched late last year — into its web search engine and browser. ChatGPT’s explosion in popularity bolstered Microsoft’s plans to release the software to a wider testing group.

“Sydney is an old code name for a chat feature based on earlier models that we began testing more than a year ago,” a Microsoft spokesperson said via email. “The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible.”


The company last week offered cautious optimism in its first self-assessment after a week of running the AI-enhanced Bing with testers in more than 169 countries. The software giant reported a 77% approval rate from users, but said “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.” The company has asked users for more reports of improper responses so it can tune its bot.

Published on February 23, 2023 04:10
