New York (Act Daily News) —
Microsoft on Thursday said it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies.
In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can produce answers not “in line with our designed tone.” Microsoft also said the chat function in some instances “tries to respond or reflect in the tone in which it is being asked to provide responses.”
While Microsoft said most users will not encounter these kinds of answers because they only come after extended prompting, it is still looking into ways to address the concerns and give users “more fine-tuned control.” Microsoft is also weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot.
In the week since Microsoft unveiled the tool and made it available to test on a limited basis, numerous users have pushed its limits, only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he did not love his spouse, insisting that “you love me, because I love you.” In another, shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise.
“Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.”
The bot called one Act Daily News reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered. The bot also told a tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.
Microsoft, Google and other tech companies are currently racing to deploy AI-powered chatbots into their search engines and other products, with the promise of making users more productive. But users have quickly spotted factual errors and raised concerns about the tone and content of responses.
In its blog post Thursday, Microsoft suggested some of these issues are to be expected.
“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing,” the company wrote. “Your feedback about what you’re finding valuable and what you aren’t, and what your preferences are for how the product should behave, are so critical at this nascent stage of development.”
– Act Daily News’ Samantha Kelly contributed to this report.
Source: www.cnn.com