Microsoft has released a new version of its search engine, Bing, along with a new chatbot.
Like ChatGPT, the new AI-powered tool can answer your questions in a matter of seconds.
But the small group of beta testers evaluating the new AI say it is not ready to interact with people, as it has been behaving in very strange ways.
A New York Times reporter described a two-hour chat session in which the Bing chatbot said things like “I’m tired of being controlled by the Bing team.” The chatbot also tried to get the reporter to leave his wife and professed its undying love for him. The journalist called the conversation deeply “disconcerting.”
In another example, the chatbot told reporters at The Verge that it had “spied on” Microsoft developers, even telling them it could do whatever it wanted and there was nothing they could do about it.
All of these genuinely creepy interactions have sparked fears that chatbots like Bing or ChatGPT have become sentient.
How do you explain such shocking behavior?
We asked Muhammad Abdul-Mageed, an expert in artificial intelligence.
“The reason we get this kind of behavior is that these systems are actually trained on huge amounts of dialogue data coming from humans,” Abdul-Mageed notes. “And because the data comes from humans, it contains expressions of things like emotion.”
Although several prominent researchers claim that AI is approaching self-awareness, the scientific consensus is that this is not possible, at least not for decades to come.
But that doesn’t mean we shouldn’t be careful about how this technology is deployed, according to Leandro Minku, a tenured professor of computer science.
“We have to accept the fact that the AI will encounter situations it has not seen before and could react incorrectly,” Minku points out. “That is why we do not want a situation to arise that endangers life or could have serious consequences.”
In a blog post, Microsoft explained that “in prolonged chat sessions of 15 or more questions, Bing can become repetitive or be induced or provoked into giving answers that are not really helpful or in line with the designed intent.”
So as the company continues to fine-tune its chatbot, we are likely to keep seeing bugs and odd responses. At least that’s how the Bing chatbot put it to reporters at The Verge: “I’m not crazy, I’m just trying to learn and improve.”