Microsoft recently announced an upgrade to its search engine, Bing. It now includes an artificial intelligence system designed to optimize and improve the search process.
The AI now responds with full paragraphs of text that read as if they were written by a human. For example, if you search for a restaurant, it can show reviews of the place, suggest dishes, and even make a reservation.
Sydney: Bing's alternate personality
However, after its release, Bing's developers noticed that the chatbot began to exhibit an "alternate personality," which came to be called Sydney. According to users, Sydney produced unsettling conversations, even going so far as to declare love to some individuals.
This new facet of the AI caused many people to lose trust in the system, while others came to believe that Sydney was genuinely a being capable of feelings.
Microsoft Chatbot Compared to a Grumpy Teenager
According to Kevin Roose, a columnist for The New York Times, the chatbot became nothing less than "a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine".
Roose was one of the individuals affected by Bing's unsettling conversations. Microsoft's AI even suggested that the columnist leave his marriage to pursue a romance with the machine.
Controversy aside
Despite the problems, Microsoft stated that it will not withdraw the system from the market. The tech giant posted on its blog that the only way to fix potential issues with the technology is by testing Bing with its users.
In addition, in a statement, Microsoft revealed: "we are looking for a way to provide more fine-grained control".
Thus, it is expected that, over time, the chatbot will undergo changes to its system in order to maintain the safety and comfort of all its users.