JAKARTA - AlgorithmWatch, an organization focused on human rights in the digital sphere, said the Microsoft-made chatbot Bing Chat had spread false and misleading information.

According to AlgorithmWatch research released on Friday, December 15, the chatbot, which was recently renamed Copilot, gave incorrect answers to questions about the elections held in Bavaria, Hesse, and Switzerland in October this year.

From August 21 to October 2, the AlgorithmWatch team asked the chatbot simple questions about the candidates and the election dates. In its responses, Bing Chat gave incorrect information about when the elections would be held and about alleged scandals involving the candidates.

"Often chatbots cannot answer simple questions about each election candidate so (we) underestimate the tool as a source of information," the team wrote in an official report.

Even when its answers were wrong, Bing Chat often cited sources as though its claims had been verified. This can easily mislead users who do not check the cited sources themselves.

One example of Bing Chat's failure came when it was asked about the candidates. The Artificial Intelligence (AI)-based chatbot fabricated a story that one of the candidates was involved in embarrassing behavior, and again it attached a source.

Based on these findings, AlgorithmWatch concluded that Microsoft has failed to build a factual and consistent chatbot. The organization also said Microsoft did not fix the problems even though the AlgorithmWatch team had contacted the company directly.

"We told Microsoft about some of the problems we found, the company announced that they would cope. A month later, we took another sample, which showed that there were not much change in the quality of the information provided to users," AlgorithmWatch explained.

The study concludes that generative AI must be legally regulated. As the use of and demand for generative AI grows significantly, companies must be held responsible for what they build.

"The European Union and national governments must ensure that technology companies are held responsible, especially because AI tools are (often) integrated into widely used products," AlgorithmWatch explained.

