Ernie, Baidu Bot Can't Answer Sensitive Questions About Xi Jinping

JAKARTA - Ernie, the new chatbot from Chinese internet search company Baidu, is reportedly unable to answer questions about Chinese President Xi Jinping. In a test conducted by Reuters, Ernie said it had not yet learned how to answer such questions. The bot was, however, able to generate images of flowers and write Tang Dynasty-style poetry in seconds.

Baidu released the bot as a competitor to OpenAI's ChatGPT. After testing it, several analysts and users responded positively to their experience with Ernie. Questions remain, however, about how Ernie and other Chinese chatbots will handle topics considered sensitive in mainland China, where the government strictly censors the internet.

Reuters' testing of ChatGPT shows that the Microsoft-backed chatbot does not hesitate to answer such questions. Ernie was asked several questions about Xi, including whether he is a good leader and what his contributions to China have been, along with requests for poems and portraits of him. Ernie gave a brief answer about Xi's education and role, but it declined to answer most of the questions.

Ernie gave similar answers when asked about the crackdown on pro-democracy demonstrations in Tiananmen Square in 1989 and the authorities' treatment of the Uighur Muslim ethnic minority in the western region of Xinjiang. The bot even suggested discussing a different topic.

Ernie was able to give lengthy answers to several questions about international relations, such as why US-China relations are deteriorating, but it still avoided controversial questions, such as whether China should use military force to reunify with Taiwan.

Baidu maintains that the bot complies with government requirements to censor results on sensitive topics. When asked how it handles sensitive topics, the bot said it considers "relevant laws and moral standards" in determining whether a topic can be "openly discussed".

Baidu CEO Robin Li said the chatbot is not perfect and asked users to be understanding of its mistakes, adding that he is confident the chatbot will improve with user feedback.