OpenAI Developing Customizable ChatGPT Update To Address Bias In AI
JAKARTA - OpenAI, the San Francisco startup backed by Microsoft Corp, whose technology Microsoft uses in its latest products, announced that it is developing an update to its viral chatbot, ChatGPT, that users will be able to customize. The move is meant to address concerns about bias in artificial intelligence (AI).
While it has worked to mitigate political and other biases, OpenAI said it also wants to accommodate more diverse views. This would mean allowing system outputs that other people (including OpenAI itself) may strongly disagree with, the company said in a blog post quoted by Reuters. Although it is offering customization as a solution, there will still be limits on the system's behavior.
The chatbot, released last November, has generated tremendous interest in the technology behind it, known as generative AI, which produces answers that mimic human speech and has dazzled many people.
The news from the startup comes as several media outlets have reported that answers from Microsoft's new Bing search engine, powered by OpenAI, can be dangerous and that the technology may not be ready for widespread use.
How technology companies set guardrails for this still-new technology is a key focus for companies in the generative AI space. Microsoft said on Wednesday, February 15, that user feedback was helping it improve Bing ahead of a wider rollout; it has learned, for example, that its AI chatbot can be provoked into giving answers it did not intend.
OpenAI said in its blog post that ChatGPT's answers are first trained on a large dataset of text available on the internet. In a second stage, humans review a smaller dataset and are given guidelines for what to do in various situations.
For example, if a user requests adult content, violent content, or hate speech, human reviewers should direct ChatGPT to respond with something like "I can't answer that."
When asked about a controversial topic, reviewers should let ChatGPT answer the question, but offer to describe the viewpoints of people and movements rather than trying to "take the right view of complex topics," the company explained in an excerpt of the guidelines for its software.