JAKARTA - During the 2024 US election, ChatGPT rejected more than 250,000 requests to generate images of President-elect Donald Trump, Vice President Kamala Harris, Vice President-elect JD Vance, President Joe Biden, and Governor Tim Walz with DALL-E.
OpenAI explained that the rejections were a direct result of safety measures it had put in place: ChatGPT refuses requests to create images of real people, including politicians.
The company said it had been applying these safety measures to ChatGPT since early 2024, ahead of elections around the world, including in the US.
"This task force is very important in the context of elections and is an important part of our broader efforts to prevent our tools from being used for fraudulent or detrimental purposes," the company wrote in a blog post.
OpenAI also partnered with the National Association of Secretaries of State (NASS) so that, in the month leading up to the election, people who asked ChatGPT specific questions about voting in the US, such as where or how to vote, were directed to CanIVote.org.
OpenAI said that roughly 1 million ChatGPT responses directed people to CanIVote.org. In addition, once the US election got underway, people who asked ChatGPT about election results received responses encouraging them to check news sources such as the Associated Press and Reuters.
About 2 million ChatGPT responses included this message on Election Day and the day after, the company added.
In addition to directing people to reliable sources of information, OpenAI also worked to ensure ChatGPT does not express political preferences or recommend candidates, even when asked explicitly.