JAKARTA - Gemini, Google's artificial intelligence (AI) model, has been in the spotlight over the past few days for producing inaccurate images when asked to depict Nazi-era German soldiers.

In the images that circulated, Gemini depicted people of color and women in Nazi military uniforms. Gemini also produced other images that were inaccurate and deviated from history.

In response to these errors in generating images of people, Google decided to disable the feature. Gemini's ability to generate images of humans will remain turned off until Google releases a fixed version.

"We are working to address the latest problems with Gemini's image generation feature. While we do this, we will temporarily pause generating images of people and will soon re-release an improved version," Google said on platform X on Thursday, February 22.

The tool for generating images of people in Gemini has indeed been deactivated. When users ask Gemini to produce such an image, the AI model rejects the request and states that the feature has been withdrawn for repair.

"We are working to improve Gemini's ability to generate images of people. We expect this feature to return soon and will notify you in release updates when it does," Gemini responds, as quoted by The Verge.

Before disabling the feature, Google apologized on its official social media account. The company admitted that the diversity adjustments in Gemini's algorithm caused the tool to produce historically inaccurate images.
