
The Internet Watch Foundation (IWF) says its "worst nightmare" has come true as paedophiles use artificial intelligence (AI) to flood the internet with sexualised images depicting celebrities as children.

The IWF said AI models were being used to alter the faces of celebrities and to make photos of actors' children appear sexual.

Real children's faces and bodies are being used to train dedicated AI models capable of producing thousands of new abuse images.

According to the IWF, many of these images are so realistic that even a trained investigator would find it difficult to distinguish them from genuine photographs.

Researchers found nearly 3,000 synthetic images posted on a single darknet forum in one month, including a picture of a well-known female singer.

According to the IWF, there is a growing trend in which a single image of a known abuse victim is taken and used to produce further images of that victim in different abuse settings.

The IWF also shared disturbing comments from paedophiles on darknet forums praising the image quality and discussing how to produce more images.

The IWF warns that although no child is physically harmed in the making of these images, they cause significant psychological damage to victims and normalize predatory behavior.

Most of the AI images are classified as Category C, the least severe category, but one in five is Category A, the most severe form of abuse.

The IWF also noted that the quality of AI-generated images has reached new levels, and that their number continues to grow.
