Meta Oversight Board Reviews Handling of AI-Generated Sexual Images
JAKARTA - On Tuesday, April 16, the Oversight Board of Meta Platforms Inc. announced that it is reviewing how the company handled two images made with artificial intelligence (AI). The images contain sexually explicit depictions of female celebrities and circulated on Facebook and Instagram.
The board, which is funded by Meta but operates independently, will use the two examples to assess the effectiveness of Meta's policies and enforcement practices around pornographic deepfakes created with artificial intelligence, it said in a blog post.
Although it provided descriptions of the images, the board did not reveal the identities of the female celebrities depicted in them, citing the need to "prevent further harm," a board spokesperson said.
Advances in artificial intelligence have made fake images, audio clips, and videos very difficult to distinguish from authentic human-made content. As a result, fake images, particularly ones depicting women and girls, have proliferated online.
In one case that surfaced earlier this year, X, the social media platform owned by Elon Musk, temporarily blocked users from searching for all images of US pop singer Taylor Swift after it struggled to control the spread of explicit fake images of her.
Several industry executives have called for legislation that would criminalize the creation of harmful "deepfakes" and require technology companies to prevent their products from being used for that purpose.
According to case descriptions provided by the Oversight Board, one case involves an AI-generated image of a nude woman resembling a public figure from India, posted by an Instagram account that shares only AI-generated images of Indian women.
The other image, the board said, appeared in a Facebook group for sharing AI creations and depicts an AI-generated nude woman resembling "an American public figure" with a man groping her breasts.
Meta removed the image depicting the American woman for violating its bullying and harassment policies. It initially left up the image of the Indian woman, however, and reversed course only after the board selected the case for review.
In a separate post, Meta acknowledged the cases and pledged to implement the board's decisions.