JAKARTA - Forty-two German associations and trade unions representing more than 140,000 authors and performers on Wednesday, 19 April, urged the European Union to strengthen artificial intelligence (AI) rules protecting their copyright, citing threats they see from ChatGPT.

The creative-sector trade unions Verdi and DGB, along with associations representing photographers, designers, journalists, and illustrators, expressed their concerns in a letter to the European Commission, the European Council, and EU lawmakers.

The letter highlights growing concerns about generative artificial intelligence (AI) such as ChatGPT, which can mimic human output and create text and images from prompts.

"The use of protected training materials without permits, non-transparent processing, and source substitution by a general AI output is approximable, raising fundamental questions about accountability, responsibility and reimbursement that must be addressed before inverse damage occurs," the letter said. seen by Reuters.

"AI generatively must be the center of meaningful AI market regulations," the letter said.

The European Commission, which proposed AI rules last year, will discuss the final details with EU lawmakers and member states before the rules become law.

The associations and trade unions called for the rules to be strengthened to regulate generative AI across the entire product cycle, particularly for providers of foundation models.

They also called for technology providers to be held accountable for all content generated and distributed by AI, particularly for violations of personal rights and copyright, misinformation, or discrimination.

The letter said foundation model providers such as Microsoft, Alphabet's Google, Amazon, and Meta Platforms should not be allowed to operate central platform services for distributing digital content.
