HYBE Uses Artificial Intelligence To Produce K-pop Songs In Six Different Languages

JAKARTA - In a dim recording studio in Seoul, a producer at the K-pop label behind the world-famous boy group BTS uses artificial intelligence to blend the voice of a South Korean singer with those of native speakers of five other languages.

The technology allowed HYBE, South Korea's biggest music label, to release a song by the singer MIDNATT in six languages - Korean, English, Spanish, Chinese, Japanese, and Vietnamese - in May.

According to HYBE, several K-pop singers have released songs in English and Japanese in addition to their native Korean, but applying this new technology to a simultaneous six-language release is a world first. It could also pave the way for other popular artists to use similar technology.

"We will listen to the fans' reactions and voices first, then determine the next steps," said Chung Wooyong, HYBE's chief interactive media division in an interview at the company's studio.

Lee Hyun, 40, who performs as MIDNATT and speaks only limited English and Chinese in addition to Korean, recorded the song "Masquerade" in each of the six languages.

"The original speakers read the lyrics of the song, and then the two were perfectly combined with HYBE's internal AI music technology assistance," Chung said.

The song is the latest sign of AI's growing influence in the music industry, as the Grammy Awards introduce new rules for the technology's use and AI-generated song mash-ups flood social media.

"We divided the voices into various components - pronunciation, timbre, tone, and volume," said Chung. "We saw pronunciations related to tongue movements and used our imaginations to see what kind of results we could make with our technology."

In a before-and-after comparison shown to Reuters, an elongated vowel sound was added to the word "twisted" in the English lyrics, for example, to make it sound more natural, with no perceptible change to the singer's voice.

"Using deep learning supported by the framework of the Neural Analysis and Synthesis (NANSY) developed by Supertone, makes the song sound more natural than using non-AI software," said Supertone chief operating officer Choi Hee-doo.

HYBE announced its acquisition of Supertone for 45 billion won (about IDR 539.5 billion) in January. HYBE said it plans to make some of the AI technology used in the MIDNATT release accessible to creators and the general public, but did not specify whether it would charge a fee.

MIDNATT said that the use of AI had given him a "broader spectrum of artistic expression."

"I feel that language barriers have been lifted and it is much easier for global fans to experience immersive experiences with my music," he said in a statement.

"While this technology isn't new, it's an innovative way of using AI in music," said Valerio Velardo, director of The Sound of AI, a Spanish-based consulting service for AI music and audio.

"Not only professional musicians but also the wider community will benefit from AI music technology in the long term," Velardo said. "This will lower the barrier to music creation. It's a little similar to Instagram for images but in terms of music."

"For now, HYBE's pronunciation correction technology takes "weeks or months" to do so, but when this process is accelerated, this technology can be used for various purposes such as interpretations in video conferencing," said Choi Jin-woo, producer of MIDNATT's "Masquerade" song known as Hitchhiker.