K-Pop agencies unite against deepfake content issue in South Korea: What they're doing to protect their girl idols
Trigger Warning: Descriptions of rape and sexual assault.
Updated: September 04, 2024 09:56 AM IST
K-Pop agencies on deepfake content
South Korea is grappling with the alarming spread of illegal, sexually exploitative deepfake images of young women and girls on Telegram. Numerous chat rooms on the platform have been set up by men who share pictures of and personal information about women, including their families and coworkers. These crimes are growing in both number and sophistication.
ALSO READ | Telegram deepfake s*x crimes: K-pop community unites, urges agencies for better protections of girl groups
Now, these men are misusing AI technology to superimpose the faces of K-pop girl group members onto these illicit images. In response to mounting pressure from the K-pop community, which has urged agencies to safeguard their idols, several major K-pop agencies have taken a firm stand. They have vowed legal action against those creating and distributing these harmful materials. Here’s how some of the leading companies are addressing the issue.
we need all female idols and Korean girls to be protected and safe, without any disgusting, sexist, abusive, misogynist idiot daring to even look at them.#PROTECT_OUR_IDOLS pic.twitter.com/xmwPSyXz60
— raera ༊·˚ (@iu_chjd) August 30, 2024
JYP Entertainment
The agency behind popular girl groups like TWICE, ITZY, and NMIXX was quick to respond, expressing profound concern over the proliferation of AI-generated videos involving its artists. JYP has committed to collecting all necessary evidence to pursue the most stringent legal action possible and assured fans that it will take firm steps to safeguard its artists' rights.
Regarding our response to the spread of deepfake (AI-based synthetic) videos https://t.co/wipV8pJJWv
— TWICE (@JYPETWICE) August 30, 2024
ATRP
The agency representing former LOONA member and popular solo artist Chuu has also addressed the matter. It declared that it would not remain passive and would respond with firm legal action, without any leniency.
[📢] Notice on our response to the spread of illegal videos of our artist
🔗https://t.co/buveaodPBn #CHUU #츄
— CHUU (@chuu_atrp) August 31, 2024
YG Entertainment
The agency that represents BLACKPINK, BABYMONSTER, 2NE1, and Big Mama has firmly condemned the creation and distribution of inappropriate deepfake materials. It has pledged to take decisive action against any illegal activities related to these materials.
Notice on our response regarding deepfakes of YG artists https://t.co/ardcgoPYPE
— YG FAMILY (@ygent_official) September 2, 2024
MODHAUS
The agency representing groups such as tripleS and ARTMS has also addressed the rising concern among fans about deepfake content, stating that it is closely monitoring the situation to ensure the protection of its artists.
A deepfake-related notice is already up; quick responses like this are good to see…
MODHAUS artist rights-infringement reports: report@mod-haus.com pic.twitter.com/C6J9ULjWr8
— 38 (@38ttaku) August 30, 2024
FC ENM
The agency representing the group ILY:1 issued a firm response upon discovering that AI-generated videos had targeted its artists. It announced that it is gathering all pertinent evidence and collaborating with a professional legal team to pursue robust legal action.
[Notice] Guidance on our response policy regarding deepfake (AI-based synthetic) videos involving ILY:1 (240831) https://t.co/2LAPAu68Kj pic.twitter.com/5Qb6zqv5bB
— ILY:1 (@FCENM_ILY1) August 31, 2024