South Korea’s Vulnerability to Deepfake Sex Crimes
In October 2024, a South Korean court handed down a ten-year prison sentence to the ringleader of a deepfake sex crime operation. The perpetrator, a 40-year-old graduate of Seoul National University, created and distributed 1,852 deepfake photos and videos of female students and others. The crime group set up more than 200 chat rooms on Telegram and continued the offenses for four years. They did not profit from the illicit pornography but committed the crimes to fulfill their sexual desires (Kim 2024a). Media reports characterized the crime as the result of ordinary students who, having failed to pass examinations, secure employment, or marry, developed a warped sexual perception that turned them into sex offenders. The court also noted that “the perpetrators expressed their inferiority complexes and hatred of socially successful women in a perverted manner and with anonymity.” Coming after the so-called “Nth Room” case of 2019, which involved the organized distribution of sexual abuse materials via Telegram chat rooms, this case demonstrates how digital sex crimes are evolving alongside new technologies such as deepfakes.
Deepfake sex crimes are also infiltrating schools. From January to September 2024, schools filed 434 reports of sexual abuse materials, and the Ministry of Education referred 350 of them for investigation (Ministry of Education 2024). Of the 474 individuals apprehended for deepfake sex crimes between January and October 2024, 381 were adolescents (Lee 2024a). This indicates not only that minors are particularly susceptible to deepfake sex crimes but also that the majority of alleged perpetrators are minors.
The rapid advance of artificial intelligence (AI) technology has been accompanied by the worldwide proliferation of deepfake sexual abuse materials. A report by a U.S. cybersecurity firm revealed a sharp increase in the number of deepfake pornographic videos, from 3,725 in 2022 to 21,019 in 2023. Notably, 99 percent of these videos targeted women. The report also found that approximately 53 percent of the individuals depicted in deepfake pornography are South Korean, with eight of the ten most frequently featured individuals being South Korean singers (Security Hero 2024). It should be noted that the nationality of the victim and that of the perpetrator may differ. Nevertheless, these findings illustrate the vulnerability of South Korea's highly digitalized society to deepfake sex crimes and the imperative for legal and social responses.
Digital sex crime is regarded as a grave infringement of women's fundamental human rights. Victims have reportedly experienced not only psychological harm but also deteriorating physical health with accompanying symptoms (Women’s Human Rights Institute of Korea 2023). Digital sex crimes also impose considerable social costs, such as discouraging women's social and professional activities. The targeting of female journalists who have reported on deepfake sex crimes in Telegram chat rooms threatens both their personal rights and freedom of the press.
Legislative Response to the Rising Deepfake Sex Crimes
Digital sex crime is typified by repeated and pervasive victimization through the reproduction and dissemination of material. Because it occurs in cyberspace and across borders, it is difficult to detect, and full recovery from the damage is challenging. Moreover, victims may take time to recognize the crime, if they recognize it at all. In response to the nature of these crimes, several advanced democracies have implemented the following measures.
In the United States, federal legislation to address deepfake sex crimes is underway. The “DEFIANCE Act,” which regulates non-consensual digital forgeries, was passed by the Senate in July 2024 and is currently under consideration by the House of Representatives. The bill would permit plaintiffs to seek redress through civil litigation for deepfake sexual abuse. It would also empower courts to impose punitive damages or order the removal and destruction of material. Furthermore, the legislation provides measures to safeguard the privacy of victims, such as authorizing plaintiffs to use pseudonyms. In addition, the “DEEPFAKES Accountability Act” was introduced in the House of Representatives in September 2023. This act would impose a fine or imprisonment for the creation and distribution of deepfake sexual abuse materials with the intent to humiliate or harass a person. Several states, including California, Texas, and New York, have already enacted laws that allow for civil lawsuits and criminal penalties for deepfake sexual offenses.
In May 2024, the Council of the European Union approved the Artificial Intelligence Act, which classifies AI systems into four risk levels. Deepfakes fall into the “limited risk” category, the second of the four levels. While the Act's principal objective is to regulate and prohibit the higher-risk levels, it also addresses this category. For example, Article 50 imposes a transparency obligation on AI systems, requiring that AI-generated or manipulated content, including deepfakes, be disclosed as such. Furthermore, the Digital Services Act delineates the obligations of major online platforms and search engines. These include ensuring that deepfakes are labeled upon publication and conducting at least an annual assessment of the risk of gender-based violence, with appropriate countermeasures implemented.
The United Kingdom and Australia have also enacted legislation criminalizing the dissemination, or threatened dissemination, of intimate images or videos of others, including deepfakes, without their consent. Additionally, online platforms and search engines are obliged to implement reporting and response systems to mitigate the risk of illegal information. The Canadian government is investing in technological research and development to prevent the creation and distribution of deepfakes (Lawson 2023). In addition to criminalizing the production and distribution of deepfake pornography, these countries are taking active measures against the online platforms that facilitate such activities. In South Korea, along with enhanced penalties, proactive measures are needed to eradicate the dissemination of deepfake sexual abuse material.
In South Korea, stakeholders have advocated for legislation that provides clear definitions and penalties for deepfake offenses. The Act on the Punishment of Sexual Crimes, amended in March 2020, stipulates that anyone who produces deepfakes with the intent to distribute them, or who distributes them, shall be punished. Concurrently, revised laws regulating internet communications require internet service providers to implement measures to prevent the dissemination of illicit video content. However, the penalties have been criticized as insufficiently stringent. In the four years since the law was enacted, 39 percent of those sentenced received probation, and in most cases where prison sentences were handed down, the defendants faced additional charges beyond the deepfake offense itself (Kim 2024b). In this context, the recent ten-year prison sentence handed down to the ringleader of the Seoul National University deepfake case was regarded as a significant penalty, given that the court, unusually, imposed the full sentence sought by the prosecution.
As deepfake pornography incidents sparked public outcry, the legislature responded with stronger punishments. In October 2024, the Act on the Punishment of Sexual Crimes was revised to criminalize the production of deepfake pornography even in the absence of an intent to distribute, and the maximum prison terms for production and distribution were increased. The new legislation also criminalizes the possession, purchase, storage, and viewing of deepfake pornography. In addition, the state became responsible for facilitating the removal of illicit material and assisting victims. Subsequently, legislation was passed permitting investigative agencies to conduct covert investigations into digital sex crimes.
Some politicians have argued that media reports exaggerate the threat of deepfake sex crimes and have expressed concern about excessive regulation. They point out that it is an overstatement to conclude that the 220,000 subscribers to a Telegram channel hosting a bot that synthesizes deepfake pornography represent an equal number of deepfake sex offenders in Korea. Given the difficulty of determining the nationality of Telegram users, this objection has some merit. Still, since a relatively small number of individuals can generate substantial quantities of deepfake pornography in a short period using AI, a low number of perpetrators does not necessarily indicate a correspondingly low severity of deepfake sex crimes.
Structural Background of Deepfake Sex Crimes in South Korea
To identify solutions to the recurring deepfake sex crimes, it is essential to understand the underlying structural context. First, some analyses link the crimes to the anxiety experienced by the younger generation in recent years. Young men in particular face intensified competition, an employment crisis, and a housing shortage, while the remnants of patriarchal culture still place greater responsibility on men, producing a perception gap (Lee 2024b). This discrepancy is further exacerbated by the perception among the younger generation that structural gender inequalities no longer exist. For them, South Korea’s low rankings in gender equality and in women's participation in politics and senior corporate positions (World Economic Forum 2024) are seen as a legacy of the past. These analyses conclude that the anxiety and dissatisfaction stemming from the discrepancy between economic reality and social perception give rise to anti-women sentiment, with sexual crimes as manifestations of this hatred. This interpretation aligns with the growing gender conflict in South Korean society, particularly among the younger generation. However, given that the majority of individuals arrested for deepfake sex crimes are teenagers, it is difficult to attribute the phenomenon solely to this perception gap among young men.
Experts in youth sex education and counseling have observed that deepfakes have already become a form of entertainment among teenagers, positing that these individuals tend to view deepfake sex crimes as a means of gaining peer approval (Mackenzie and Choi 2024). However, perceiving deepfakes as a form of play does not necessarily indicate a lack of awareness of their seriousness. In a survey on cyberbullying that asked why digital sex crimes remain prevalent, the reasons most frequently cited by adolescents were “weak punishment” (26.1%) and “not worrying about being caught because of the anonymity of the internet” (22.3%). The reasons most cited by adults were “to make money” (31.5%) and “weak punishment” (30.5%). The least cited response among both adolescents and adults was “because it is not serious” (Korea Communications Commission 2023). Although the survey was designed to gauge public opinion and does not directly represent the motivations of actual offenders, it suggests that many people are aware of the illegality of deepfake pornography but rely on the anonymity of the internet and loopholes in punishment when committing the crime.
Further Action Against Deepfake Sex Crimes
It remains to be seen whether the revised legislation will prove an effective deterrent against deepfake sex crimes. There are calls for further legislative action, including granting police authorities the power to promptly require internet service providers to remove and block illicit materials, since such materials can continue to proliferate while decisions are delayed. Furthermore, in line with the practices of other advanced democracies, it is imperative to strengthen the accountability of internet service providers so that they implement more proactive preventive measures, such as blocking the posting of sexual abuse materials and conducting regular monitoring. Regarding the scope of punishment, it has been proposed that those who instigate the creation of sexual abuse materials should also be held accountable. More broadly, comprehensive legislation is needed to address the new forms of violence that emerge as technology advances. Lawmakers must be adequately informed of the gravity of the situation, and any new legislation must be legally stable and unambiguous.
In a more comprehensive sense, the regulatory framework and the response to disinformation as a whole need to be reinforced. The proliferation of disinformation has significantly increased threats to democratic processes, observed in the form of fake videos of politicians and fake news from unknown sources that can influence election outcomes and impede citizens' ability to make informed decisions. While law enforcement agencies should respond expeditiously to identify digital sex crimes, comprehensive measures against rapidly evolving technologies are also required to cover the full spectrum of disinformation and to facilitate a unified response across government ministries and between the government and the private sector.
In light of the pervasiveness of deepfake sex crimes among young people, public education should equip students with the ability to recognize the seriousness and harmfulness of these crimes and to respond promptly when they occur. The most effective lesson would be demonstrating that those who perpetrate such crimes will face rigorous investigation and punishment. In addition, further educational initiatives are needed to inform the public about disinformation and to cultivate a digital literacy that enables individuals to assess the veracity of the information they encounter and to consume it in an informed manner.
In conjunction with the proliferation of deepfake sexual abuse materials, there is a growing call from civil society for a swift and decisive response. Korean democracy is now confronted with the challenge of curbing deepfake sex crimes through legislative and institutional measures for the sake of women’s human rights. ■
References
European Commission. 2024. “The Digital Services Act package.” https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package (Accessed December 5, 2024)
Future of Life Institute. 2024. “The EU Artificial Intelligence Act.” https://artificialintelligenceact.eu/ (Accessed December 5, 2024)
Kim, Dongwon. 2024a. “‘My Nude Photos in a Group Chat?’… Growing Deepfake Crimes Leave the Public in Fear.” The AI. August 27. https://www.newstheai.com/news/articleView.html?idxno=6220 (Accessed December 5, 2024)
Kim, Jeonghwa. 2024b. “Analysis of all sentences since the Deepfake Sexual Abuse Act of 2020 – Probation at Nearly 40 Percent.” (in Korean) The Kyunghyang Shinmun. September 4. https://www.khan.co.kr/article/202409041031011 (Accessed December 5, 2024)
Korea Communications Commission. 2023. “Four out of 10 teenagers have experienced cyberbullying, one more person than last year.” April 7. https://www.kcc.go.kr/user.do?mode=view&page=E04010000&dc=E04010000&boardId=1058&cp=10&boardSeq=54874 (Accessed December 5, 2024)
Lawson, Amanda. 2023. “A Look at Global Deepfake Regulation Approaches.” Responsible Artificial Intelligence Institute. April 24. https://www.responsible.ai/a-look-at-global-deepfake-regulation-approaches/ (Accessed December 5, 2024)
Lee, Soo-Jung. 2024a. “Adolescents 80% of 474 suspects arrested for deepfake porn this year.” Korea JoongAng Daily. October 16. https://koreajoongangdaily.joins.com/news/2024-10-16/national/socialAffairs/Adolescents-80-of-474-suspects-arrested-for-deepfake-porn-this-year-/2156145 (Accessed December 5, 2024)
Lee, Soohyun Christine. 2024b. “Anti-Gender Politics, Economic Insecurity, and Right-Wing Populism: The Rise of Modern Sexism among Young Men in South Korea.” Social Politics: International Studies in Gender, State & Society. October 18. https://academic.oup.com/sp/advance-article/doi/10.1093/sp/jxae016/7826751 (Accessed December 5, 2024)
Mackenzie, Jean, and Leehyun Choi. 2024. “Inside the deepfake porn crisis engulfing Korean schools.” BBC News. September 3. https://www.bbc.com/news/articles/cpdlpj9zn9go (Accessed December 5, 2024)
Ministry of Education. 2024. “Second Survey Results of School Deepfake Sexual Crime Damage Situation Released.” (in Korean) September 9. https://www.moe.go.kr/boardCnts/viewRenew.do?boardID=294&boardSeq=100958&lev=0 (Accessed December 5, 2024)
Security Hero. 2024. “2023 State of Deepfakes.” https://www.securityhero.io/state-of-deepfakes/ (Accessed December 5, 2024)
United States Congress. 2024a. “S.3696 - DEFIANCE Act of 2024.” https://www.congress.gov/bill/118th-congress/senate-bill/3696 (Accessed December 5, 2024)
______. 2024b. “H.R.5586 - DEEPFAKES Accountability Act.” https://www.congress.gov/bill/118th-congress/house-bill/5586 (Accessed December 5, 2024)
Women’s Human Rights Institute of Korea. 2023. “Research on the Experience of Digital Sex Crime Dissemination and Anxiety.” (in Korean) https://www.stop.or.kr/home/kor/M788976317/promotion/published/research/index.do (Accessed December 5, 2024)
World Economic Forum. 2024. “Global Gender Gap Report 2024.” https://www.weforum.org/publications/global-gender-gap-report-2024/ (Accessed December 5, 2024)
■ Hansu Park is a Research Associate at the East Asia Institute.
■ Edited by Hansu Park, Research Associate