Sat, 26 Apr. 2025

Deepfake porn is destroying real lives in South Korea

Ruma was having lunch on a summer day in 2021 when her phone began blowing up with notifications.

When she opened the messages, they were devastating. Photos of her face had been taken from social media and edited onto naked bodies, shared with dozens of users in a chat room on the messaging app Telegram.

The comments in screenshots of the chat room were demeaning and vulgar – as were the texts from the anonymous messenger who had sent her the images. “Isn’t it funny? … Watching your own sex video,” they wrote. “Tell me you honestly enjoy this.”

The harassment escalated into threats to share the images more widely and taunts that police wouldn’t be able to find the perpetrators. The sender seemed to know her personal details, but she had no way to identify them.

“I was bombarded with all these images that I had never imagined in my life,” said Ruma.

While revenge porn – the nonconsensual sharing of sexual images – has been around for nearly as long as the internet, the proliferation of AI tools means that anyone can be targeted by explicit deepfakes, even if they’ve never taken or sent a nude photo.

South Korea has had a particularly fraught recent history of digital sex crimes, from hidden cameras in public facilities to Telegram chat rooms where women and girls were coerced and blackmailed into posting demeaning sexual content.

But deepfake technology is now posing a new threat, and the crisis is particularly acute in schools. Between January and early November last year, more than 900 students, teachers and staff in schools reported that they fell victim to deepfake sex crimes, according to data from the country’s education ministry. Those figures do not include universities, which have also seen a spate of deepfake porn attacks.

In response, the ministry established an emergency task force. And in September, legislators passed an amendment that made possessing and viewing deepfake porn punishable by up to three years in prison or a fine of up to 30 million won (over $20,000).

Creating and distributing non-consensual deepfake explicit images now carries a maximum prison sentence of seven years, up from five.

South Korea’s National Police Agency has urged its officers to “take the lead in completely eradicating deepfake sex crimes.”

But of 964 deepfake-related sex crime cases reported from January to October last year, police made 23 arrests, according to a Seoul National Police statement.

Legislator Kim Nam-hee told CNN that “investigations and punishments have been too passive so far.” So, some victims, like Ruma, are conducting their own investigations.
 

Victims taking action
Ruma was a 27-year-old university student when her nightmare first began. When she went to the police, they told her they would request user information from Telegram, but warned the platform was notorious for not sharing such data, she said.

Once an outgoing student who enjoyed school and an active social life, Ruma said the incident had completely changed her life.

“It broke my whole belief system about the world,” she said. “The fact that they could use such vulgar, rough images to humiliate and violate you to that extreme extent really damages you almost irrevocably.”

She decided to act after learning that investigations into reports by other students had ended after a few months, with police citing difficulty in identifying suspects.

Ruma and fellow students sought help from Won Eun-ji, an activist who gained national fame for exposing South Korea’s largest digital sex crime group on Telegram in 2020.

Won agreed to help, creating a fake Telegram account and posing as a man in his 30s to infiltrate the chat room where the deepfake images had circulated. She spent nearly two years carefully gathering information and engaging other users in conversation, before coordinating with police to help carry out a sting operation.

When police confronted the suspect, Won sent him a Telegram message. His phone pinged – he had been caught.

Two former students from the prestigious Seoul National University (SNU) were arrested last May. The main perpetrator was ultimately sentenced to nine years in prison for producing and distributing sexually exploitative materials, while an accomplice was sentenced to three and a half years in prison.

Police told CNN further investigations identified at least 61 victims, including 12 current and former SNU students. Seoul National University, in a briefing after the incident, said “the school will strengthen preventative education to raise awareness among the members of the university about digital sex crimes and do its best to protect victims and prevent recurrence.”

Excerpts of the ruling shared by Ruma’s lawyers state, “The fake explicit materials produced by the perpetrator are repugnant, and the conversations surrounding them are shocking … They targeted victims as if they were hunting prey, sexually insulted the victims and destroyed their dignity by using photos from graduations, weddings, and family gatherings.”

In response to the verdict, Ruma told CNN, “I didn’t expect the ruling to align exactly with the prosecution’s request. I’m happy, but this is only the first trial. I don’t feel entirely relieved yet.”

Ruma’s case is just one of thousands across South Korea – and some victims had less help from police.
