TOKYO – Fake sexual images of children are being generated with photos from school events and graduation albums and spread on social media.
In some cases, minors themselves are the perpetrators, with elementary, junior high and high school students using “deepfake” technology through generative AI websites and other tools.
Police throughout the country are receiving an increasing number of reports about such cases.
The National Police Agency plans to investigate major image-generating AI websites to uncover any wrongdoing and devise prevention measures.
An elementary school boy in eastern Japan allegedly generated a fake nude image of a girl in junior high school who attended the same sports class and posted the image on a social media group chat. “I used generative AI to edit a picture of the face of a girl I like,” police quoted him as saying.
Many websites and apps allow users to generate images by giving instructions to AI, with some letting users generate fake sexual images based on children’s photos.
The boy allegedly used photos that had been shared among members of the class.
According to the NPA, police received more than 100 consultations and reports regarding fake sexual images of minors in 2024. Most involved claims such as “Nude images made with my face are on social media”.
The majority of the victims are junior high and high school students, but elementary school children have also been involved.
In most cases, people they know are suspected of generating the images. Seventeen cases have been identified as involving images made through AI with photos from school events and graduation albums.
Harassment targeting same-sex peers has been a motive in some cases.
There have also been cases serious enough to spur criminal investigations. In the Tokai region in central Japan, a male junior high school student was referred to prosecutors on suspicion of defamation for generating a fake nude image using a female classmate’s photo and sharing it with friends.
The possession and production of sexual images of children is regulated under the child prostitution and child pornography prevention law. However, the law assumes that cases involve a real image of a child, so prosecuting deepfakes is difficult.
Police across the nation are therefore applying other laws and regulations to these cases.
A survey by a US security company found that 95,820 deepfake videos were identified online in 2023, with 98 per cent being pornographic.
“The cases police are aware of are just the tip of the iceberg,” a senior police official said.
Given this situation, the NPA is focusing on about 10 image-generating AI websites and apps that are especially popular among junior high and high school students, out of the many to be found online.
Police will investigate their operations, and the results will be used not only for criminal investigations but also for awareness campaigns targeting youth through delinquency prevention classes and similar programs.
“If sophisticated fake images spread online, it violates the privacy of the children whose photos are misused, even if they’re fake,” said Mr Masaki Ueda, an associate professor of criminal law at Kanagawa University who is familiar with obscenity regulations.
“The psychological burden is significant, so deepfake images should be treated similarly to child pornography and legal regulations should be considered,” Prof Ueda said.
Moves to regulate sexually explicit deepfake images are growing overseas.
In South Korea, a Bill was passed in September 2024 to punish not only the creation of deepfake images, but also the act of viewing or possessing them.
In the United States, a federal law was enacted in May prohibiting the posting of sexually explicit images or videos online without the consent of the subject in question.
The law made it mandatory for social media platforms and other relevant operators to remove such images and videos within 48 hours of receiving a request from victims.
Britain and Australia also regulate the sharing of sexually explicit deepfake images.
In Japan, the Tottori prefectural government in August revised an ordinance to prohibit – with penalties – the creation of sexually explicit fake images using photographs of children.
There have been calls for action, including legislation, within the central government’s working group on internet use by young people. However, the discussions have made little headway.
ChildFund Japan, a Tokyo-based international nongovernmental organization, surveyed 1,200 men and women nationwide from January to February.
About 70 per cent of respondents believed that sexual images of children, including fake images, should be regulated by law regardless of whether the subjects in the images actually exist. THE JAPAN NEWS/ ASIA NEWS NETWORK