AI tools undress children, reports increasing rapidly


The so-called nudify apps have received a lot of attention in countries such as Spain and the United States, where several children have been exposed to the dissemination of AI-generated images.

"Simply put, they are apps or other tools where you can upload any image and have it turned into a nude or pornographic image. They rarely have safeguards that prevent them from being used on children," says Jonas Karlsson, an investigator at the children's rights organization Ecpat.

More calls

In a new report, Ecpat has sought to find out what children in Sweden know about the apps and whether they themselves have been victimized. The results show that so far only a very small share of children say they have been victimized, but at the same time Ecpat is receiving more and more calls about the issue to its helpline.

"We now have a couple of contacts a week about this. The conversations are increasing rapidly," says Jonas Karlsson.

"Both we and the children believe that it is a problem that will only grow."

Meta has banned the apps from advertising on its platforms, but on Telegram, for example, they are advertised openly.

"They say it's for adults, but they use words like 'see your crush naked,' so it's aimed at a young audience."

A large majority of children who responded to Ecpat's survey think the apps should be banned. Such work is also underway within the EU, according to Karlsson.

Help from adults

What is striking in the report is that many more children than before say they are willing to turn to adults if they are victimized online, says Jonas Karlsson.

"It's hopeful, but it also indicates that they are quite annoyed that new tools are constantly being introduced that make the place they spend so much time in more unsafe."

In addition to supporting children who are victimized, adults have a responsibility to explain what is legal and what is not. Jonas Karlsson points out that children who themselves spread AI-generated nude images can be prosecuted for child pornography offenses.

Ecpat's survey about nudify tools was open for children to answer for two weeks in March 2026.

The survey consisted of short questions and a number of vignettes that the children were asked to respond to. The children could also leave open-ended responses. A total of 880 children responded to at least one of the vignettes.

80 percent of the boys and 75 percent of the girls in the survey knew about the tools.

Just over 20 percent of the boys and just over 10 percent of the girls stated that they had either seen such images or created them themselves.

2 percent of the girls stated that someone had created pictures of them with the tools; for the boys, it was less than 1 percent.

55 percent of the girls thought sharing an AI-created image was just as bad as sharing a regular nude image; 47 percent of the boys felt the same.

Any handling of so-called child pornography is, as a rule, a crime in Sweden.

Source: Ecpat report "AI or not — it's an invasion of someone's privacy, or their body"

By TT News Agency. English edition by Sweden Herald, adapted for our readers.
