It's no news that digital technology can be used to create fake nude images of anyone, but the problem has exploded with the development of AI. The American magazine Wired has investigated the app Telegram and found around 50 bot accounts that promise to remove clothes from people in images, or even to create images of people performing sexual acts. According to the investigation, the accounts drew more than 4 million users each month.
"It's disturbing that these tools – which destroy lives and create nightmare scenarios, primarily for young girls and women – are so easy to find and openly accessible on one of the largest apps in the world," says an expert Wired spoke with.
Several well-known individuals have recently been targeted by publicly circulated fake nude images, so-called deepfakes. Among them are the artist Taylor Swift and Italy's Prime Minister Giorgia Meloni. But the images that AI bots on Telegram "undress" can just as easily depict someone's partner, colleague, daughter, or classmate.
The majority of the bots, and the channels used to market them, were shut down by Telegram after Wired contacted the company for comment.
However, several similar accounts have emerged since then.
Telegram has not commented on the existence of AI bots on the platform.
Telegram's founder, Pavel Durov, was recently detained in France, accused of allowing criminal content on the app, including child abuse material, drug and weapons trafficking, and incitement to violence and terrorism.