“Undressing” app similar to DeepFake, DeepNude widespread on Telegram messaging app

An undressing app similar to the 2019 DeepNude and the 2017 Deepfake hoax is now in widespread use on the messaging app Telegram and was also found on the Russian social media platform VK. These undressing apps use artificial intelligence (AI) to create fake nudes.

Sensity AI, previously known as Deeptrace Labs, is a cybersecurity research company whose primary focus is the abuse of manipulated media. They performed an investigation and issued a report.

Undressing app discovered on Telegram messaging app

On July 11, 2019, a new undressing app was launched on Telegram, only a couple of weeks after the creator of an AI software tool called DeepNude voluntarily pulled it offline.

Sensity AI discovered the AI-based software was being publicly offered as a bot on Telegram. Unlike its 2019 DeepNude predecessor, it had a simple user interface.

A community of users developed around a central channel that hosts the bot. It is allegedly connected to seven Telegram channels dedicated to technical support and image sharing. The image-sharing channels include interfaces that people can use to post and judge their nude creations.

The seven Telegram channels hosting the undressing bot have a combined total of over 100,000 members. Although there may be duplicate memberships across channels, the main group has more than 45,000 unique members.

Any Telegram user could send a photo to the bot and receive a faked nude back minutes later. This is a free service, although users are able to pay 100 rubles ($1.50 USD) for perks such as removing the watermark on the faked nude photos. 

At least 100,000 women, often young girls, became victims of the undressing app

As of July 2020, the bot had already been used to target and create unclothed fakes of at least 100,000 women, the majority of whom had not given consent and were not aware that their privacy had been invaded.

“Usually it’s young girls,” said Giorgio Patrini, CEO and chief scientist of Sensity and a co-author of the report. “Unfortunately, sometimes it’s also quite obvious that some of these people are underage.”

The Telegram community appears to have steadily grown in membership in 2020.

A poll of 7,200 users shows that roughly 70% of them are from Russia or Russian-speaking countries. The victims appear to come from many countries, including Argentina, Italy, Russia, and the United States. 

Sensity AI researchers reached out to Telegram and to the FBI. As of October 20, 2020, Telegram had not responded.

Sensity AI also found a fake nude app on the Russian social media platform VK. An ecosystem of 380 pages hosted the creation tool and AI algorithm used to create and share deepfake nudes.

Sensity AI reached out to VK and appropriate law enforcement. VK responded with a statement, “VK doesn’t tolerate such content or links on the platform and blocks communities that distribute them. We will run an additional check and block inappropriate content and communities.”

The DeepNude AI software tool appears in 2019

On June 27, 2019, a software tool named DeepNude first gained attention. On Twitter, it was advertised as “The superpower you always wanted.” 

The app was available for Windows as a free download, while a premium version cost $99 and offered higher-resolution output images.

The DeepNude software took a photo of a clothed person and created a naked image of that person. The app only worked on women’s bodies and worked best on images that already showed a lot of skin.

This app used neural networks to remove clothing from the images of women, to create realistic nudes.

Both the free and premium versions added watermarks to the AI-generated nudes, clearly identifying them as “fake.” But the watermarks were easy to remove.

The DeepNude app disappears (almost)

With thousands of downloads in a short period of time, the site was crashing and some customers were complaining that the app wasn’t working properly.

The app’s creator, who gave his name to a Verge reporter as “Alberto,” made a series of announcements on Twitter. He first announced that the app was down because of server overload.

Then, by the end of the day, he acknowledged that people might misuse the app. The developer said he had created DeepNude because he thought someone else would soon make the app if he didn’t.

He said that his team “will quit it for sure” if the app was being misused.

He finally announced that they were pulling the app, declaring, “The world is not yet ready for DeepNude!”

The site is still up, although the download page has been removed. It asks for donations in Bitcoin or Ethereum.

On June 27, 2019, DeepNude disappeared, but it may have moved directly to the messaging app Telegram.

The USA Herald discovered, while researching this story, that there is currently a potentially active DeepNude site linked to an August 2019 video on YouTube. The site claims to have 135,000 downloads.

The “deepfake” creator uses AI to make porn in 2017

The term “deepfake” comes from the name used by the app maker who posted hardcore fake porn videos on Reddit in 2017. He refused to give his real name.

He built a system on open-source software such as Keras with TensorFlow. Using deep learning methods and source material drawn from image searches, stock photos, and YouTube videos, the app maker created fake porn.

He wasn’t the first to doctor images to make it look like famous people were posing nude. But he may have created some of the most realistic fakes that AI technology could produce at the time.

Victims of the fake celebrity porn posted on Reddit include Scarlett Johansson, Taylor Swift, Maisie Williams, Aubrey Plaza, and Gal Gadot.

With deep learning, networks of interconnected nodes autonomously run computations on input data. Through “training,” the nodes eventually arrange themselves to complete a particular task. In this case, the algorithm was trained on porn videos and celebrities’ faces, and the task was manipulating video to create convincing fakes.