Hi, y’all! How have you guys been holding up? I hope the WFH policy is treating y’all well.
Today, we’re going to look at a distressing deepfake tool that’s becoming popular on Telegram, the messaging app.
Want to know what this deepfake tool does? It creates fake nudes from pictures of clothed people. The tool has been found producing nude images of women - and of girls who clearly look underage - from pictures taken from their social media accounts, and sharing those images on Telegram.
Distressing, right? Let’s learn more about the life-ruining capabilities and characteristics of this deepfake tool.
Hold up, what’s ‘deepfake’?
Deepfake is the 21st-century version of Photoshopping. Using a branch of artificial intelligence (AI) called deep learning (DL), deepfakes can generate images that portray events which never happened - hence the name ‘deepfake’.
Though the motivation behind this technology is not always clear, many applications of deepfakes are pornographic in nature. In September 2019, the AI firm Deeptrace found 15,000 deepfake videos online - nearly double the number from nine months earlier. Of those, 96% were pornographic, and 99% of those mapped the faces of female celebrities onto porn performers. With the latest techniques allowing unskilled people to create deepfakes from just a few photos of a victim, fake videos and images can only be expected to spread beyond the celebrity world and affect ordinary people.
Deepfakes don’t stop at photos and videos. Audio can be deepfaked, too. In March 2019, the chief of a UK subsidiary of a German energy firm paid nearly £200,000 into a Hungarian bank account after being phoned by a fraudster who mimicked the German CEO’s voice. Though the evidence isn’t conclusive, the company’s insurers believe the voice on the call was a deepfake. In other, similar incidents, recorded WhatsApp voice messages have been exploited to create audio deepfakes.
So, now that we understand what a deepfake is, let’s get into today’s story!
What has happened?
A recent report uncovered a deepfake bot on Telegram that creates fake nudes of girls and women from photos they share on their social media accounts. According to Sensity, these fake nudes are produced by a relatively simple AI model that virtually removes clothes from a supplied photo and predicts what the body underneath would look like.
This bot has been around for a while now. Over the span of a year - between July 2019 and July 2020 - more than 100,000 non-consensual sexual images of 10,000 women and girls were shared online, all of them the result of this bot’s handiwork. Analyzing where the victims’ photos came from, Sensity says the majority were taken from the social media accounts of private individuals. Disturbingly, all of the victims were female, and some of them looked ‘visibly underage’.
Figure 1: Straightforward steps to create deepfakes
‘Deepfake porn’ isn’t new, y’all
Yup, you read that right!
Last year, a tool called DeepNude was released. I think it’s a no-brainer as to what this tool does - and this bot is suspected to be built on DeepNude’s technology.
DeepNude was launched online and, by all accounts, was an intricate AI service that wasn’t easy to use. Its purpose was to predict what a person’s body would look like if the clothes they were wearing in an image were removed.
This sinister tool was pulled from the Internet within 24 hours of its launch, but Sensity suspects that the bot we have at hand is built on a cracked version of DeepNude. What makes this bot particularly terrifying is how easy and straightforward the process is for an unskilled person: upload an image (in this case, of a woman or girl), click a few buttons, and the bot uses its ‘neural network’ to determine what would be under the clothes and produce a nude.
More about the bot and its network
The bot, which has no name, runs on Telegram. Its administrator, who goes by ‘P’, told the BBC that the service was purely for entertainment and that it 'does not carry violence'. According to Sensity, the bot’s network - where the images are produced and shared - has over 100,000 members, most of them based in Russia or Eastern Europe.
This is what P has said to the BBC:
“No one will blackmail anyone with this, since the quality is unrealistic …. any underage images are removed and the user will be blocked for good.”
*User = Member of the bot network who uploaded the fake nude image
About 70% of all the images fed into the bot came from social media or private sources - such as pictures of friends or people the users know. In other words, the victims are being preyed on by people in their own social media circles or by people they know personally.
Giorgio Patrini, CEO of the deepfake-research company Sensity, said:
“As soon as you share images or videos of yourself and maybe you're not so conscious about the privacy of this content, who can see it, who can steal it, who can download it without you knowing, that actually opens the possibility of you being attacked.”
The bot was reportedly advertised predominantly on VK, a Russian social networking service. VK, however, has said it does not tolerate such content and removed the ads upon discovering them.
Patrini went on to express his concern about this deepfake technology:
“Many of these websites or apps do not hide or operate underground, because they are not strictly outlawed. Until that happens, I am afraid it will only get worse.”
Experts fear that bots like this could be used to extort women as deepfake technology improves further. The fear is well founded: deepfakes have typically been used to smear celebrities or politicians, but this bot and its users are preoccupied with people they know.
What has been done so far to address this issue?
A Sensity survey of bot users found that 63% of them were using the bot to satisfy their curiosity about what women look like underneath their clothes.
The authors of the Sensity report have already shared their findings with law enforcement agencies, VK, and Telegram, but have yet to hear back from any of them on this issue.
When speaking to the BBC, this is what Nina Schick, author of Deep Fakes and the Infocalypse, had to say:
“Our legal systems are not fit for purpose on this issue. ... Society is changing quicker than we can imagine due to these exponential technological advances, and we as a society haven't decided how to regulate this. ... It's devastating, for victims of fake porn. It can completely upend their life because they feel violated and humiliated.”
That’s it for the blog today, y’all! Feel free to drop comments and share this blog if you found it informative.
Stay safe and stay tuned.
Until next time, friends!
Credits: DailyMail UK & The Guardian