Deepfake Software Is Being Used to Turn Social Network Selfies into Indecent Photos
Remember DeepNude, the "one-click undressing AI"? When this deepfake software was released in June last year, it was taken offline within hours, overwhelmed by a flood of access requests. The reason: with a single click it could strip the clothing from photos of women and mass-produce pornographic images. Recently, investigators found that the software has resurfaced on the instant messaging application Telegram. To date, roughly 104,852 pornographic images generated by the software have been publicly posted on Telegram, a figure that does not include images shared privately. More importantly, about 70% of these images were made from photos of real women on social networks, some of them minors. In other words, the selfies people share every day on social media such as Facebook and Weibo can easily be used to generate fake nude photos.
DeepNude targets "someone you may know"
The investigation comes from Sensity, an Amsterdam-based cybersecurity company that focuses on what it calls "visual threat intelligence", especially the spread of deepfake videos. It found that, as of the end of July, seven major channels on Telegram were hosting the DeepNude software. The software had attracted 103,585 users worldwide, with the largest channel accounting for 45,615 of them. Most of these users are concentrated in Russia and neighboring parts of Eastern Europe.
Three factors reportedly explain the software's explosive growth. First, it is extremely simple to operate: a single uploaded photo is automatically turned into a nude image within seconds, with no technical knowledge or special hardware required. Second, it is free for everyone, though it does have a revenue model: images generated by non-paying users carry watermarks or are only partially exposed, while paying users can spend $1.28 to obtain unwatermarked images, limited to 100 photos per seven days. Third, it lets users obtain fake nude images of people they know. Deepfake pornography is nothing new, but it has mainly targeted celebrities; according to a user poll, however, most users are more interested in "women they know in real life", who account for 63% of targets.
DeepNude provides exactly that: any user can generate a nude image from any photo. Judging from the publicly shared images on Telegram, many of the women in them are ordinary office workers and college students, and the photos are selfies taken from platforms such as TikTok, Instagram, and Facebook. Notably, this deepfake software only works on women: if a photo of a man, or even of an inanimate object, is uploaded, the system still overlays it with a female body and renders the corresponding anatomy.
Sensity's chief scientist Giorgio Patrini said that more than 680,000 women have had fake pornographic images of them uploaded to public media platforms without their knowledge. Sensity has now disclosed its investigation data to the public, and law enforcement agencies have opened investigations of their own. So far, Telegram has given no public answer as to why the pornographic software is allowed on its platform. Sensity's investigation also found DeepNude on VK, another Russian social media platform, where it had even begun to be advertised openly.
As soon as the news broke, however, a VK representative responded and denied the claim, emphasizing: "VK does not tolerate such content or links on the platform, and blocks the communities that distribute them. Such communities and links do not use VK's advertising tools for promotion. We also run additional checks to prevent inappropriate content from appearing."
The creator: "I am not a voyeur, just curious"
Ironically, this pornographic deepfake software was originally created out of curiosity and a passion for technology.
DeepNude was first publicly launched on June 27, 2019, and crashed shortly afterwards under excessive user traffic. But in those few hours, this "one-click undressing AI" aimed at women caused an uproar in China and abroad. Accusations of privacy invasion and gender discrimination flooded social media platforms such as Twitter, Reddit, and Weibo. Deep learning pioneer Andrew Ng tweeted that DeepNude is "one of the most disgusting applications of artificial intelligence." Its developer, known as Alberto, publicly responded to the criticism:
"The world is not ready for DeepNude!" "I am not a voyeur. I am a fan of technology." At the same time, he emphasized, “I don’t care about nude photos and the like. Some people don’t want to use them, so they just want to let others not use them. This is stupid. Blocking and censoring technical knowledge cannot prevent its spread and spread. ." It is understood that Alberto's research and development inspiration came from an X-ray glasses in a magazine. He was very interested in AI technology. When he discovered that the GAN network was able to edit photos from day to night, he realized that using GAN can also convert a photo with clothes into a photo without clothes. It is out of such curiosity that DeepNude was finally developed.
Specifically, DeepNude is based on pix2pix, an open-source "image-to-image translation" framework introduced in 2017 by AI researchers at the University of California, Berkeley. Its core technology, the generative adversarial network (GAN), learns from real images to create convincing fakes, for example turning landscape photos from day to night, or from black-and-white into full color. But whatever the developer's curiosity or technical skill, the software is an abuse of AI technology with harmful consequences. The writer Nina Jankowicz, for example, has noted that DeepNude affects women all over the world, especially in socially conservative countries such as Russia: if convincing but fake nude photos are published, victims may lose their jobs or suffer violence from their partners. Sensity likewise warned that such fake pornographic images and videos are likely to be used for blackmail and harassment.
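For readers unfamiliar with "image-to-image translation", here is a minimal sketch of the pix2pix idea in PyTorch, using the benign day-to-night mapping the article itself mentions. The language choice and the tiny networks are assumptions of convenience: the real pix2pix uses a U-Net generator and a PatchGAN discriminator, and all names and shapes below are illustrative stand-ins, not DeepNude's or pix2pix's actual code.

```python
# Sketch of a conditional GAN for paired image translation (the pix2pix idea):
# the generator maps an input image to an output image, and the discriminator
# judges (input, output) PAIRS rather than lone images.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Illustrative stand-in for pix2pix's U-Net: 3-channel image in, 3-channel image out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class TinyDiscriminator(nn.Module):
    """Illustrative stand-in for PatchGAN: scores the concatenated (input, translated) pair."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch-level real/fake logits
        )
    def forward(self, src, tgt):
        return self.net(torch.cat([src, tgt], dim=1))

g, d = TinyGenerator(), TinyDiscriminator()
day, night = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)  # toy paired batch
fake_night = g(day)
adv_logits = d(day, fake_night)                     # discriminator judges the pair
l1 = nn.functional.l1_loss(fake_night, night)       # pix2pix also penalizes pixel distance
```

The key design point is the pairing: because the discriminator sees input and output together, the generator is pushed to produce an output that actually corresponds to its input, not just any plausible image.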
Deepfake: a raging threat
This is true not only of DeepNude but of all Deepfake software. As Danielle Citron, a law professor at Boston University, put it, "Deepfake has become a powerful weapon against women." The core technology behind deepfake software is the generative adversarial network (GAN), which produces realistic fake images, videos, or voices through a contest between a generator and a discriminator. Ever since Deepfake was first used in 2017 to make pornography, in the notorious face-swapped video of Wonder Woman actress Gal Gadot, social networks have been flooded with such videos; by one count, pornography accounts for 96% of all Deepfake videos. As the technology keeps improving, Deepfakes have become indistinguishable to the naked eye and have begun to spread into politics. Fake face-swap videos of celebrities and politicians such as Donald Trump, Barack Obama, and Elon Musk are now commonplace.
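The "contest" between generator and discriminator can be made concrete with a minimal, self-contained training loop. The sketch below is a generic GAN on toy 2-D data, not any specific tool's code; the architectures, learning rates, and the toy data distribution are all illustrative assumptions.

```python
# Minimal GAN training loop: G maps noise to fake samples, D tries to tell
# real from fake, and each update pushes directly against the other.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))  # noise -> fake sample
D = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))   # sample -> real/fake logit

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, 2) * 0.5 + 3.0   # toy "real" data distribution
    noise = torch.randn(32, 16)

    # Discriminator step: label real samples 1, generated samples 0.
    fake = G(noise).detach()                 # detach so only D is updated here
    loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make D label generated samples as real.
    fake = G(noise)
    loss_g = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

As training proceeds, the discriminator's improving judgment forces the generator's outputs toward the real data distribution, which is exactly why GAN fakes grow ever harder to spot.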
According to recent research by Princeton University professor Jacob Shapiro, 96 political influence campaigns were run on social media between 2013 and 2019; 93% of them created original content, 86% amplified existing content, and 74% distorted objectively verifiable facts, mostly in order to smear politicians, mislead the public, and inflame divisions. To counter the threat posed by Deepfake software, Facebook invested heavily in 2019 to launch the Deepfake Detection Challenge (DFDC); this year the best detection accuracy reached 82.56%. Academic groups and other corporate teams are also developing deepfake detection methods. Yet despite the growing number and accuracy of detectors, the production of fake videos has not been curbed. One reason may be that AI-generated fakery keeps getting easier: DeepNude, for instance, needs only a single photo and can be used by anyone. Another, as Sensity's lead noted, is that DeepNude still sits in a legal gray area. Closing Deepfake's Pandora's box will therefore require both institutional constraints and ethical restraint.
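Detection methods like those entered in the DFDC are commonly framed as binary classification on face crops. The sketch below shows that framing by fine-tuning a pretrained ResNet-18; this is a generic illustration under assumed inputs, not the challenge winner's actual model, and the `train_step` helper and its tensors are hypothetical.

```python
# Generic deepfake-detection framing: fine-tune a pretrained CNN backbone
# as a real-vs-fake binary classifier on face crops.
import torch
import torch.nn as nn
from torchvision import models  # assumes torchvision >= 0.13 for the weights API

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)   # replace classifier head: one real/fake logit

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(frames, labels):
    """Hypothetical helper. frames: (B, 3, 224, 224) face crops; labels: (B, 1), 1 = fake."""
    opt.zero_grad()
    loss = bce(model(frames), labels)
    loss.backward()
    opt.step()
    return loss.item()
```

Even with this simple recipe, the hard parts in practice are the ones the article hints at: assembling labeled real/fake data at scale and generalizing to generation methods unseen during training.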
