‘I felt violated’: Hundreds of deep nudes on forum reveal growing issue

The Feed revealed thousands of explicit images of underage girls and women were being traded on a disturbing international forum. Digital reporter Eden Gillespie has taken another deep dive into the site, focusing on its hundreds of deep nudes.

Source: Getty

In October, The Feed reported that an international forum that hosts thousands of naked images of women and girls had resurfaced after it was shut down by the FBI and Dutch authorities in 2018.

On the forum, users illegally share ‘wins’ - or nude images - of victims without their consent. Many of these victims are under the age of 18.

Some of the images are ‘revenge porn’ or downblousing and upskirting images taken without victims’ knowledge. Others are highly realistic ‘deep nudes’ - fake images, created with AI-powered sites, apps and bots.

The Feed interviewed a woman in her 20s, who discovered deep nudes of herself had been uploaded onto the forum. 

A photograph of the victim in a restaurant, seen by The Feed, appeared to have been altered by deep nude software to make her appear topless. The image, although completely fake, was highly realistic. 

In another image, the victim’s bikini bottoms were removed by deep nude software.
This year, posts about deep nudes soared to more than 419 on the forum. Source: Getty
The victim - who requested anonymity - said she found out about the forum after a friend forwarded her the link.

“When I first saw the images, I broke down. I felt violated,” she said.

“All of the images posted onto the forum were taken from my Facebook and Instagram accounts,” she said.

“Obviously these images were edited. However only myself and those that had seen the original would know that for certain,” she added.
A victim told The Feed she felt 'violated' after discovering deep nudes of herself on the forum. Source: E+
The victim told The Feed that as well as reporting the images to the eSafety commission, she made a statement to police and documented all images, links and time stamps. 

She also set all of her accounts and photo albums to private.

“I was particularly concerned as my first and last name were shared along with the photos,” she said.

“I’ve never seen editing like that before. It looked completely seamless and incredibly professional,” she added.

“I spent hours going through friends and followers lists and removed over 1000 followers and friends on Facebook and Instagram.”
The deep nudes of the victim were deleted from the forum after The Feed's piece. Source: Getty Images
Following The Feed’s investigation, those fake images were successfully removed from the site, with the help of the eSafety Commission. 

The Feed has recently discovered that requests for deep nudes or x-ray images on the site jumped from 10 requests in 2018 to 177 requests in 2019. This year, those requests soared to more than 419.

On one of the forum’s image boards, users ask others to ‘x-ray’ women’s photos to make their clothing see-through or remove it completely. Women’s images have also been photoshopped into vulgar pornographic scenes.
eSafety Commissioner Julie Inman-Grant told The Feed that due to the sophistication of AI, it’s getting increasingly difficult to tell deep nudes from real images.

She said technologists are beginning to develop sophisticated detection tools to identify deep fakes.

“With deep fakes, you might see like blurring or pixelation, particularly around the mouth, eyes or neck,” she told The Feed.

She said other signs are skin inconsistency or discolouration, as well as bad sound for video deep fakes.

Where are these fake nudes coming from?

Last year, a site offering deep nudes apps was taken down just one month after it was launched. In that short time, it amassed more than 500,000 visitors and 95,000 downloads.

According to Sensity - the world’s first visual threat intelligence company - the site’s creators sold the app’s licence on an online marketplace to an anonymous buyer for $30,000. Its software has since been reverse engineered and can be found on torrenting websites.
Ms Inman-Grant said while the images are not real, like all image-based abuse, the impact on victims could be severe. Source: The Feed
Copycat sites have emerged as well as deep nude bots on encrypted messaging apps like Telegram.

One Telegram chatbot, identified by Sensity, shared more than 104,000 images of women altered to appear nude in public channels. 

Through self-reporting by the bot’s users, Sensity found that 70 percent of victims had their photos taken either from social media or from private material.

The Feed has identified seven deep nudes sites and one deep nude Telegram channel, where an admin offers to create deep nudes in exchange for cash. 
Telegram is an encrypted messaging service. Source: Pixabay CC0
A deep nude site told The Feed it had one million images created on its website in a single month and claimed thousands of men have paid them to create more advanced images. 

A spokesperson from the site said they are “sorry” if deep nudes expose victims to abuse and humiliation but “this is the dream that nearly all men have in their mind.”

They said they created the site in March and “are using a mix of technology including state of the art AI model and computer vision techniques” to create fake nude images.

Meanwhile, the Telegram channel identified by The Feed has gained 200 subscribers since it was created less than a month ago. The admin claimed its former channel was blocked by Apple for iOS users.
The Feed asked the admin of the Telegram channel if they had considered the repercussions for potential victims who may have fake images created of them without their consent. 

The admin simply said people would be able to tell they were ‘clearly fake images’. The admin then deleted all messages they had sent to The Feed and blocked us from the deep nude channel.

Image-based abuse is on the rise

Alarmingly, Ms Inman-Grant said, the eSafety Commission has seen a 172 percent increase in reports of image-based abuse since this time last year. 

A survey found that one in three participants had experienced at least one form of image-based abuse.

Ms Inman-Grant told The Feed that legislation around image-based abuse covers deep fakes and threats to share images.
A survey found 1 in 3 Australians have had intimate images of themselves shared without their permission. Source: AAP
In Australia, sharing sexual images of someone under 18 years of age can result in seven years’ imprisonment, and accessing those images can result in 15 years’ imprisonment.

There are also further penalties, which vary depending on each state and territory.

Ms Inman-Grant said the eSafety Commission has an 85 percent success rate with getting image-based abuse taken down. 

She said it’s helped more than 4,800 Australians get their intimate images and videos removed from hundreds of websites, almost exclusively hosted overseas.

Researchers at Monash University and RMIT have found that men are twice as likely as women to be perpetrators of image-based abuse.
The eSafety Commission has seen a 172 percent increase in reports of image-based abuse since this time last year. Source: Getty
When it comes to deep nudes, Ms Inman-Grant points out that most apps and sites only ‘undress’ images of women.

“Women are still disproportionately impacted. A lot of mental and emotional distress is impacted for women, particularly if you come from a conservative background,” Ms Inman-Grant said.

“There are still gender double standards that exist... If you’re a woman showing a bit of cleavage, you’re a woman of ill-repute, it’s like the Scarlet Letter of the digital age,” she said.

Ms Inman-Grant said that we are only just seeing the beginnings of deep fake technology.

“Deep fakes can be used, not just in terms of false pornographic videos or images, it can create fake news, malicious hoaxes,” she said.

“Now, it tends to target well-known people such as politicians or celebrities but increasingly, we'll be seeing it potentially impact everyday people.”

“If your face was artfully morphed onto a porn star’s body and your parents saw it and thought you were engaging in crazy things online, it could be very damaging.”

You can report image-based abuse to the eSafety Commission.

If you or someone you know is impacted by sexual assault or family violence, call 1800RESPECT on 1800 737 732 or visit www.1800RESPECT.org.au. In an emergency, call 000.


Through award winning storytelling, The Feed continues to break new ground with its compelling mix of current affairs, comedy, profiles and investigations. See Different. Know Better. Laugh Harder. Read more about The Feed
Published 9 December 2020 12:14am
By Eden Gillespie