App which used algorithm to ‘undress’ women and create fake nudes shut down

‘Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public,’ says campaigner

Maya Oppenheim
Women's Correspondent
Saturday 29 June 2019 18:55 BST
The $50 (£40) DeepNude app has faced heavy criticism and has been accused of objectifying women (Getty Images)

An app which used a machine learning algorithm to digitally “undress” images of women wearing clothes to create fake nudes has been taken offline.

The $50 (£40) Deepnude app faced heavy criticism and has been accused of objectifying women.

DeepNude used artificial intelligence to create the “deepfake” images – showing realistic estimates of how a woman might look if she were naked. The app was not designed to work on men.

Deepfake images and clips often appear credible to the average viewer, and many have raised alarm about their potential to mislead members of the public.

The controversial app’s developers have now removed the software from the web – saying “the world is not yet ready”.

“The probability that people will misuse it is too high. We don’t want to make money this way,” DeepNude said in a message on their Twitter feed.

The developers said those who bought the app, which was available for Windows and Linux, would receive a refund.

They also asked people who had a copy not to share it. However, the app will still work for anyone who already possesses it.

One campaigner against “revenge porn” – defined as the sharing of private, sexual photos or videos of another person, without their consent and with the purpose of causing embarrassment or distress – branded the app “terrifying”.

“Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public,” Katelyn Bowden, founder of anti-revenge porn campaign group Badass, told tech news site Motherboard.

After the outlet published a story on the app, the server for DeepNude crashed, prompting it to announce that it was offline because “we didn’t expect this traffic”.

The app later tweeted: “Here is the brief history, and the end of DeepNude. We created this project for users’ entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner.

“Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic.”

California is considering a bill that would make pornographic deepfake images illegal, which would make it the first state to take legislative action against them.
