Deepfakes are incredibly controversial, and this week that controversy was dialled up to 11 following a report by Vice.
That report detailed an application that uses artificial intelligence to turn clothed photos of women into nude images.
That application is known as DeepNude.
The Vice report thrust the application into the limelight and, as you might expect, criticism spread like wildfire. That said, for all the criticism, there was also a great deal of interest in the application.
So much so that since the Vice report, the creators reported several server outages due to an influx of traffic.
“Hi! DeepNude is offline. Why? Because we did not expect these visits and our servers need reinforcement,” the creators tweeted.
“We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days.”
But DeepNude will not be back, at least not from this developer.
“Despite the safety measure adopted (watermarks) if 500 000 people use it [DeepNude], the probability that people will misuse it is too high. We don’t want to make money this way,” the creators said.
How exactly an application that removes the clothing of women without their consent is meant to be used properly is beyond us, and this entire situation smacks of somebody creating something without considering the consequences it might have.
The use of a watermark as a safety feature is almost laughable considering how easily these can be removed or – in the particular case of DeepNude – cropped out.
The creators have acknowledged that some copies of the application will make it onto the web, where they will be shared, and have attempted to convince folks not to download these copies.
“Downloading the software from other sources or sharing by any other means would be against the terms of our website,” the creators said.
What exactly will happen to you should you go against the terms of the website (which you can't actually see anymore) is unclear.
The DeepNude team has opened a Pandora’s box and unfortunately for them, killing the app after gaining so much publicity might be too little too late.
— deepnudeapp (@deepnudeapp) June 27, 2019