DeepNude

A controversial AI app that generated fake nude images from photos

Description

DeepNude appeared in 2019 and quickly drew widespread attention in the press and on social media. The developer, who identified himself as Alberto, released software that used computer vision to “remove” clothing from photos of women by generating fake body parts.

What it was

DeepNude was distributed as an executable for Windows and Linux. It was based on pix2pix, a conditional-GAN image-to-image translation algorithm developed at UC Berkeley, and was trained primarily on images of nude women, which shaped what the model could generate.

Key details
  • Worked mainly on photos of women because the training data was mostly female.
  • Required a single clothed photo (often full-body or swimwear for better results).
  • Processing reportedly took about 30 seconds on a typical computer.
  • The free version covered the output with a large watermark; the $50 paid version removed it but stamped a small “FAKE” label in one corner.
  • Attempts to process men could produce incorrect outputs, reflecting the lack of male training data.

Public reaction and shutdown

Major outlets (including Vice, BBC, and The Verge) tested the app and noted that higher-resolution, unobstructed images looked more realistic. Digital rights groups and forensics experts warned about misuse, including non-consensual sexual imagery and “revenge porn.” Soon after the backlash, Alberto removed official download links, though copies and clones continued circulating online.

Summary

  • Website: t.me
  • Published: 2025/03/25
