Things really do last forever on the Internet, especially the smut.

The latest example is DeepNude, an app that used artificial intelligence to transform innocuous photos of women into porn. Code-sharing website GitHub recently removed the app's code from its platform, according to Vice’s Motherboard.

The app, which used a type of machine learning called a generative adversarial network (GAN) to replace clothing with simulated nude body parts, was available for four days before its own creators shut it down in late June. The anonymous developers tweeted: “The world is not ready for DeepNude.”
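
To illustrate the underlying idea only, here is a minimal, hedged sketch of GAN training on toy 1-D data: a generator learns to produce samples a discriminator cannot tell apart from real ones. This is purely illustrative and bears no relation to DeepNude's actual model, which is a far larger image-to-image network; all names and numbers below are arbitrary choices for the sketch.

```python
import numpy as np

# Toy GAN: generator g(z) = w_g*z + b_g tries to mimic samples from N(3, 0.5);
# discriminator sigmoid(w_d*x + b_d) tries to score real high and fake low.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w_g, b_g = 0.1, 0.0   # generator parameters
w_d, b_d = 0.0, 0.0   # discriminator parameters
lr = 0.05

for step in range(2000):
    real = rng.normal(3.0, 0.5, size=32)   # samples from the "real" data
    z = rng.normal(0.0, 1.0, size=32)      # noise fed to the generator
    fake = w_g * z + b_g

    # Discriminator update: binary cross-entropy gradients that push
    # scores of real samples toward 1 and scores of fakes toward 0.
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    grad_w = np.mean((d_real - 1.0) * real) + np.mean(d_fake * fake)
    grad_b = np.mean(d_real - 1.0) + np.mean(d_fake)
    w_d -= lr * grad_w
    b_d -= lr * grad_b

    # Generator update: minimize -log D(G(z)), i.e. make fakes look real.
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    g_grad = (d_fake - 1.0) * w_d          # chain rule through the discriminator
    w_g -= lr * np.mean(g_grad * z)
    b_g -= lr * np.mean(g_grad)

# After training, the generator's output mean (b_g) should have drifted
# toward the real data's mean of 3 -- the adversarial game at work.
```

The same adversarial setup, scaled up to deep convolutional networks over pixels instead of two scalars over a number line, is what lets GAN-based apps synthesize plausible-looking image content.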

However, other developers reverse-engineered DeepNude and published its source code on GitHub. Last week, GitHub banned copies of the code for violating the sexually obscene content section of its community guidelines.

The ban won’t eliminate the code from the Internet, as it spreads easily from developer to developer, but it may greatly limit the code’s reach.

DeepNude is an example of the damage done by so-called deepfakes, which use AI to manipulate images and create realistic fakes. Celebrities, politicians and business people, including Mark Zuckerberg and Kim Kardashian, have been targeted. Everyday people have also become victims of deepfake porn, or “fakeporn”, a phenomenon exacerbated by DeepNude.

  • Deepfake technology recently gained media attention once again with FaceApp, a photo-manipulation app that uses AI to age a person in a selfie. FaceApp has come under criticism for its alarmingly lax user agreement and its Russian developer.
  • “The circulation of deepfakes has potentially explosive implications for individuals and society. Under assault will be reputations, political discourse, elections, journalism, national security, and truth as the foundation of democracy,” Danielle Keats Citron, a University of Maryland law professor, told a House committee meeting in June.
  • On July 1, the state of Virginia amended a law to classify revenge porn and falsely created videographic or still images as Class 1 misdemeanors. Other states, including New York, California, Massachusetts and Texas, have introduced bills addressing deepfakes.
  • Karma Take: Use of artificial intelligence to create deepfake videos and falsified pornography may hamper AI advancement as governments and regulators face pressure to restrict the technology in the interest of Internet safety.