
In a few short months, the idea of training a neural network to put a celebrity's (or anyone's) face onto a porn performer's body went from scary novelty to something even non-tech-savvy people can do with relative ease. The Reddit user who first gained notice for the technique, "deepfakes," has spawned a whole subreddit dedicated to it, and there's now an app that lets users make their own. For an example of the app in action, there's a very safe-for-work montage of Nicolas Cage's face inserted into famous movie scenes, with varying degrees of success.
But YouTube won’t host blatantly pornographic material, so most of those “deepfakes” were hosted on Gfycat, a tremendously popular site for hosting short, animated image files. But, as noticed by the Next Web, the site is now pulling down images created in the subreddit. New submissions are still going up on Gfycat, but the service appears to be manually removing them, and subreddit users are warning others not to upload there.
Gfycat confirmed as much to Gizmodo, saying: "Our terms of service allow us to remove content we find objectionable. We find this content objectionable and are actively removing it from our platform." (Gfycat, it should be said, does host a large amount of pornographic material not created by AI.)
The bigger issue is the subreddit and what Reddit will do about it. The site has traditionally taken a hands-off approach to pornographic material, except when it involves minors. But Reddit does have a fairly clear policy against porn of other people "taken or posted without their permission" (think: revenge porn). Does eerily realistic fake porn of celebrities, created by a neural network, count?
A majority of states also have laws on the books outlawing revenge porn, but things get complicated when a neural network creates the actual video, the body belongs to a performer, and the face is an amalgamation of hundreds of different pictures of the celebrity in question. It took years for laws to catch up to revenge porn, and AI-assisted fake porn is improving very, very quickly.
Meanwhile, though Gfycat may decline to host the clips in question, plenty of other sites will. The larger problem remains that, in the near future, you could conceivably swap nearly anyone's face onto anyone else's body and have the result appear at least somewhat believable. Fake photographs already circulate on social media with ease; how much worse will it be when video is just as easy to alter?