Could Affirmative Consent Help Stop Revenge Porn?


Affirmative consent — that is, defining sexual consent as “yes means yes” rather than “no means no” — has been the guiding principle in new campus assault policies from California to New York. By recasting consent as a definitive act rather than a nebulous, nonverbal feeling, supporters hope to eliminate the gray area that so often surrounds sexual encounters. Now, some cyber-civil-rights activists are hoping to use that same logic to fight nonconsensual pornography. Though a federal law banning revenge porn is in the works and tech companies are reshaping their policies to address the issue, activists believe an affirmative-consent approach could provide a stronger framework to protect future victims. Shifting the burden of consent from the subjects of naked photos to the uploaders of them would make legal battles waged by victims much easier to win.

An affirmative-consent model for nude photos could be applied either through the legal system or by lobbying tech companies to further adjust their policies. Legally, advocates have suggested adding an informed-consent requirement to 18 U.S.C. § 2257, which already requires producers of pornography to obtain documentation certifying that performers are over 18. With an affirmative-consent standard in place, anyone who created pornographic imagery would be required to obtain the written consent of the performer and could face prison if they didn't. (This standard would also protect amateur and professional porn actors from being forced to perform under coercion or duress.)

But convincing the tech world to tackle the problem of nonconsensual pornography might be the easier bet, as Google's decision last month to remove revenge porn from search results suggests. An affirmative-consent approach instituted by tech companies might look like this: When a user uploads a photo to a platform like Instagram — where Carrie Goldberg, a New York lawyer who represents revenge-porn victims, says all photos inevitably end up — an algorithm designed to detect nudity would trigger a pop-up box. The box would ask whether the uploader has the consent of every subject in the photo to share it, and in order to post, they would have to actively check yes. Not only would this put the legal burden on the uploader to provide proof of consent, should the photo's subjects sue them for privacy invasion, but it could also stop a lot of revenge porn from ever hitting the platform in the first place. Nonconsensual pornography is "often committed impulsively and in the heat of bitterness or jealousy," said Mary Anne Franks, a law professor and vice president for the Cyber Civil Rights Initiative.
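The flow described above is still hypothetical, but it can be sketched in a few lines. Everything here — the `consent_gate` function and the classifier output it consumes — is an illustrative assumption, not any platform's actual code:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UploadDecision:
    posted: bool
    reason: str

def consent_gate(contains_nudity: bool, consent_checked: Optional[bool]) -> UploadDecision:
    """Sketch of an affirmative-consent gate for photo uploads.

    contains_nudity: output of a (hypothetical) nudity classifier.
    consent_checked: None until the pop-up is answered, then True/False.
    """
    if not contains_nudity:
        # No nudity detected: the photo posts without a pop-up.
        return UploadDecision(True, "no nudity detected; no pop-up")
    if consent_checked is None:
        # The pop-up has been shown but not answered; nothing posts yet.
        return UploadDecision(False, "pop-up shown: awaiting an active 'yes'")
    if not consent_checked:
        return UploadDecision(False, "uploader declined to affirm consent; photo blocked")
    # The checked box is logged as an affirmative act, shifting the
    # burden of proof onto the uploader if a subject later sues.
    return UploadDecision(True, "posted, with the consent affirmation logged")
```

The point of the design is that silence can never publish a flagged photo: the only path to posting runs through an explicit "yes."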

There is precedent for this paradigm: Franks points to a similar model, adopted by the anonymous message board Yik Yak, that triggers a pop-up any time a user posts a message containing keywords suggesting the post could be offensive or violent. "Pump the brakes, this yak may contain threatening language," the message reads. "Now it's probably nothing and you're probably an awesome person but just know that Yik Yak and law enforcement take threats seriously. So you tell us, is this yak cool to post?"

(Of course, that strategy hasn’t entirely curtailed Yik Yak’s abusive content; in May, a Virginia Tech senior was arrested for allegedly threatening a mass campus shooting.)

Franks and the other members of the Cyber Civil Rights Initiative are hoping to convince tech companies that switching to an affirmative-consent model is not only the right thing to do for their users but will also make them less liable for the illegal content posted to their platforms.

“From a litigation perspective, if we have to sue the uploader in civil court, the uploader cannot say it was an accidental posting because there was an affirmative act that was taken to make it an intentional moment,” said Goldberg. “It removes some culpability from internet providers because it’s in the hands of the uploader, and if they’re lying then it’s on them.”

There's just one snag: many of those who peddle nonconsensual pornography are anonymous, which could make the existence of affirmative consent harder to prove. To skirt this issue, Goldberg suggests allowing only verified accounts to upload explicit photos, a tactic that might help decrease nonconsensual pornography but one tech companies may be wary of, since it could pose a free-speech problem.

Though it may be tougher to convince behemoths like Google to switch to an affirmative-consent model, smaller, curated pornography hubs have found some success with it. Zivity, a community for artistic nude photography, empowers the models in each photo to determine how and where their portraits can be distributed, bucking the trend of placing artistic control entirely in the hands of photographers.

"Models have the power to approve, reject, and after many levels of consent, remove their photos after two years," Zivity CEO Cyan Banister said. The site allows delayed removal of photos, Banister added, as an acknowledgment that people change their minds about what they do and don't want online. This borrows from another tenet of affirmative-consent logic — that a "yes" to one question does not guarantee a "yes" to another.

“The models LOVE our policies,” Banister said. “It definitely makes them feel safer and gives them the control they desire and that we feel is important.”
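Banister's description leaves the exact mechanics vague, but the underlying idea — that consent to publish is revocable, not permanent — can be sketched. This is a loose illustration, not Zivity's actual system, and it assumes (hypothetically) that a removal request is honored after a fixed delay rather than immediately:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Assumed delay between a removal request and the photo coming down;
# a stand-in for the two-year window Banister mentions.
REMOVAL_DELAY = timedelta(days=2 * 365)

@dataclass
class PhotoConsent:
    """Per-photo consent state controlled by the model, not the photographer."""
    approved: bool = False                       # model's active 'yes' to publish
    removal_requested_on: Optional[date] = None  # set when the model changes her mind

def is_visible(consent: PhotoConsent, today: date) -> bool:
    """A photo is shown only with the model's standing approval, and a
    removal request eventually overrides an earlier 'yes' — consent to
    publish once is not consent forever."""
    if not consent.approved:
        return False
    if consent.removal_requested_on is None:
        return True
    return today < consent.removal_requested_on + REMOVAL_DELAY
```

The asymmetry is deliberate: publication requires an affirmative act, while withdrawal requires only a request.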