This afternoon, Facebook announced that it is working to revamp the social network’s community guidelines in the wake of sustained criticism of their enforcement. Weeks ago, the site faced an embarrassing situation after it removed the famous Vietnam War photo widely known as “Napalm Girl.” Facebook’s concern stemmed from fears of violating region-specific norms on child pornography, but the site quickly learned that a one-size-fits-all approach doesn’t work for controversial and arguably newsworthy subject matter.
“Observing global standards for our community is complex. Whether an image is newsworthy or historically significant is highly subjective,” Facebook vice presidents Joel Kaplan and Justin Osofsky wrote in a blog post today. “Images of nudity or violence that are acceptable in one part of the world may be offensive — or even illegal — in another.”
In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards. We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.
This is admirable — Facebook accepting, and taking responsibility for, the enormous power it wields. At the same time, it’s kind of worrying. This is what happens when Facebook gets to be the filter through which a vast proportion of internet users get their news and assorted media. “We’re going to begin allowing more items that people find newsworthy.” Was that ever … not going to be the case? “Newsworthy” is by definition a synonym for “of public interest.” That “newsworthy” posts were ever subject to removal should give pause to news organizations that rely heavily on Facebook for traffic and, in turn, revenue. And not just news organizations: It should give pause to everyone, considering how important Facebook has become as a source of news and information for its more than a billion users.
Today’s announcement follows a sustained internal debate at Facebook over how heavily the company should monitor and intervene on behalf of the quote-unquote Facebook community. According to The Wall Street Journal, some employees argued that Republican presidential candidate Donald Trump’s call to ban Muslims from entering the country constituted hate speech. The post appeared to violate Facebook’s guidelines, but the company decided to leave it up in an effort to remain impartial during election season.
During one of Mr. Zuckerberg’s weekly town hall meetings in late January at the company’s Menlo Park, Calif., headquarters, a Muslim employee asked how the executive could condone Mr. Trump’s comments. Mr. Zuckerberg acknowledged that Mr. Trump’s call for a ban did qualify as hate speech, but said the implications of removing them were too drastic, according to two people who attended the meeting.
The news of Facebook’s striving for political neutrality comes on the heels of reports earlier this year that it suppressed conservative news outlets and topics in the site’s trending section, and of Mark Zuckerberg’s defense of Trump donor and Facebook board member Peter Thiel, in which Zuckerberg called for more openness to “diverse” opinions on the site.