Norway is really pissed at Facebook.
This week, the world’s largest social network banned an iconic photo taken during a napalm attack in the Vietnam War, because it includes a naked nine-year-old girl. Facebook claimed the photo violated its ban on nudity, especially child nudity. When Norway’s largest newspaper, Aftenposten, reported the ban and included the photograph in its story, Facebook banned the story too. So the paper published an open letter to Mark Zuckerberg, accusing him of abusing his power. And then the Norwegian Prime Minister got involved. She tried to post the photo to Facebook, accusing the company of censorship and of curbing freedom of expression, and Facebook deleted that too. […]
Facebook likely relies on a combination of algorithms and human labor, much of it provided by contractors, to block pornography, videos of beheadings, and other unsavory things from the site. But what those algorithms look for, and what policies its human moderators follow, remains a mystery. Yes, it has a community standards page that explains that hate speech, nudity, and graphic violence are banned. But who exactly decides whether, say, photos of dead soldiers count as graphic violence, or whether an iconic photo violates the nudity ban? What criteria do they use to make that decision? How much is human and how much is tech? What role do the algorithms really play? […]
While Zuckerberg himself probably isn’t sitting around his office dictating which breastfeeding photos to ban, he still bears ultimate responsibility for Facebook’s policies. Facebook may not be a news organization, but it is definitely an editorial organization, and an extremely powerful one at that. And as Facebook’s influence over what we see online grows, so too does the need to hold it accountable.