Back in 1972, when I was a young television reporter, a shocking image appeared on the front pages of the world’s newspapers that helped convince many Americans that the nation had to get its military forces out of Vietnam and bring an end to a war that had been going on for more than a decade.
The image, taken by AP photographer Nick Ut, showed a young girl, naked and in agony from severe burns on her back, running in terror from a U.S. napalm attack on her village in Vietnam. That photo, and a few others from that war, changed many people’s opinions about the war and may even have helped speed up the peace process.
Fast-forward to the present day: Aftenposten, a Norwegian newspaper, published a story about seven images that changed the face of war. Ut’s iconic photo, so powerful that it won the Pulitzer Prize for photography, was among them. Facebook killed it.
When confronted with its decision to take the image down, Facebook said the photo violated its standards of decency. It should be noted that Facebook apparently uses software to screen such posts and make the call, so the decision to censor the photo was automatic and hardly surprising.
But what happened next was not only surprising but contrary to normal practice on social networks and in the media alike. Facebook also removed all discussion of its decision and then blocked people who disagreed with it and had the nerve to say so. One of them was the prime minister of Norway, Erna Solberg, who was blocked for sending a complaint to Facebook CEO Mark Zuckerberg.
Since then, news organizations have been running stories about the issue and suggesting that Facebook was abusing its power; some claimed Zuckerberg was personally responsible for that abuse. Aftenposten published an open letter to Zuckerberg, which most observers expected to have no effect.
But in fact, the letter, and the outrage over Facebook’s censorship, apparently did penetrate the company’s executive ranks. Several hours after the photo was removed from Facebook, it was back. There’s no word on whether the people who complained about the practice have been allowed back on Facebook. I asked Facebook’s media relations staff to explain the change, and a spokesperson sent me a prepared statement.
“After hearing from our community, we looked again at how our Community Standards were applied in this case. An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography,” the statement said.
“In this case, we recognize the history and global importance of this image in documenting a particular moment in time. Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed,” Facebook’s statement continues.
“We will also adjust our review mechanisms to permit sharing of the image going forward. It will take some time to adjust these systems, but the photo should be available for sharing in the coming days. We are always looking to improve our policies to make sure they both promote free expression and keep our community safe, and we will be engaging with publishers and other members of our global community on these important questions going forward,” the statement concluded.
This is, of course, good news. It’s heartening that the company was able to see the error of its ways, at least in this instance.
But Facebook’s actions speak to a much deeper problem now that this social network is trying to present itself as a news source for its subscribers. That problem is Facebook’s apparent assumption that it is the arbiter of what constitutes news.
Before I go on, let me acknowledge that every news organization decides what constitutes news, if only because you can’t print or post everything. But for those of us in the news media, those decisions are made by people who can weigh the importance and implications of running a specific story or photo. At Facebook, the decisions are made by a computer algorithm that operates in an all-or-nothing world.
What’s worse is that when Facebook’s robots make a decision, the service then punishes those who disagree. Imagine what would happen if the editors of my hometown paper, The Washington Post, canceled deliveries to households that complained about coverage. You’d have more than a scandal; you’d have hearings. Yet Facebook apparently sees nothing wrong with punishing those with whom it disagrees.
This move toward punishment is where Facebook shows its true colors. It is not a reliable news source, and it cannot be as long as it tries to silence those who don’t conform to the standards enforced by its algorithms.
One of the realities for journalists and the organizations they work for is that they must remain accountable to their readers and viewers. Even a lapse in accountability is enough to distort the truth we journalists try to report.
To abandon real-world accountability as egregiously as Facebook has can only show that the company, at least as a source of news, is immature at best and blind to its responsibility to the public at worst.
This is too bad. But by pretending to be the protector of its users’ morals, Facebook only demonstrates why it fails at that very task. Facebook must have some standards, but those standards can’t be allowed to short-circuit real news.