Facebook Still Playing Way Too Close to the Creepy Line

NEWS ANALYSIS: A social experiment involving big data and manipulation of newsfeeds erodes trust in the world's largest social network.

Facebook has been lurking on the border of Creepland for several years and it's now time for it to pick up and move far, far away. The giant social network is facing some serious consequences if it keeps toeing this line.

And what consequences might those be? How about many thousands, or even millions, of users closing their accounts because they cannot trust Facebook to deliver news and other information from their friends without tampering with it first.

On July 2, Chief Operating Officer Sheryl Sandberg, while on a business trip in New Delhi, leaned in to apologize on behalf of the company for what amounts to a social-networking emotion-control scheme it concocted two years ago.

Facebook admitted manipulating the newsfeeds of 689,000 unsuspecting users, consistently placing negative or positive items at the top of the feed over a period of one week; researchers then monitored, recorded and analyzed members' subsequent postings to see whether they turned correspondingly negative or positive.

Report Unearthed in Professional Journal

The report was published last March in a professional journal, Proceedings of the National Academy of Sciences, and titled "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks." Researchers Adam D.I. Kramer, Jamie E. Guillory and Jeffrey T. Hancock apparently were trying to figure out whether they could make users feel sadder or happier by feeding them continual streams of negative or positive news from friends.

Somebody at Facebook had far too much time on his/her hands. Whose bright idea was this? Of course sad news will make a reader feel sad, particularly if it concerns a friend. And certainly good news will brighten a person's day. But news is supposed to come along as it all takes place, not when the news aggregator—or, in this case, news aggravator—prescribes it.

With this social experiment, Facebook showed that a news service—let's face it, that's Facebook's No. 1 function—that cannot deliver the news cleanly and in a timely fashion without interfering with the natural course of human events can't be trusted. Delivering news is a journalistic function; trust is a news service's key stock in trade. Without trust, there is no journalism; there remains only a web service vendor with an agenda.

If Facebook becomes a mere web service with an agenda, then it loses its mojo and may never get it back.

It's All About Trusting Sources

Trust is also any web service's most important attribute. More than a billion users trust Facebook to hold their personal news and information and to deliver reports to them without hindrance or manipulation. In return, Facebook gets our personal information. If Facebook decides to veer away from the service for which I registered, then I don't know if I want to use it anymore. And I wouldn't be alone.

A few points to consider:

--Firstly, knowing that Facebook can decide the order and placement of users' newsfeed items based on content and emotional weight is not only disturbing, it is creepy and makes a user question Facebook's entire purpose. What other experiments are taking place at Facebook and other sites? For what else are they using my personal information?

--Secondly, this revelation came to light only a couple of days ago, when the Wall Street Journal happened upon the study in an obscure professional journal where it had been published more than three months earlier. It could easily have slipped into the netherworld without ever coming to public scrutiny, and Facebook would never have had to answer for its actions.

--Thirdly, if Facebook doesn't quickly make good (Sandberg has made a start) and show that it can be trusted—although that type of credibility will take a long time to earn—there will be serious consequences, and they could spiral up and torpedo the whole company. Look at how fast ultra-popular networks like MySpace and Friendster went down the tubes.

Not the First Creepy Episode

This is not the first revelation about Facebook and its manipulation of user data. It used to be that power, money and sex were the key corruptors of people; in the 21st century, we can safely add control of data to that list.

In 2011, Facebook changed its Timeline feature to make it possible for strangers to follow you and access your information—even from years ago. That one didn't go over very well among many users, yet it is still in the feature set.

The social network's billion-plus members assumed that when they logged in and began populating status updates, photos and videos, everything started from that point going forward. Not so. Timeline enables users to catalog their life history, from birth to present day, in digital form and make it easily accessible. Users must intentionally opt out if they don't want this feature; some people have trouble doing this, so it becomes a default.

Facebook sees this as valuable because it means it can better fine-tune its social ad algorithms, but it's made many users uncomfortable.

Entering the C-Zone

Companies certainly know when they're entering the C-zone, and Facebook is hardly the only one walking in a problematic area.

Three years ago, Google Executive Chairman Eric Schmidt was in a discussion at the Newseum in Washington, D.C., about the invasiveness of social networks. Responding to a joking question about the possibility of Google developing some kind of neurological implant in the future, Schmidt snarkily replied: "Google's policy is to get right up to the creepy line and not cross it."

Why even get close to the creepy line in the first place? This conversation is far from over.

Chris J. Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 15 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...