The right to be forgotten is the idea that a person should be able to free themselves from being stigmatized for events that happened in the past.
It sounds like a great idea.
For example, let's say a person gets drunk as a teenager and causes an accident. The local papers cover the accident, the trial and the sentencing extensively.
That person may go through all the required programs, pay their debt to society, permanently quit drinking, start a family, launch a business, become active in programs that help young people avoid drunk driving and so on.
But then 10 or 20 years after the accident, a Web search on that person's name may still bring up mostly stories about the car accident. Despite all the person's efforts, that imbalance of links may affect personal and business relationships and unfairly paint a productive person who's now devoted to preventing drunk driving as a reckless drunk driver.
The concept has been applied to search engines by law in the European Union, Argentina and elsewhere. But activity around the right to be forgotten in the EU has brought it front and center as a hot-button topic in technology circles.
Unfortunately, the way the EU is building the concept into law is the single greatest threat to the Internet of this decade. In fact 2014 may go down in history as the year Europe ruined the Internet.
The current debate started when the Luxembourg-based European Court of Justice ruled against Google in a right to be forgotten case. A Spanish man named Mario Costeja González wanted Google to remove a link to an article published in 1998 about his debt and home foreclosure. He had paid his debt, but a Google search linked prominently to a now-outdated article stigmatizing him as a debtor with financial problems.
The ruling not only required Google to remove the stigmatizing results from searches for González's name, but also obliged all search engines to establish policies, practices and resources so that anyone can petition to have similarly stigmatizing content removed from the results of searches for their names.
Under the ruling, the right to be forgotten applies only to the search results that appear for a specific person's name. There is also a long list of situational rules that attempt to prevent the system from being abused by public figures, politicians, criminals and anyone who simply wants to make their search results look better.
The policy is an attempt to keep the fact that the Internet never forgets from violating the individual's right to privacy.
Unfortunately, this policy does far more harm than good.
The right to be forgotten amounts to censorship.
The American Library Association defines censorship as follows: "The change in the access status of material, made by a governing authority or its representatives. Such changes include: exclusion, restriction, removal, or age/grade level changes."
That's not to say that the EU's right-to-be-forgotten censorship is perfectly comparable to censorship in, say, China. In that country, censorship of Internet references to people (like the Dalai Lama) and events (such as the 1989 crackdown on Tiananmen Square protesters) is designed to suppress support for political opinions other than those approved by the Chinese Communist Party. Censorship there is used as a form of political repression.
Right-to-be-forgotten censorship is falsely thought to be similar in kind to French and German censorship of Nazism or Holocaust denial, which is intended to protect minorities or individuals.
Yes, both kinds of censorship are designed to protect the rights of people (rather than the exclusivity of political parties). What's different about right-to-be-forgotten censorship is that it's only for search engines. Other forms of European censorship are applied to all media, including books, newspaper articles and so on. Right-to-be-forgotten censorship erases only content in search results—it outlaws links to legal content.
In the EU way of thinking, it's OK to censor search engines because they're new and therefore not specifically covered by free speech laws; because it's politically safe, since regulators would never get away with censoring newspapers over the same content; and because it's easy to enforce, since there are far fewer search engines to go after than newspaper and book publishers.