European Union privacy regulators may ask Google to extend users’ “right to be forgotten” to its Websites outside the EU as well.
Regulators meeting in Brussels, Belgium, Nov. 26 have prepared a proposal that would require Google to apply the EU privacy obligation—which gives EU citizens the right to ask Google to remove content—to its main Google.com site in the United States and to other sites viewable from the EU, Bloomberg Businessweek reported today.
The decision apparently is rooted in concerns that information blocked by Google in the EU will still be accessible to Internet users there simply by visiting Google search sites in other countries, Bloomberg said, quoting unnamed sources. If the proposal is approved, all search engine companies, not just Google, would be required to abide by it.
Isabelle Falque-Pierrotin, who chairs the Article 29 Working Party, the EU's data protection advisory body, is expected to present the guidelines later today, possibly with some modifications, the Bloomberg report noted.
A Google spokesman said the company hasn’t seen the EU Article 29 Working Party’s new guidelines yet. “But we will study them carefully when they’re published,” he said.
Marc Rotenberg, president of the Electronic Privacy Information Center (EPIC), said the proposal that is reportedly being considered by the EU makes sense.
“This is a logical and sensible request from the European Union since Google is the entity that gathers the personal data and chooses to make the subsequent disclosure,” he said in emailed comments.
“It would make little sense to allow Google to publish in domains outside of Europe private facts concerning EU citizens that should be removed from Google search results.”
The more interesting question now is how Google will respond to the growing expectation that the company will recognize a similar legal right in the United States and other countries, he said.
In May of this year, the Court of Justice of the European Union held that European privacy law gives citizens the right to ask Internet search engine companies like Google to remove search results pointing to inaccurate, outdated or incomplete data about them.
The Right to Be Forgotten decision arose from a lawsuit filed by an individual in Spain who wanted Google to remove search results pointing to two articles in a Spanish-language newspaper from 1998 that mentioned his name in connection with the recovery of social security debts.
Since the European court ruling this May, Google says it has received more than 174,000 right-to-be-forgotten requests from EU citizens and has evaluated some 602,000 URLs for removal. So far, the company has removed 42 percent of the URLs that people have asked it to remove and is in the process of working through the remaining requests.
The removal requests have involved a wide range of content, including criminal records, embarrassing photos, slander, online bullying, negative press mentions and content pertaining to sexual crimes, Google has noted.
Google has maintained that while it wants to be respectful of EU law, the right-to-be-forgotten obligation is a new and difficult challenge for the company. It requires Google “to weigh, on a case-by-case basis, an individual’s right to be forgotten with the public’s right to information,” Google’s Advisory Council on the Right to be Forgotten has noted. “We want to strike this balance right.”
This week’s proposal, if adopted, would extend Google’s obligation to remove content at the request of EU users to its main Google.com site as well. It is unclear how the company will respond to the new development or even what its legal obligations will be under the new proposal.
Either way, the company is likely going to have to find a way to respond to the issue quickly because there are signs of similar demands from countries outside the EU as well.
In October, for instance, a court in Tokyo ordered Google to remove about 120 search engine results pointing to articles hinting at a certain individual’s involvement in a crime committed more than 10 years earlier.
Some privacy groups, such as the Electronic Frontier Foundation, have expressed alarm at the EU requirement and have likened it to censorship. “The court has created a vague and unappealable model, where Internet intermediaries must censor their own references to publicly available information in the name of privacy, with little guidance or obligation to balance the needs of free expression,” the EFF noted in a blog post in July.
“That won’t work in keeping that information private, and will make matters worse in the global battle against state censorship.”