News Analysis: When CNBC's Maria Bartiromo asked Google CEO Eric Schmidt whether people should treat Google as their most trusted friend, Schmidt stepped into the damned-if-you-do, damned-if-you-don't pitfall that every company harvesting computer users' data eventually faces.
This is how Schmidt responded to the question in the "Inside the Mind of Google" segment CNBC aired Dec. 3:
"I think judgment matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. If you really need that kind of privacy, the reality is that search engines -- including Google -- do retain this information for some time and it's important, for example, that we are all subject in the United States to the Patriot Act and it is possible that all that information could be made available to the authorities."
Of course, privacy and security pundits had a field day when they learned of Schmidt's comments, which point to a certain liberty Schmidt and Google are taking: that people shouldn't do anything that they would be embarrassed about, or do anything that might implicate them in criminal or other matters. The suggestion is that perhaps some people shouldn't use search engines because they will be, well, exposed.
Bruce Schneier, a renowned security expert, responded accordingly:
"Privacy protects us from abuses by those in power, even if we're doing nothing wrong at the time of surveillance. We do nothing wrong when we make love or go to the bathroom. We are not deliberately hiding anything when we seek out private places for reflection or conversation. We keep private journals, sing in the privacy of the shower, and write letters to secret lovers and then burn them. Privacy is a basic human need.
For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that -- either now or in the uncertain future -- patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable."
It is ironic that Schneier warns of people losing their individuality, because everything Google does is geared toward dissecting users' collective Web surfing habits and serving them ads based on their interests. Once we grant Google our user data, we may feel as though we cannot deviate from socially accepted, normal behaviors because the record might be used against us.
Bartiromo also asked Schmidt whether he sees Google as the most powerful company in the world. He was genuine when he said: "No, not at all." "You have a lot of information though about people..." Bartiromo responded.
"But we don't use it and we don't misuse it," Schmidt said. "We could misuse it, but if we did, we would quickly become much less powerful because everyone would flee to our competitors. So part of the answer to the criticism, that's implied by your question... is that if we broke our trust with end users, they would leave and we wouldn't be very important anymore."
It goes to Google's whole "the competition is just a click away" campaign: Google's position is that people should entrust their data to the company because it would do nothing with that data to break their trust.
Google may well not abuse user data, but who is to say authorities won't under the Patriot Act, whose latitude is suspect to the point of being Orwellian? This is what keeps privacy advocates up at night and part of what makes Google a target for federal scrutiny.
So what is the solution? Forget data anonymization; no one is quite comfortable with that. Perhaps Google should build some sort of instant analysis engine that gleans user data as it enters the system, uses it to improve search -- think personalized search, but in real time -- and then nukes the raw data into the digital boneyard forever.
Earlier this week Google proved it can index search results in real time, so algorithms that scrape useful signals from user data and perform real-time analysis to power contextual ad targeting are a real possibility.
That way Google wouldn't have to store user data and mask it with anonymization techniques. Google likes to solve technological challenges, so it makes sense that it would turn to technology to solve any privacy problems.
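To make the "analyze, then discard" idea concrete, here is a minimal sketch of what such a pipeline might look like. This is purely illustrative -- the class, the keyword categories, and the matching logic are all hypothetical assumptions, not anything Google has described. The point is only that raw queries can update aggregate interest counts in a single pass and then be thrown away, so nothing identifying is ever stored.

```python
from collections import Counter

# Hypothetical keyword-to-category map used to extract coarse interest
# signals from a query. A real system would be far more sophisticated.
CATEGORY_KEYWORDS = {
    "travel": {"flight", "hotel", "vacation"},
    "tech": {"laptop", "android", "browser"},
}


class EphemeralAnalyzer:
    """Sketch of an analyze-then-discard pipeline: raw queries are
    processed in one pass and never retained."""

    def __init__(self):
        # Only aggregate, non-identifying counts survive ingestion.
        self.interest_counts = Counter()

    def ingest(self, raw_query: str) -> None:
        """Update aggregate interest counts, then drop the raw query."""
        tokens = set(raw_query.lower().split())
        for category, keywords in CATEGORY_KEYWORDS.items():
            if tokens & keywords:
                self.interest_counts[category] += 1
        # raw_query goes out of scope here -- it is never written anywhere.

    def top_interest(self):
        """Return the dominant interest category, or None if no data yet."""
        if not self.interest_counts:
            return None
        return self.interest_counts.most_common(1)[0][0]


analyzer = EphemeralAnalyzer()
analyzer.ingest("cheap flight to denver")
analyzer.ingest("best hotel deals")
analyzer.ingest("new android laptop")
print(analyzer.top_interest())  # prints "travel"
```

Ad targeting would then read only the aggregate counts, while the queries themselves vanish the moment they are processed -- which is exactly the trade the column is proposing.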