When Longhorn ships, Microsoft plans to change the way data is collected via the Dr. Watson error-reporting tool, causing some users to fear for their privacy.
Microsoft's Dr. Watson error-reporting tool will undergo a significant makeover in Longhorn, but changes in the way program crash data is collected and transmitted have raised eyebrows among privacy rights advocates.
The Dr. Watson program error debugger, aka Windows error reporting, will be revamped to collect more than just the dump of the memory image when an application crashes.
Although Microsoft Corp. will set up a strict "opt-in" process to determine how data will be collected, security experts believe end users will find it difficult to sort through the sheer volume of information.
Russ Cooper, founder and editor of the NTBugtraq security mailing list,
was among the first to raise privacy concerns.
"[T]he vast majority of consumers won't be able to navigate through the volumes of data to make informed decisions as to what they don't want to send.
"Microsoft has said the data will be submitted anonymously, but it's hard to see how a submission will be useful to the person who submits it if it's done completely anonymously," Cooper argued in a published column.
In an interview with Ziff Davis Internet News, Cooper said the risks could be even higher in a corporate environment where valuable intellectual property and confidential data is transmitted automatically when a piece of software crashes.
"There is a real risk that data could be intercepted," said Cooper, who doubles as senior security analyst with Cybertrust Inc. He described a theoretical situation in which a malicious hacker could trigger a denial-of-service attack against an application and eavesdrop on the error-reporting dump transfer to hijack data.
Cooper believes that the automatic error reporting coming in Longhorn will help Microsoft in its quest to stabilize the operating system, but he warned that IT administrators will simply turn off the tool to avoid problems.
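How administrators would switch the tool off in Longhorn itself had not been detailed, but on Windows XP the existing error-reporting behavior can be disabled through registry values under the `PCHealth\ErrorReporting` key. A minimal sketch, assuming the XP-era location (the Longhorn equivalent may differ):

```shell
:: Disable automatic error reporting on Windows XP (run from an elevated prompt).
:: DoReport=0 stops crash reports from being sent to Microsoft;
:: ShowUI=0 also suppresses the "send this report?" prompt after a crash.
reg add "HKLM\SOFTWARE\Microsoft\PCHealth\ErrorReporting" /v DoReport /t REG_DWORD /d 0 /f
reg add "HKLM\SOFTWARE\Microsoft\PCHealth\ErrorReporting" /v ShowUI /t REG_DWORD /d 0 /f
```

The same setting is also exposed centrally through Group Policy, under Computer Configuration > Administrative Templates > System > Error Reporting, which is the more likely route in the corporate environments Cooper describes.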
A spokesperson for Microsoft downplayed the privacy fears, arguing that the user would have total control over any data that is collected.
"In Longhorn, the first level of detail collected by these tools does not include any personal information. If additional levels of detail are required, consumers will be invited to inspect the data that would be sent and only after they provide their consent will the data be sent to Microsoft," the spokesperson said in a statement sent to Ziff Davis Internet News.
"Data is used to make the entire Windows ecosystem measurably better over time for customers," he added.
However, if there's anyone to blame for users' initial fears, it may be company chairman Bill Gates. At WinHEC this year, Gates likened the Dr. Watson makeover to the data recorders used during flights to monitor cockpit activity.
"Think of it as a flight data recorder, so that any time there's a problem, that black box is there helping us work together and diagnose what's going on," Gates said. That description suggested (erroneously, according to insiders) that the tool would continuously monitor computer usage before, during and after an application crash.
After Gates' WinHEC speech, the company huddled to contain the damage. The message from Redmond was that no information, under any circumstances, would be collected without user consent.
A source stated that only data that is absolutely necessary would be collected if the user reporting the error hit a particular type of crash. At that stage, Microsoft would ask for a description of the problem and default data, which is described as a "crash minidump."
The source acknowledged that, in some cases, the minidump could theoretically contain sensitive data. The information in the minidump is described as a small snapshot of the state of the application at the time of the crash.
In rare cases, small portions of documents, e-mails or IM conversations may be included in the minidump, but, even then, it would not be enough to qualify as a security or privacy risk.