What was really behind the slip of Windows Vista's widespread release from fall 2006 to January 2007? In a word, quality, according to Windows head honcho Jim Allchin. However, quality (or the lack thereof) appears to mean many things to many people, and to many Microsoft divisions.
In a late-March conference call, Allchin cited overall quality issues, especially around security, drivers and performance, as the reasons behind the Vista delay. Sites with volume licensing agreements can obtain the Vista code in November, but everyone else will have to wait a couple of months to upgrade to Vista.
Allchin, co-president of Microsoft's Platforms & Services Division (who, after March's reorganization, was revealed to have but a handful of reports), acknowledged that some of Microsoft's partners were not in favor of the company delaying the actual product launch. (Reading between the lines, perhaps these partners didn't have as many concerns around security, drivers and performance.)
But others felt that a Vista rollout in November or December, both months crowded with holidays, might have less impact than one coming a little later.
However, the issue of quality in Redmond goes well beyond the occasional memo and pep rally. A number of divisions inside Microsoft have been charged with raising the quality bar for their respective product lines.
First, there's the test team inside Microsoft's Core Operating System Division. The team consists of a number of working groups targeting areas such as verification, application compatibility and drivers, as well as teams for setup and stress testing.
There is also a Core OS Share team charged with kernel, deployment and fundamentals testing, plus an interoperability test team.
These teams are focused on looking at the flow of code around the system and reducing dependencies among the various Windows subsystems and components.
By doing so, at least in theory, Microsoft will be able to reduce product delays and deliver more consolidated feature packs and interim releases more quickly. And the reduction of dependencies also is supposed to help security analysis.
But Microsoft now wants to re-engineer how software is developed. An Engineering Excellence team manages the standards used to create Microsoft's software products and has responsibility for a little-known project called Software Quality Metrics, or SQM.
SQM consists of tools—which Microsoft is using internally only, at least at this point—to measure the performance of various product components.
While Redmond is mum on the project, our sources had a bit to share. SQM allows teams to check whether their product's features and functionality are the right mix for the market.
The SQM tools, which sound as if they are being used across almost every major and minor product group inside the software giant, provide teams with reports on software reliability, quality of service and usage.
The SQM reports can provide real-time feedback on the potential impact of a user interface tweak, a feature cut or the wording of a product-feedback box.
SQM also aggregates this quality data into fact tables and OLAP (online analytical processing) cubes. This historical data is stored in data warehouses, and project members can view and manipulate that data with Excel, we hear.
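To make the fact-table-and-rollup idea concrete, here is a minimal Python sketch of the pattern our sources describe. Everything in it, the event schema, the metric names, the dimensions, is invented for illustration; Microsoft has not published SQM's internals, so treat this as a generic OLAP-style aggregation, not SQM's actual design.

```python
from collections import defaultdict

# Hypothetical quality-telemetry records: (product, feature, metric, value).
# These names are illustrative only, not SQM's real schema.
events = [
    ("Vista", "Setup", "crash", 1),
    ("Vista", "Setup", "crash", 1),
    ("Vista", "UI", "crash", 0),
    ("Office", "Save", "crash", 1),
]

def build_fact_table(events):
    """Aggregate raw events into a fact table:
    (product, feature, metric) -> [event_count, summed_value]."""
    facts = defaultdict(lambda: [0, 0])
    for product, feature, metric, value in events:
        cell = facts[(product, feature, metric)]
        cell[0] += 1       # how many events landed in this cell
        cell[1] += value   # summed metric value for this cell
    return dict(facts)

def rollup(facts, keep):
    """Roll the cube up to a coarser dimension set (an OLAP-style rollup).
    `keep` lists the dimension indices to retain; the rest are summed away."""
    out = defaultdict(lambda: [0, 0])
    for dims, (count, total) in facts.items():
        key = tuple(dims[i] for i in keep)
        out[key][0] += count
        out[key][1] += total
    return dict(out)

facts = build_fact_table(events)
by_product = rollup(facts, keep=(0,))  # collapse feature and metric dimensions
```

A spreadsheet front end like the Excel views our sources mention would then just be pivoting over tables like `by_product`, slicing by whichever dimensions were kept.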
There is discussion in the halls of Redmond about making SQM available to developers outside the company.
As you might imagine, the implications of such a project are mind-boggling, especially in light of the recent news and reorganization. I asked Allchin in March about the kinds of tools, including SQM, that helped Microsoft determine that it was going to need to push back by a few weeks the Vista RTM (release to manufacturing) date.
We also came up empty there: “Perhaps after we get closer to shipment we can discuss the dashboard and quality metrics we have,” Allchin told your trusty Microsoft Watcher. “Right now, the team needs to completely focus.” Please!
For more on Microsoft and Mary Jo Foley, check out www.microsoft-watch.com.