Microsoft’s upcoming Windows 8 will appear on both traditional PCs and tablets. However, as the company gears up to release the operating system’s beta in February, followed by the final version sometime in the second half of 2012, it’s increasingly clear that tablet functionality is a prime concern, a sea change from Windows 7, an OS whose interoperability with tablets often felt like an afterthought.
To optimize Windows 8 for tablets, Microsoft’s Windows teams are employing sensors in a variety of ways: not only to adjust screen brightness to compensate for ambient light and to rotate on-screen elements to match the tablet’s orientation, but also to enhance apps.
That last use is particularly important, given Microsoft’s desire to transform Windows 8 into a viable tablet competitor to Apple’s iPad. The operating system will feature a mobile-applications store familiar to anyone who’s used either Apple’s App Store or Google’s Android Market; Microsoft has spent the past few months encouraging developers to think about creating apps for the platform, arguing that Windows’ built-in audience is potentially a very lucrative one.
“Initially, some thought that the need for such sensors was scoped to very few apps, such as specialized games,” Gavin Gear, a manager of the Device Connectivity team, wrote in a Jan. 24 posting on the Building Windows 8 blog. “But the more we examined the 3D motion and orientation sensing problem, the more we realized that applications are much more immersive and attractive if they react to the kind of motion humans can easily understand, such as shakes, twists, and rotations in multiple dimensions.”
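Gear’s example of motion “humans can easily understand” can be made concrete. The blog post doesn’t publish Microsoft’s gesture code, so the following is a hypothetical sketch of how an app might flag a shake gesture from raw accelerometer samples: count how many readings spike well above the 1 g of gravity.

```python
import math

def detect_shake(samples, threshold_g=2.5, min_peaks=3):
    """Naive shake detector (illustrative only).

    samples: list of (ax, ay, az) accelerometer readings in g,
    gravity included. A shake shows up as repeated spikes in the
    magnitude of the acceleration vector.
    """
    peaks = 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold_g:
            peaks += 1
    return peaks >= min_peaks
```

A device lying still reads a steady magnitude near 1 g and never trips the threshold; a vigorous shake produces several spikes and does. Real gesture recognizers also check timing and direction reversals, which this sketch omits.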
Hence Microsoft’s requirement that its hardware partners equip any Windows 8 tablet with a performance-calibrated combination of gyroscope, three-axis accelerometer and magnetometer. “Combining the input of multiple sensors to produce better overall results is a process we call sensor fusion,” Gear wrote. “The ‘magic’ of sensor fusion is to mathematically combine the data from all three sensors to produce more sophisticated outputs.”
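The core idea behind fusing those sensors is that each one covers the other’s weakness: a gyroscope is smooth but drifts, while an accelerometer gives an absolute (gravity-referenced) angle that is noisy. Microsoft hasn’t published its fusion algorithm, but a classic one-axis complementary filter, sketched here as an illustration only, captures the principle:

```python
import math

def complementary_filter(prev_pitch, gyro_rate_y, accel, dt, alpha=0.98):
    """Blend gyroscope and accelerometer data into one pitch estimate.

    prev_pitch:  last pitch estimate, in radians
    gyro_rate_y: angular rate about the Y axis, in rad/s (smooth, drifts)
    accel:       (ax, ay, az) in g; gravity dominates when roughly still
    dt:          time since the last sample, in seconds
    """
    # Integrating the gyro gives a responsive but slowly drifting angle.
    gyro_pitch = prev_pitch + gyro_rate_y * dt
    # The gravity vector gives an absolute but noisy angle.
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))
    # Trust the gyro in the short term, the accelerometer in the long term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

A full fusion stack extends the same blending to all three axes and uses the magnetometer the same way for heading: the gyro tracks fast twists while the compass corrects long-term drift.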
Active sensors also drain power and degrade system performance, which led Microsoft to tinker with ways to minimize both. As a result, most sensor fusion data processing occurs at the hardware level, sparing the main CPU from having to burn power and cycles wrestling with the algorithms. In addition, Gear added, “We implemented powerful filtering mechanisms that we tied directly to the needs of sensor apps running at any given point of time,” a decision that meant “sensor data is only sent up the stack at the rate that apps need that data, and no faster.”
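The rate-limiting policy Gear describes, deliver data only as fast as the hungriest subscribed app needs it, and not at all when nothing is listening, can be sketched in a few lines. The class and method names below are hypothetical, not Microsoft’s actual driver interfaces:

```python
class SensorHub:
    """Throttle sensor delivery to the fastest rate any app requested."""

    def __init__(self):
        self.requests = {}    # app_id -> minimum report interval (seconds)
        self.last_sent = None

    def set_report_interval(self, app_id, interval_s):
        self.requests[app_id] = interval_s

    def unsubscribe(self, app_id):
        self.requests.pop(app_id, None)

    def should_deliver(self, now):
        """Decide whether a hardware reading taken at `now` goes up the stack."""
        if not self.requests:
            return False  # no listeners: drop everything, save power
        interval = min(self.requests.values())  # hungriest app wins
        if self.last_sent is None or now - self.last_sent >= interval:
            self.last_sent = now
            return True
        return False
```

The key design point matches the quote: the filter sits below the apps, so readings that nobody needs never cost CPU cycles higher up the stack.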