Microsoft is planning to add eye-tracking technology to Windows 10, a move the software maker says will help improve how people with disabilities interact with technology.
The technology, dubbed Eye Control, was inspired by the Eye Gaze Wheelchair project, winner of Microsoft’s internal One Week Hackathon in 2014. It enabled Steve Gleason, a former NFL player diagnosed with amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease, to move his motorized wheelchair using a Surface tablet and custom software. ALS is an incurable neurodegenerative disease that renders its victims immobile.
In the years since, Microsoft researchers worked with ALS groups and affected individuals to study how technology could help improve their lives. Eventually, Microsoft dedicated an engineering team to bringing eye-tracking to Windows.
Now, it’s official. Eye Control is coming to Windows 10, enabling users with compatible eye-tracking hardware such as the Tobii 4C to use eye movements to perform actions that typically require a keyboard and mouse.
Eye Control is currently in beta, and Microsoft is soliciting testers and their feedback through the Windows Insider early-access program. This Aug. 1 announcement video provides more information on Eye Control and its journey from a hackathon project to a component of the Windows operating system.
Microsoft has made product accessibility for people with disabilities a major priority in recent years. In early 2016, the company named Jenny Lay-Flurrie its new chief accessibility officer as part of a company-wide push to make its products more accessible to disabled people.
Last month, Microsoft enlisted its slate of artificial intelligence (AI) technologies to the cause.
The company launched Seeing AI, an Apple iOS app for the visually impaired that generates an audio description of a scene captured by the camera on an iPhone, iPad or iPod Touch. In addition to detecting objects, text and people, the app can also describe a person's approximate age, facial features and emotional state.
Soon, some hearing-impaired users will be able to stream audio directly to their Cochlear implants from an Apple iPhone, iPad or iPod Touch.
In a related development for the hearing impaired, medical device maker Cochlear on July 26 introduced its Nucleus 7 Sound Processor, the first "Made for iPhone" cochlear sound processor, an external unit that feeds audio signals to a hearing implant. The device, which was approved by the U.S. Food and Drug Administration (FDA) in June, will be available next month. The accompanying Nucleus Smart App helps track usage and locate lost sound processors.
“The approval of the Nucleus 7 Sound Processor is a turning point for people with hearing loss, opening the door for them to make phone calls, listen to music in high-quality stereo sound, watch videos and have FaceTime calls streamed directly to their cochlear implant,” said Chris Smith, chief executive officer and president of Cochlear, in a July 26 announcement. The product also enables synchronized streaming for people with a hearing aid in one ear and an implant in the other, a setup the company describes as the industry’s first Made for iPhone Smart Bimodal Solution.