It’s right in the name. Visual Studio, the popular integrated development environment from Microsoft, provides a graphical interface and a host of visual tools that help professional coders turn their ideas into working applications.
However, not everyone is blessed with good eyesight, in which case the IDE’s oftentimes graphically dense interface can quickly turn into a hindrance. While screen readers and other assistive technologies can audibly convey on-screen content to visually impaired users, they often fail to pick up syntax highlighting and other nuances that define the code editing experience.
With these challenges in mind, Microsoft released a Visual Studio extension called CodeTalk that improves accessibility for users with vision problems.
“Highlights of the extension include the ability to quickly access code constructs and functions that lead to faster coding, learn the context of where the cursor is in the code, navigate through chunks of code with simple keystrokes and hear auditory cues when the code has errors and while debugging,” wrote Microsoft’s Suresh Parthasarathy, a senior research developer, and Gopal Srinivasa, a senior research software development engineer, in a Dec. 18 blog post. “The extension also introduces a novel concept of Talk Points, which can be thought of as audio-based breakpoints.”
The features are particularly helpful for completing syntax-checking tasks, said the Microsoft staffers. In their own testing, they found that the extension’s Talk Points and audio error notifications helped improve productivity. Some sighted users also found some of the extension’s features useful, they added.
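The Talk Points idea, an audible cue that fires when execution reaches a chosen spot instead of pausing the program the way a conventional breakpoint would, can be illustrated with a short sketch. The following Python snippet is a generic approximation of the concept, not CodeTalk’s actual mechanism: the watched function name process_order is a made-up example, and a terminal bell stands in for the spoken feedback a real tool would provide.

```python
import sys

# Generic sketch of a "talk point": like a breakpoint, but instead of
# pausing execution it emits an audible cue and lets the program run on.
# (Illustrative only; this is not how CodeTalk is implemented.)

TALK_POINT_FUNCS = {"process_order"}  # hypothetical functions to announce

def tracer(frame, event, arg):
    """Trace hook: sound a cue whenever a watched function is entered."""
    if event == "call" and frame.f_code.co_name in TALK_POINT_FUNCS:
        sys.stdout.write("\a")  # terminal bell in place of spoken audio
        sys.stdout.flush()
    return tracer

def process_order(order_id):
    return f"processed order {order_id}"

if __name__ == "__main__":
    sys.settrace(tracer)   # install the trace hook
    for i in range(3):
        process_order(i)   # each call triggers the audible cue, no pause
    sys.settrace(None)     # remove the hook when done
```

Run from a terminal that honors the BEL character, the program chimes each time the watched function is entered while continuing to execute, which is the essential difference between a talk point and a traditional breakpoint.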
More information is available on the Microsoft Research blog, and the CodeTalk extension for Visual Studio can be downloaded from GitHub.
Microsoft has made accessibility central to its mission of empowering individuals through technology.
Windows 10 Fall Creators Update includes an Eye Control feature that helps people with neurological diseases such as amyotrophic lateral sclerosis (ALS), more commonly known as Lou Gehrig’s disease, and other disabilities control a PC using their eyes and an off-the-shelf eye-tracking camera. Wayne Rash, an eWEEK contributing editor, described the technology as a lifeline for the disabled, helping users access an employment market that has become highly reliant on online job applications.
In July, the company released Seeing AI, a mobile app for Apple iOS devices that turns what the camera “sees” into audio descriptions. The app can detect objects, text and people. When a person steps into frame, Seeing AI goes a step further by relating a person’s emotional state, facial features and age, give or take a few years.
Facebook, meanwhile, is taking a different approach to letting its visually impaired users know when people show up in their News Feeds.
Facebook’s new face recognition feature uses the company’s screen-reader-compatible alt-text tool to identify people who appear in users’ timelines, even if they are not expressly tagged in a photo. However, privacy-conscious users will soon be able to prevent Facebook from automatically recognizing them in photos, the company said in a Dec. 19 announcement.