Google Android Wear 2.0 to Feature On-Device Machine Intelligence

Wearable devices that run Android Wear 2.0 will be the first that fully support Google's on-device machine learning technology.


Google's Android Wear 2.0 wearable computing operating system will be the first version that fully supports on-device machine learning technology.

The ML technology, developed by Google's Expander research team, will power smart messaging across Google applications as well as third-party messaging apps installed on the wearable. The feature will let users of Android Wear 2.0 wearables respond to incoming chat messages with a single tap and without having to be connected to a cloud service.

Until now, the ML systems behind the conversational understanding and image recognition capabilities in many Google apps have been hosted in the cloud. These systems typically combine technologies such as graph-based machine learning and deep neural networks, which are so computationally and memory-intensive that they could only run on powerful cloud servers.

With Android Wear 2.0, Google has introduced a newly designed lightweight ML architecture that enables Smart Reply on Android Wear, and similar capabilities in other mobile applications installed on a device, Google staff research scientist Sujith Ravi wrote on the company's Research Blog this week.

Smart Reply, a feature Google introduced with Inbox, generates automatic responses to incoming messages using machine learning and natural language processing. The technology looks at the words in a message and the context in which they are used to generate a list of potential responses from which the user can choose. The goal is to make it easier for users to respond to incoming messages.

In order to deploy Smart Reply capabilities entirely on a smart device like an Android Wear wearable, Google first had to develop a lightweight ML architecture, Ravi said.

The big challenge was figuring out how to compress the machine learning models that Google typically runs in the cloud so they fit in a small device's memory, while still ensuring the models produce robust predictions.
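To make the memory trade-off concrete, here is a minimal sketch of one standard compression idea, linear weight quantization: store float weights as 8-bit integers plus a scale and offset, cutting memory roughly fourfold versus 32-bit floats at the cost of small rounding error. This illustrates the general problem the article describes, not Google's actual method; all names below are ours.

```python
def quantize(weights):
    """Map float weights to 0..255 integers with a per-tensor offset and scale."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 if hi > lo else 1.0
    q = [round((w - lo) / scale) for w in weights]
    return q, lo, scale

def dequantize(q, lo, scale):
    """Recover approximate floats; the rounding error is bounded by the scale."""
    return [lo + v * scale for v in q]
```

The "robust predictions" requirement is exactly why this is hard: every weight now carries up to half a quantization step of error, and a model must be designed or retrained to tolerate that.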

Google made multiple attempts at taking its existing machine learning models and shrinking them to a fraction of their size, but after the attempts failed to yield useful results, the company decided to try an entirely new approach, Ravi noted in his blog post.

The new lightweight on-device ML powering Android Wear 2.0 computes potential responses to incoming messages on the fly, quickly and with a small memory footprint, he said.
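The article does not spell out the mechanism, but one common way to get fast, low-memory predictions is to hash each message into a short binary signature via random projections (locality-sensitive hashing), so that similar messages land on similar signatures and prediction reduces to a cheap lookup. A toy sketch under that assumption, with all function names our own:

```python
import random
from collections import Counter

def features(message):
    """Toy featurization: bag-of-words counts."""
    return Counter(message.lower().split())

def make_projections(vocab, n_bits, seed=0):
    """One random hyperplane (a weight per vocabulary word) per output bit."""
    rng = random.Random(seed)
    return [{w: rng.uniform(-1.0, 1.0) for w in vocab} for _ in range(n_bits)]

def signature(feats, projections):
    """The sign of each sparse dot product yields one bit of a compact
    signature; messages with similar words tend to share most bits."""
    bits = []
    for proj in projections:
        dot = sum(n * proj.get(word, 0.0) for word, n in feats.items())
        bits.append(1 if dot >= 0 else 0)
    return tuple(bits)
```

Candidate replies can then be indexed by signature, so producing suggestions is a table lookup rather than a full neural-network forward pass, which is what makes the on-the-fly, small-memory behavior plausible on a watch.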

The new machine learning architecture eliminates the need for the device to be connected to a cloud service in order for Smart Reply to work. "Apps running on the device can pass a user's incoming messages and receive reply predictions from the on-device model without data leaving the device," he said.
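The quoted flow, where an app hands the model an incoming message and receives reply predictions with nothing leaving the device, can be pictured as a small local interface. Everything below (the class and method names, the word-overlap scoring) is our invention for illustration; the real model's scoring is far more sophisticated.

```python
class OnDeviceSmartReply:
    """Hypothetical local predictor: no network call anywhere in the path."""

    def __init__(self, canned_replies):
        # A small response set bundled with the on-device model.
        self.canned_replies = canned_replies

    def suggest(self, incoming_message, top_k=3):
        # Rank canned replies by crude word overlap with the message,
        # a stand-in for the real model's learned scoring function.
        msg_words = set(incoming_message.lower().split())
        scored = sorted(
            self.canned_replies,
            key=lambda r: len(msg_words & set(r.lower().split())),
            reverse=True,
        )
        return scored[:top_k]
```

The privacy property Ravi describes falls out of the shape of this interface: the message is consumed and the suggestions produced entirely within the process, so there is simply no step at which user data crosses the network.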

The technology can also be adapted to learn a user's individual preferences and writing style, and to use that knowledge to create personalized responses.

Google CEO Sundar Pichai has said the company intends to use machine learning and artificial intelligence technologies to make its applications and services smarter. The company's long-term goal is to use machine learning and AI to enable conversational understanding across most of its products so users can interact with them via voice commands.

Jaikumar Vijayan

Vijayan is an award-winning independent journalist and tech content creation specialist covering data security and privacy, business intelligence, big data and data analytics.