IBM Looks to Make Watson More Humanlike

By Jeffrey Burt  |  Posted 2016-04-09

That same capability to rapidly collect, run through and return massive amounts of information can help in a broad array of areas, from health care to other verticals, High said. Doctors, for example, would need 160 hours every week just to keep up with the new research being published in their fields. Watson can sift through all of that material for them and surface the information they need.

Over the past several years, IBM has added other capabilities to Watson, from expanding Discovery Advisor to offering Watson services that customers can use in their own businesses. The company now offers 32 services on its Watson Developer Cloud, which runs on the Bluemix platform as a service (PaaS).

Much of the work this year will be around making Watson more humanlike, which will change its relationship with users. If Watson, and eventually other systems, can better understand the nuances of human communication, then it can handle more dynamic interactions. Humans will no longer have to adapt to a system's interface; instead, the systems will have to adapt to humans.

With this in mind, IBM researchers are developing a series of programming interfaces that can analyze text for everything from emotion to tone. A beta emotion-analysis capability in Watson can take a piece of text and score it for emotions ranging from anger to fear to happiness. The idea is to build more "sympathetic systems," High said. Similarly, Tone Analyzer, an API designed to measure the emotional tone of written text, can detect emotions such as anger and suggest new wording that sends a better emotional message.
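
For developers, these capabilities are exposed as REST services on the Watson Developer Cloud. The sketch below is a minimal illustration, in Python, of how a tone-analysis call might look; the endpoint URL, API version date, credential names and sample text are assumptions for illustration, not details taken from the article.

    import requests

    # Illustrative values: the real endpoint and credentials come from a
    # Bluemix service instance, and the version date is an assumption.
    TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"
    USERNAME = "your-service-username"
    PASSWORD = "your-service-password"

    def analyze_tone(text):
        """Send text to the tone-analysis service and return the detected tones."""
        response = requests.post(
            TONE_URL,
            params={"version": "2016-05-19"},  # assumed API version date
            auth=(USERNAME, PASSWORD),         # Bluemix service credentials
            json={"text": text},
        )
        response.raise_for_status()
        return response.json()

    result = analyze_tone("I am very unhappy with the delay in my order.")
    print(result)  # e.g., scores for anger, fear, joy and other tones

A draft email or support reply could be rescored after each revision until the anger signal drops, which is the kind of rewording guidance Tone Analyzer is meant to provide.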

Through the Watson Developer Cloud, IBM also offers a personality insights service that can analyze text and offer insights into the writer's personality. It can determine if the person is introverted or extroverted, for example.
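
The personality insights service works along similar lines: a developer submits a body of the person's writing and gets back a profile of inferred traits. The following sketch assumes the same Python requests approach; the URL, version date, response field names and file name are illustrative assumptions rather than documented specifics from the article.

    import requests

    # Illustrative endpoint; real values come from the Bluemix service instance.
    PROFILE_URL = "https://gateway.watsonplatform.net/personality-insights/api/v3/profile"

    def personality_profile(text, username, password):
        """Submit writing samples and return the inferred personality profile."""
        response = requests.post(
            PROFILE_URL,
            params={"version": "2016-10-20"},        # assumed API version date
            auth=(username, password),               # Bluemix service credentials
            headers={"Content-Type": "text/plain"},  # raw text input
            data=text.encode("utf-8"),
        )
        response.raise_for_status()
        return response.json()

    # Example: screening a cover letter for traits such as extraversion.
    # Field names like "personality" and "percentile" are assumptions
    # about the response shape.
    with open("cover_letter.txt") as f:
        profile = personality_profile(f.read(), "your-service-username",
                                      "your-service-password")
    for trait in profile.get("personality", []):
        print(trait.get("name"), trait.get("percentile"))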

Such capabilities can help businesses (a restaurant owner can mine online customer comments to see what to improve, and recruiters can better match candidates to jobs), but they will also lead to robots that interact more naturally with humans, High said. As an example, he pointed to IBM's work with SoftBank's Aldebaran NAO robots. Through such APIs, the robots take on more humanlike characteristics, not only answering people's questions in natural language but also gesturing as they do so.

He showed a NAO robot not only singing a Taylor Swift song, but also performing a Gangnam Style dance.

