IBM Adds New Watson Emotional, Visual APIs to Bluemix
Meanwhile, IBM has added Emotion Analysis as a new beta function within the AlchemyLanguage suite of APIs. Emotion Analysis uses sophisticated natural language processing techniques to analyze external content and help users better understand the emotions it expresses. Developers can now go beyond identifying positive and negative sentiment to distinguish a broader range of emotions, including joy, fear, sadness, disgust and anger. That deeper understanding can surface new insights in areas such as customer reviews, surveys and social media posts. For example, beyond knowing whether product reviews are negative or positive, businesses can now identify whether a change to a product feature prompted reactions of joy, anger or sadness among customers.

IBM is also moving beyond visual capabilities that simply understand and tag an image. Visual Recognition, now available in beta, can be trained to recognize and classify images based on training material. While other visual search engines tag images with a fixed set of classifiers or generic terms, Visual Recognition lets developers train Watson on custom classifiers for images, just as users can teach Watson natural language classification, and build apps that visually identify unique concepts and ideas. Visual Recognition is thus customizable, with results tailored to each user's specific needs. For example, a retailer might create a tag specific to a style of pants in its new spring line so it can spot when an image of someone wearing those pants appears on social media.

Braxton Jarrett, general manager of IBM's Cloud Video Services unit, said Watson's visual recognition API has been in high demand among developers building video applications.
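To make the Emotion Analysis workflow concrete, here is a minimal sketch of calling the AlchemyLanguage beta endpoint and picking out the dominant emotion. The URL, parameter names and response fields are assumptions based on AlchemyAPI's public documentation, not details from this article.

```python
import json
import urllib.parse
import urllib.request

# Assumed AlchemyLanguage endpoint for text emotion analysis.
EMOTION_URL = "https://gateway-a.watsonplatform.net/calls/text/TextGetEmotion"

def analyze_emotion(api_key, text):
    """POST text to the emotion endpoint and return a dict of scores
    (anger, disgust, fear, joy, sadness). Field names are assumed."""
    params = urllib.parse.urlencode({
        "apikey": api_key,
        "text": text,
        "outputMode": "json",
    }).encode("utf-8")
    with urllib.request.urlopen(EMOTION_URL, data=params) as resp:
        body = json.load(resp)
    return body.get("docEmotions", {})

def dominant_emotion(scores):
    """Pick the highest-scoring emotion from a score dict, e.g. to flag
    whether a product review reads as joy, anger or sadness."""
    return max(scores, key=lambda k: float(scores[k]))
```

A batch of customer reviews could be run through `analyze_emotion` and aggregated with `dominant_emotion` to track how a feature change shifted reactions over time.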
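The retailer scenario above can be sketched as a Visual Recognition classify call against a custom classifier. The endpoint, version date and parameter names are assumptions drawn from IBM's public Visual Recognition docs, and "spring_pants_classifier" is a hypothetical classifier ID for illustration.

```python
import json
import urllib.parse
import urllib.request

# Assumed Visual Recognition v3 classify endpoint.
VR_URL = "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify"

def build_classify_url(api_key, image_url, classifier_ids):
    """Compose the GET URL for classifying an image by URL against
    one or more custom classifiers (parameter names are assumed)."""
    query = urllib.parse.urlencode({
        "api_key": api_key,
        "url": image_url,
        "classifier_ids": ",".join(classifier_ids),
        "version": "2016-05-20",
    })
    return "{}?{}".format(VR_URL, query)

def classify(api_key, image_url, classifier_ids):
    """Fetch and decode the JSON classification result."""
    url = build_classify_url(api_key, image_url, classifier_ids)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

A social media monitoring job could pass each incoming image URL through `classify` with the retailer's custom classifier and keep any image whose score exceeds a chosen threshold.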
As video has become a first-class data type for businesses as well as consumers, IBM's new Cloud Video Services unit is going after the $105 billion opportunity in cloud-based video services and software.

Moreover, to further advance emotional capabilities for cognitive systems, IBM has incorporated emotional IQ into its existing Text to Speech API and is releasing Expressive TTS for general availability. Expressive TTS helps cognitive systems generate and deliver an advanced level of adaptive emotion in vocal interactions, meaning computers can not only understand natural language, tone and context, but also respond with the appropriate inflection.

Previously, automated systems relied on a predetermined, rules-based corpus of words categorized by limited emotional cues, such as "good news equals a raised tone" or "bad news equals a slowed tone." In creating Expressive TTS, IBM studied and settled on a specific set of expressive styles to frame this speech capability. To do so, the research team made significant enhancements to IBM's existing synthesis engine, incorporating ideas from machine learning to allow seamless switching across expressive styles. Developers now have more flexibility in building cognitive systems that can demonstrate sensitivity in human interactions. These new and expanded services are part of IBM's open Watson platform, which now includes more than 30 Watson services.
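In practice, the expressive styles described above are selected per utterance. The sketch below assumes Watson's SSML-style express-as markup, the style names (GoodNews, Apology, Uncertainty), the synthesize endpoint and the en-US_AllisonVoice voice from IBM's public Text to Speech documentation; none of these specifics come from this article.

```python
import base64
import json
import urllib.request

# Assumed Watson Text to Speech synthesize endpoint.
TTS_URL = "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize"

def expressive_ssml(text, style="GoodNews"):
    """Wrap text in an express-as tag to request an expressive style;
    assumed styles include GoodNews, Apology and Uncertainty."""
    return '<express-as type="{}">{}</express-as>'.format(style, text)

def synthesize(text, style, username, password):
    """POST SSML to the service and return WAV audio bytes."""
    body = json.dumps({"text": expressive_ssml(text, style)}).encode("utf-8")
    req = urllib.request.Request(
        TTS_URL + "?voice=en-US_AllisonVoice",  # assumed expressive voice
        data=body,
        headers={"Content-Type": "application/json", "Accept": "audio/wav"},
    )
    token = base64.b64encode(
        "{}:{}".format(username, password).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

A customer service bot could, for instance, render an order delay notice with the Apology style and a shipping confirmation with GoodNews, switching styles utterance by utterance.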