When IBM initially launched its Watson cognitive computing platform, one of the first questions on a lot of folks’ minds was, “When can I tap into the power of Watson?”
IBM responded by opening up Watson to developers via the Watson Developer Cloud, which offers Watson services and APIs, along with documentation, tutorials, starter kits and access to the Watson developer community.
IBM started slowly and has continued to evolve its Watson strategy for developers. The company began with just a few Watson partners and a handful of Watson services; today there are 30 Watson services, and IBM continues to add more.
What started as a limited offering for developers now boasts a broad ecosystem of more than 500 partners, including startups and established businesses, that are tapping into the Watson Developer Cloud. More than 80,000 developers in the Watson community use the platform to develop and test new business applications in industries ranging from health care, financial services and retail to education, music, sports and more.
IBM has refined its Watson strategy for developers to the point where Big Blue now offers what the company refers to as “self-service AI,” or self-service artificial intelligence. From gaining insights from text to analyzing images and video, developers can easily tap into the power of Watson APIs to build cognitive apps.
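To give a sense of what that looks like in practice, here is a minimal sketch of calling one of those image-analysis services over REST from Python. The endpoint URL, version date and response field names are illustrative assumptions drawn from how the Visual Recognition service was documented at the time, and the API key is a placeholder for a developer’s own Watson Developer Cloud credentials.

```python
# Minimal sketch of calling the Watson Visual Recognition API over REST.
# The endpoint URL, version date and response fields are illustrative; the
# API key comes from a developer's own Watson Developer Cloud credentials.
import requests

CLASSIFY_URL = "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify"
API_KEY = "your-visual-recognition-api-key"  # placeholder credential

def classify_image(image_url):
    """Ask Visual Recognition to label the contents of a publicly reachable image."""
    response = requests.get(
        CLASSIFY_URL,
        params={
            "api_key": API_KEY,
            "url": image_url,
            "version": "2016-05-20",  # illustrative version date
        },
    )
    response.raise_for_status()
    return response.json()

# Print each label and its confidence score (field names assumed from the
# service's documented response shape at the time).
result = classify_image("https://example.com/storefront.jpg")
for image in result.get("images", []):
    for classifier in image.get("classifiers", []):
        for label in classifier.get("classes", []):
            print(label.get("class"), label.get("score"))
```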
“At IBM Watson, we are focused on empowering these developers with platforms that are truly self-service and superior in three important ways: Science, Scale and Simplicity,” said David Kenny, general manager of IBM Watson, in a blog post.
Kenny acknowledges that AI computing environments can be complex. “But a superior cognitive computing platform must hide this complexity to embolden developers and to empower its users,” he said in his post. “Think of it as ‘self-service AI.’ The goal is creating development environments in which it is easy for developers to navigate, compose their apps and launch them—whether they are data scientists in a big bank, an analyst for a retailer, or a coder in a hospital system.”
That’s just how easy IBM has made it for developers to tap into the power of Watson—to the point that it is now self-service.
“We’re creating cognitive solutions that marry digital business with digital intelligence,” said Martin Schroeter, IBM’s CFO, during the company’s last earnings call. “We’re bringing our industry expertise together with these cognitive solutions, and we’re building it all on cloud platforms.”
Meanwhile, IBM is making available a set of developer tools that reduce the time required to combine Watson APIs and data sets. The tools make it easy to embed Watson APIs in any form factor, from mobile devices and cloud services to connected systems.
IBM also is previewing IBM Watson Knowledge Studio, which brings the company’s machine learning and text analytics capabilities together in a single tool. This will make it easier for line-of-business and subject matter experts to use their own industry expertise to rapidly train their cognitive applications.
“Cognitive technologies are quickly becoming an ingredient in almost every software application built today, thanks to cloud platforms that allow developers in every business to select the software components that they need to build everything from simple mobile apps to very sophisticated cognitive ecosystems,” Kenny said.
At its InterConnect 2016 conference earlier this year, IBM introduced new and expanded cognitive APIs for developers that enhance Watson’s emotional and visual senses.
IBM added three new APIs that enable developers to build emotional and visual intelligence into their cognitive applications.
The three APIs, Tone Analyzer, Emotion Analysis and Visual Recognition, are now available in beta. Additionally, IBM updated its Text to Speech (TTS) API with new emotional capabilities and re-released it as Expressive TTS for general availability. These APIs push the sensory boundaries of how humans and machines interact, and they are designed to improve how developers embed these technologies to create solutions that can think, perceive and empathize.
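As an illustration of how lightweight embedding one of these APIs can be, the following sketch calls the Tone Analyzer service from Python. The endpoint, version date, credentials and response field names are illustrative assumptions; the real values come from the service instance a developer provisions in the Watson Developer Cloud.

```python
# Minimal sketch of calling the Watson Tone Analyzer API over REST.
# The endpoint URL, version date, credentials and response fields are
# illustrative assumptions; real values come from the provisioned service instance.
import requests

TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"
USERNAME = "your-service-username"  # placeholder service credential
PASSWORD = "your-service-password"  # placeholder service credential

def analyze_tone(text):
    """Send a block of text to Tone Analyzer and return the parsed JSON result."""
    response = requests.post(
        TONE_URL,
        params={"version": "2016-05-19"},  # illustrative version date
        auth=(USERNAME, PASSWORD),         # HTTP basic auth against the service
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

# Print each detected tone and its score (field names assumed from the
# service's documented response shape at the time).
result = analyze_tone("I am thrilled with how quickly the prototype came together.")
for category in result.get("document_tone", {}).get("tone_categories", []):
    for tone in category.get("tones", []):
        print(tone.get("tone_name"), tone.get("score"))
```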
Charles King, principal analyst at Pund-IT, said IBM’s self-service AI initiative “aims to make it far simpler for developers to determine which APIs will work best for given projects by using IBM cognitive solutions. … That should help speed application design/development processes and also help enhance the quality of finished apps.”
Kenny agrees, and he expects the developer community to continue tapping into Watson services to expand the ecosystem.
“For every good idea that we have, our development community is thinking up hundreds and thousands more,” he said. “These innovators are now able to build on assets of legacy knowledge, empowered by our self-service AI. The result is a very different type of technology disruption is taking hold, and an entirely new breed of developer is emerging at the front lines.”