LAS VEGAS–IBM, which brought us monstrous mainframe computers in the early 1960s, Selectric typewriters in the 1960s and '70s and the first IBM PCs in the early 1980s, has a new mission in life in the 20-teens: to make artificial intelligence part of the IT woodwork.
And it looks like it has a good playmaker in Watson, the AI that won at “Jeopardy!” and now runs around solving all kinds of business problems without a moment of rest.
At the company’s inaugural Think 2018 conference here at the Mandalay Bay Conference Center, IBM introduced new applications and capabilities that will enable AI to be used faster and more efficiently by developers, data scientists and enterprise professionals.
Most enterprises already know that AI has the power to upgrade their business, but few know how to harvest the benefits and unlock value from data to make CFOs and board members happy. To stay ahead of the IT curve, IBM contends, businesses need to provide their tech talent with the resources and tools to simplify and accelerate their AI work.
Deep Learning as a Service
The company’s new Deep Learning as a Service enables users to run hundreds of deep-learning training models at once and quickly optimize their neural networks. Users pay only for the GPUs they use, which IBM said makes this once cost-prohibitive capability accessible to more businesses.
Drawing from advances made at IBM Research, Deep Learning as a Service enables organizations to overcome the common barriers to deep-learning deployment: skills, standardization, and complexity. It embraces a wide array of popular open source frameworks, such as TensorFlow, Caffe, PyTorch and others, and offers them truly as a cloud-native service on IBM Cloud.
IBM claims the service combines the flexibility, ease of use and economics of a cloud service with the compute power required for deep learning. Using straightforward REST APIs, developers can train deep-learning models with the amount of compute resources their jobs require.
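As a sketch of what such a REST-driven training workflow might look like, the snippet below assembles a job manifest and shows where it would be submitted. The endpoint URL, payload fields and resource names here are illustrative assumptions for this article, not IBM’s documented API:

```python
import json

# Hypothetical endpoint; IBM's actual training API path will differ.
API_BASE = "https://api.example.cloud.ibm.com/v3/training"

def build_training_job(name, framework, command, gpus):
    """Assemble a training-job manifest; all field names are illustrative."""
    return {
        "name": name,
        "framework": {"name": framework, "version": "latest"},
        "command": command,
        # Pay-per-GPU pricing means you request only the accelerators you need.
        "resources": {"gpus": gpus},
    }

job = build_training_job("mnist-cnn", "tensorflow", "python train.py", gpus=2)
payload = json.dumps(job)

# A client would then POST the manifest with an auth token, e.g.:
# requests.post(API_BASE, data=payload,
#               headers={"Authorization": "Bearer <token>"})
```

The point of the pattern is that scaling a job up or down is a one-field change in the manifest rather than a change to cluster infrastructure.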
IBM also revealed that it has open-sourced the core of the Deep Learning as a Service offering, calling it Fabric for Deep Learning (FfDL, pronounced “fiddle”). The move lowers the barrier to entry for deep learning and extends the same support for popular open-source frameworks to anyone who wants to run the fabric themselves. It’s a truly cloud-native offering and the latest in IBM’s long history of establishing open-source centers of gravity in cloud, data, AI and transactions.
Contributing Mightily to the Open-Source Community
IBM isn’t stopping there when it comes to open source. It also unveiled two other projects: MAX (Model Asset Xchange) and CODAIT (the Center for Open-Source Data & AI Technologies).
MAX is akin to an app store for machine-learning models, and it is free to use. The exchange enables users to easily discover, rate and deploy available models. CODAIT expands the mission of the Spark Technology Center, making AI models easier to create, deploy and manage. With partners like Google, CODAIT will help create much-needed AI industry standards, IBM said.
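To make the “app store” analogy concrete, here is a minimal sketch of calling one of these models after it has been deployed as a local web service. The host, port, route and payload shape are assumptions for illustration, not a documented MAX contract:

```python
import json
import urllib.request

# Assumed local deployment of a downloaded model; the route is illustrative.
MODEL_URL = "http://localhost:5000/model/predict"

# Payload shape varies per model; this text-input example is hypothetical.
payload = {"text": ["IBM open-sources its deep learning stack"]}
req = urllib.request.Request(
    MODEL_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Sending the request would return the model's prediction as JSON:
# with urllib.request.urlopen(req) as resp:
#     result = json.load(resp)
```

The appeal for developers is that every model behaves like an ordinary web service, so swapping one model for another is a URL change rather than a retraining project.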
IBM says its overall goal is to make it easier for developers to build deep-learning models. Deep Learning as a Service includes features of its own, such as Neural Network Modeler, to lower the barrier to entry for all users, not just a few experts.
These new features all live within Watson Studio, the company’s cloud-native, end-to-end environment for data scientists, developers, business analysts and SMEs to build and train AI models that work with structured, semi-structured and unstructured data, while maintaining an organization’s existing policy/access rules around the data.
Image courtesy of IBM