How HPE Is Attempting to Make Artificial Intelligence Easier to Use

New products and services pave the way for data scientists, developers and IT departments to deploy and scale deep-learning models in new and legacy applications.

Hewlett Packard Enterprise wants to make it easier for enterprises to adopt artificial intelligence in their IT systems and software products. Why? For starters, it's a good revenue stream for the company; second, it's what more and more IT managers have been requesting over the past 12 months or so.

To answer this call, Palo Alto, Calif.-based HPE on Oct. 25 introduced several new purpose-built platforms and service capabilities, with an initial focus on deep learning.

Deep learning, as a subset of AI, is typically deployed for tasks such as image and facial recognition, image classification and voice recognition. To take advantage of deep learning, enterprises need a high-performance computing infrastructure to build and train learning models that can manage large volumes of data to recognize patterns in audio, images, videos, text and sensor data.

Most enterprises lack integral components to implement deep learning, including expertise and resources, sophisticated and tailored hardware and software infrastructure, and the integration capabilities required to assimilate different pieces of hardware and software to scale these computing-intense systems.

HPE now offers the following:

  • HPE Rapid Software Installation for AI: This is an integrated hardware and software solution, purpose-built for high-performance computing and deep-learning applications. Built on the HPE Apollo 6500 system in collaboration with Bright Computing, the package enables rapid deep-learning application development: it includes pre-configured deep-learning frameworks, libraries, automated software updates and cluster management optimized for deep learning, and it supports NVIDIA Tesla V100 GPUs.
  • HPE Deep Learning Cookbook: Built by the AI Research team at Hewlett Packard Labs, the deep-learning cookbook is a set of tools to guide customers in selecting the best hardware and software environment for different deep learning tasks. These tools help enterprises estimate performance of various hardware platforms, characterize the most popular deep learning frameworks, and select the ideal hardware and software stacks to fit their individual needs. The Deep Learning Cookbook can also be used to validate the performance and tune the configuration of already purchased hardware and software stacks.

One use case included in the cookbook is related to the HPE Image Classification Reference Designs. These reference designs provide customers with infrastructure configurations optimized to train image classification models for various use cases, such as license plate verification and biological tissue classification. The designs are tested for performance and eliminate guesswork, helping data scientists and IT teams be more cost-effective and efficient.

  • HPE AI Innovation Center: Designed for longer-term research projects, the innovation center will serve as a platform for research collaboration between universities, enterprises on the cutting edge of AI research and HPE researchers. The centers, located in Houston, Palo Alto and Grenoble, will give researchers from academia and enterprise access to infrastructure and tools to continue their research initiatives.
  • Enhanced HPE Centers of Excellence (CoE): Designed to assist IT departments and data scientists who want to accelerate their deep-learning applications and realize better ROI from their deployments in the near term, the HPE CoE offer select customers access to the latest technology and expertise, including the latest NVIDIA GPUs on HPE systems. The current CoE span five locations: Houston; Palo Alto, Calif.; Tokyo; Bangalore, India; and Grenoble, France.
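To make concrete what the image-classification work described above involves, the following is a minimal, illustrative sketch of the train-and-evaluate loop at the heart of any such model. NumPy stands in for a full deep-learning framework, the synthetic 8x8 "images" and the simple logistic-regression classifier are hypothetical stand-ins, and none of this reflects HPE's actual reference designs — real deployments use GPU-accelerated frameworks on infrastructure like the systems described here.

```python
import numpy as np

# Illustrative only: a tiny classifier trained on synthetic "images".
# Real image classification uses deep networks on GPU infrastructure;
# this shows the same loss -> gradient -> update loop in miniature.

rng = np.random.default_rng(0)

def make_dataset(n_per_class=100, size=8):
    """Two synthetic classes: images bright on the left vs. the right."""
    half = size // 2
    X, y = [], []
    for label in (0, 1):
        imgs = rng.normal(0.0, 0.1, (n_per_class, size, size))
        if label == 0:
            imgs[:, :, :half] += 1.0   # class 0: bright left half
        else:
            imgs[:, :, half:] += 1.0   # class 1: bright right half
        X.append(imgs.reshape(n_per_class, -1))  # flatten to feature vectors
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

def train(X, y, lr=0.5, epochs=200):
    """Logistic regression via gradient descent: the same iterative
    fitting procedure a deep network uses, minus the hidden layers."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
        w -= lr * (X.T @ (p - y)) / len(y)      # gradient step on weights
        b -= lr * np.mean(p - y)                # gradient step on bias
    return w, b

X, y = make_dataset()
w, b = train(X, y)
preds = (X @ w + b > 0).astype(int)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

Because the two synthetic classes are cleanly separable, even this miniature model classifies them almost perfectly; the engineering challenge the reference designs address is doing the same at scale, with millions of real images and deep networks.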

In its mission to help make AI real for its customers, HPE also offers flexible consumption services for HPE infrastructure, which avoid over-provisioning, increase cost savings and scale up and down as needed to accommodate deep-learning deployments.

AI is becoming mainstream in the consumer world through applications such as voice interfaces, personal assistants and image tagging. However, the implications of AI go beyond consumer use cases to fields including genomic sequencing analytics, climate research, medical science, autonomous driving and robotics. These advancements and breakthroughs have been, and continue to be, based largely on deep learning.

For more information, read the blog post by Pankaj Goyal, HPE's Vice President of Artificial Intelligence, here.

Chris J. Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 15 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...