HDS, Pentaho Join Forces on Hyper-Converged Data Center System

HSP 400 is a single unit that supports big data processing, embedded business analytics and intuitive data management.

Hitachi Data Systems has huddled with its corporate cousin, data integration and business analytics provider Pentaho, to come up with a software-defined, hyper-converged platform for big data deployments that serves as an alternative to those being marketed by VCE/VMware, HPE, Cisco Systems, Oracle and IBM.

HDS on Feb. 3 introduced its next-generation Hitachi Hyper Scale-Out Platform 400, which includes native integration with the Pentaho Enterprise Platform. Pentaho, a Hitachi Group company, is a data integration and business analytics provider with an enterprise-class, open source-based platform for diverse big data deployments.

Hyper-convergence is a type of infrastructure system with a software-centric architecture that tightly integrates compute, storage, networking and virtualization resources and other technologies from scratch in a commodity hardware box supported by a single vendor.

By combining compute, storage and virtualization management in one appliance, the HSP 400 delivers big data processing, embedded business analytics and intuitive data management from a single unit.

Enterprises are looking for ways to extract value from the massive volumes of data generated by day-to-day business operations, the Internet of things, social networking and machines across their IT systems. HDS says its software-defined architecture centralizes the storage and processing of large data sets with high availability, simplified management and a pay-as-you-grow model.

HDS claims the architecture also provides a scalable infrastructure for big data, including a centralized user interface that automates the deployment and management of virtualized environments for open-source big data frameworks such as Apache Hadoop and Apache Spark, as well as commercial open-source stacks such as the Hortonworks Data Platform.

The native integration with Pentaho Enterprise Platform gives users complete control of the analytic data pipeline and enterprise-grade features such as big data lineage, life cycle management and enhanced information security.

"We consistently hear from our enterprise customers that data silos and complexity are major pain points—and this only gets worse in their scale-out and big data deployments," HDS Senior Vice President Sean Moser said. "We have solved these problems for our customers for years, but we are now applying that expertise in a new architecture.

"Our HSP appliance gives them a cloud and IoT-ready infrastructure for big data deployments, and a pay-as-you-go model that scales with business growth. Seamless integration with the Pentaho Platform will help them put their data to work—faster," Moser said.

Hitachi offers HSP in two configurations to support various enterprise applications and performance requirements: serial-attached SCSI (SAS) disk drives, generally available now, and all-NAND flash, expected to ship in mid-2016.

Chris J. Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 13 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...