Hitachi Data Systems Joins the Software-Defined Crowd
HDS says it can now address customer data workload requirements "from entry to the mainframe" with a single software stack.

Hitachi Data Systems, an old-school IT company that often comes late to a trend but knows how to find customers who like its style, has joined the software-defined, hyper-converged, data analytics-based data center equipment crowd that already includes Cisco Systems, Hewlett-Packard, Oracle, EMC, Dell and others. On Day 1 of its Connect conference in Las Vegas, HDS on April 28 promised to be all things to all enterprises, pledging to address customer data workload requirements "from entry to the mainframe" with a single software stack.

Hitachi has expanded its Unified Compute Platform lineup with new hyper-converged and converged servers and storage that it says can now support small to super-large IT workloads. The latest additions to the UCP line include the hyper-converged Hitachi UCP 1000 for VMware EVO:RAIL and the mildly converged Hitachi UCP 2000, both new rack servers best suited to small and midsize deployments or remote and branch offices. The Hitachi UCP 6000 converged model combines the company's recently launched CB 2500 blade servers with new storage arrays noted later in this story. With the addition of Hitachi Unified Compute Platform Director infrastructure automation middleware, the UCP package ostensibly allows for fast provisioning of infrastructure to manage fast-changing workload requirements.
HDS also has made available its own so-called "data lake" archive (another trendy term) for big data analytics, built on its new Hyper Scale-Out Platform, which the company contends provides cost-effective compute performance and on-demand capacity.