IBM (NYSE: IBM) announced several enhancements to its Smarter Computing initiative, including a broad array of performance and efficiency improvements to its storage and technical computing systems focused on handling big data.
As part of its ongoing Smarter Computing effort, IBM on June 4 announced a new strategic approach to designing and managing storage infrastructures with greater automation and intelligence, as well as significant performance enhancements to several key storage systems and the Tivoli Storage Productivity Center suite.
At the same time, the company announced its first offerings that incorporate software from IBM's acquisition of Platform Computing earlier this year. These offerings are intended to help a broader set of enterprise customers use technical computing to achieve faster results with applications that require substantial computing resources to process growing volumes of data. IBM announced the news at its IBM Edge 2012 conference in Orlando, Fla.
"Enterprises are dealing with data that is increasing exponentially in both size and complexity," said Rod Adkins, senior vice president of IBM Systems & Technology Group (STG), in a statement. "The enhanced systems and storage solutions we're announcing today have the performance, efficiency and intelligence to handle this big data. This is smarter computing that allows our clients to organize and analyze their data to better understand and serve their customers."
According to a survey of more than 300 global CIOs conducted by IBM and IDC, the most efficient companies have been able to direct more than 50 percent of their IT spending to new projects that are transformative to their business. By implementing techniques such as virtualization, de-duplication, automated tiering and cataloging, IT leaders are able to reduce the time their architects spend provisioning storage by up to 50 percent, as well as reduce costs by up to 20 percent.
With more than 256 petabytes of client data managed, IBM has been building a portfolio of products and technologies for the past several years toward this end and is now announcing a formal approach behind it called IBM Smarter Storage. With this approach, customers are able to architect storage infrastructures that leverage such leading-edge technologies as real-time compression and automated tiering to help get more performance out of their systems, faster and for less cost.
Key Enhancements Drive Initiative
Moreover, to drive this initiative further, IBM is announcing enhancements to several key products. For example, it is adding real-time compression to the IBM Storwize V7000, as well as to the IBM System Storage SAN Volume Controller (SVC), the company's industry-leading storage virtualization system.
Unlike traditional storage systems that compress only low activity data, or data not frequently accessed, real-time compression on the Storwize V7000 and SVC systems compresses active data by as much as 80 percent, increasing total effective storage capacity by up to five times. In addition to real-time compression, IBM also added four-way clustering support for Storwize V7000 block systems that can double the maximum system capacity to 960 drives or 1.4 petabytes.
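The capacity arithmetic behind that claim is straightforward: shrinking data by 80 percent leaves each byte occupying one-fifth of its original space, so the same physical disks hold roughly five times as much logical data. A minimal sketch (the `effective_capacity` helper is illustrative, not an IBM tool):

```python
def effective_capacity(raw_tb, compression_pct):
    """Effective capacity when data shrinks by compression_pct percent:
    if each byte occupies (1 - pct) of its original size, the same
    disks hold 1 / (1 - pct) times as much logical data."""
    remaining = 1.0 - compression_pct / 100.0
    return raw_tb / remaining

# 100 TB of physical disk with 80 percent compression:
print(round(effective_capacity(100, 80)))  # 500, five times the raw capacity
```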
"From the move to electronic records to the ballooning sizes of medical images, storage in medical centers as large as ours is rapidly becoming ground zero for big data," Rick Haverty, director of the information systems division at the University of Rochester Medical Center, said in a statement. "IBM has recognized the need to start approaching the management of the growing data volumes in a strategic, smarter way, through built-in intelligence, automation and the cloud, to gain greater performance, reliability and better economics."
IBM added efficiency and performance boosts to several other systems as well, including:
- IBM System Storage DS3500, designed for small and midsized organizations, and DCS3700, designed with high density for high-performance computing environments, now feature Enhanced FlashCopy capabilities that deliver 50 percent more snapshots to speed up backups, as well as thin provisioning, which helps increase utilization of disk storage while lowering storage costs by allocating storage to applications from a shared pool on an as-needed basis.
- IBM Tape System Library Manager (TSLM) is new software that expands and simplifies the use of IBM TS3500 tape libraries by providing customers a single, consolidated view of multiple libraries. TSLM works with multiple generations of enterprise drives, Linear Tape-Open (LTO) drives and media to pool data into a single reservoir of tape that can be managed from a central point through IBM Tivoli Storage Manager.
- IBM Linear Tape File System (LTFS) Storage Manager is new software that provides lifecycle management of multimedia files, such as large video files, to customers using IBM LTO 5 tape libraries and IBM's LTFS Library Edition. As a result, both video archive licensing costs and videotape cartridge costs can be dramatically lowered.
- Enhancements to the IBM Tivoli Storage Productivity Center (TPC) suite will enable organizations to better manage big data storage requirements. With a new Web-based user interface, TPC can radically change the way IT managers view and manage their storage infrastructures. Also new to TPC is the integration of IBM Cognos, which provides intuitive reporting and modeling that can enable customers to easily create high-quality ad hoc and customized reports for better decision making. TPC offers simplified packaging that provides comprehensive management, discovery, configuration, performance and replication in a single license.
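The thin provisioning mentioned above for the DS3500 and DCS3700 works by advertising a large logical size while allocating physical capacity only when a block is first written. A toy sketch of the idea (the class and its names are illustrative assumptions, not IBM's implementation):

```python
class ThinVolume:
    """Toy thin-provisioned volume: the advertised (logical) size is
    large, but physical blocks are allocated only on first write.
    Illustrative only, not IBM's implementation."""

    def __init__(self, logical_blocks):
        self.logical_blocks = logical_blocks  # size advertised to the host
        self.store = {}                       # physical blocks, allocated lazily

    def write(self, block, data):
        if not 0 <= block < self.logical_blocks:
            raise IndexError("block out of range")
        self.store[block] = data              # allocate on first write

    def allocated(self):
        return len(self.store)                # physical blocks actually in use

vol = ThinVolume(logical_blocks=1_000_000)    # advertise a million blocks
vol.write(0, b"boot")
vol.write(42, b"data")
print(vol.allocated())  # 2: only two physical blocks back the whole volume
```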
In addition to these enhancements, IBM announced plans to extend its Easy Tier capabilities to direct-attached, server-based solid-state drives (SSDs) to help customers coordinate data migration between their disk systems and servers, advancing the Smarter Storage approach further. IBM Easy Tier automatically moves data to the most appropriate storage, including multiple tiers of disk and SSD, based on policy and activity.
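Activity-based tiering of the kind Easy Tier performs can be sketched in a few lines: frequently accessed extents are promoted to SSD and the rest stay on spinning disk. The `place_extents` helper, extent names and thresholds below are illustrative assumptions, not IBM's actual policy:

```python
# Illustrative sketch of activity-based tiering: the hottest extents
# (by access count) are placed on SSD, everything else on HDD.
def place_extents(access_counts, ssd_slots):
    """Assign the ssd_slots most-accessed extents to 'ssd', the rest to 'hdd'."""
    by_heat = sorted(access_counts, key=access_counts.get, reverse=True)
    hot = set(by_heat[:ssd_slots])
    return {extent: ("ssd" if extent in hot else "hdd")
            for extent in access_counts}

counts = {"e1": 900, "e2": 12, "e3": 450, "e4": 3}
placement = place_extents(counts, ssd_slots=2)
print(placement)  # e1 and e3 land on SSD; e2 and e4 stay on disk
```

A production tiering engine would also weigh recency and migration cost, but the core decision, rank by activity and fill the fast tier first, is the same.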
Meanwhile, once considered the domain of supercomputing, workloads such as simulations, computer modeling and analytics are increasingly being adopted by a broader set of mainstream clients to drive business benefits.
Making Technical Computing Easier to Use
To make technical computing easier to use, IBM is enhancing its portfolio of hardware platforms with software to create integrated solutions that can help enterprises more quickly derive value from high-performance applications that require a lot of computing power and data. At the same time, IBM is committed to maintaining support for non-IBM systems with existing Platform Computing partners.
"We are integrating the Platform Computing technology into our fold, starting with the IBM Platform Symphony family, a grid manager," David Geraldi, vice president of STG Competitive Labs, told eWEEK. IBM completed its acquisition of Platform Computing in January 2012.
The IBM Platform Symphony family is a grid manager that is now integrated with the MapReduce software framework to provide faster throughput and performance for demanding analytics and big data workloads in a single grid environment. Platform Symphonys resource-sharing model makes it cost-efficient for clients to expand their analytics environment as needed, Geraldi said.
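MapReduce itself is a simple programming model: a map phase emits key-value pairs and a reduce phase folds each group of values; what Symphony contributes is scheduling those phases across a grid. A minimal single-process sketch of the model (illustrative only, not Symphony's API):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Run the MapReduce model in-process: map each record to (key, value)
    pairs, group by key, then reduce each group to a single result."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):      # map phase
            groups[key].append(value)
    return {key: reducer(key, values)          # reduce phase
            for key, values in groups.items()}

# Classic word count over two input lines:
lines = ["big data", "big grids"]
counts = map_reduce(
    lines,
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
print(counts)  # {'big': 2, 'data': 1, 'grids': 1}
```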
The IBM System x Intelligent Cluster is integrated with IBM Platform HPC software to simplify cluster deployment, deliver results more quickly and improve productivity so clients can focus on research and analysis instead of managing their IT infrastructure. And the High Performance Computing (HPC) Cloud portfolio from IBM has been expanded with the new IBM Platform Cluster Manager offering as well as IBM Platform Load Sharing Facility (LSF) to provide clients with a shared pool of cloud resources available from anywhere, making it easy to create and manage HPC clouds.
The new IBM Platform Cluster Manager enables clients to self-provision clusters in minutes and automatically, dynamically manage cluster environments that include both IBM Platform Computing and non-IBM workload managers.
These new offerings will help IBM aggressively pursue the combined technical computing opportunity that IDC projects at $20.3 billion for 2012 and expects to grow 7.6 percent annually to almost $29.2 billion by 2016.
Other technical computing offerings IBM is announcing today include:
- IBM Platform LSF family: IBM Platform LSF is a comprehensive set of intelligent, policy-driven workload management and scheduling tools that can be used to manage high-performance workloads across a distributed, virtualized IT environment at up to 100 percent utilization to help keep costs low.
- IBM General Parallel File System: GPFS now includes Active File Management (AFM) software to provide fast, trusted access to unstructured data, regardless of where the data resides, so it can quickly be turned into insight.
- IBM System x iDataPlex dx360 M4: The latest iDataPlex system can double performance on selected workloads with the latest graphics processing units (GPUs) from Nvidia, increases maximum memory to 512GB and, with the new slotless Fourteen Data Rate (FDR) InfiniBand adapter, provides superior performance and flexibility for technical computing environments.
Moreover, complementing the new storage enhancements, IBM is offering a suite of services encompassing virtualization, automation and cloud technologies that address both the supply and demand sides of storage. The new offerings use analytics and automation to infuse intelligence into everyday workflow with tools and services, including the following:
- Intelligent Storage Service Catalog (ISSC) improves the way storage is used by simplifying how it is requested. ISSC promotes more efficient storage allocation and governance by establishing standards that can be used to optimize provisioning, backup, replication and archiving.
- IBM SmartCloud services for storage provide support for IBM's family of cloud storage products, including IBM SmartCloud Managed Backup, IBM SmartCloud Archive and IBM SmartCloud Object Storage, which improve data backup, resiliency and management of massive quantities of unstructured data from a cloud environment. The cloud offers clients flexibility, cost effectiveness and faster recovery times, making archiving, exploration and analysis faster and easier for businesses.