If there’s one word that spells out a major data storage trend for 2009, it’s “automation.” In its simplest form, we’re talking about the creative intersection of business intelligence, botlike software and data storage arrays.
Storage companies are finding ways to automate processes that used to be painstaking, tedious and expensive to handle. For example, storage tiering and change management priorities can now be dialed up from anywhere in the world on central, Web-based consoles supplied by a rapidly growing number of vendors.
Storage tiering keeps often-accessed data on a fast Tier 1 spinning or solid-state disk, by far the most power-hungry option; Tier 2 data, accessed less frequently, is kept on slower, cheaper SATA (Serial ATA) disks. Tier 3 is tape storage for data that may never see the light of day again.
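Stripped to its core, an automated tiering engine is a placement rule keyed on access frequency. Here is a minimal Python sketch of that idea; the thresholds and tier names are invented for illustration and do not come from any vendor’s product.

# Hypothetical tiering rule: hot data stays on fast disk, colder
# data migrates down the tiers. Thresholds are invented.
def assign_tier(reads_per_day: float) -> str:
    if reads_per_day >= 100:              # hot data
        return "tier1-fast-disk-or-ssd"
    if reads_per_day >= 1:                # warm data
        return "tier2-sata"
    return "tier3-tape"                   # archival, may never be read again

for volume, reads in {"orders-db": 5400, "q3-reports": 12, "2004-archive": 0}.items():
    print(volume, "->", assign_tier(reads))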
Intelligent software in the data center is doing more of the heavy (and often very intricate) lifting. Old-school manual labor, in which IT staff met once or twice a month to print out all the application patches and security updates on spreadsheets and walk them out to production locations, is finis.
The process of creating storage access and security policies has also been sped up, with wizards and drop-down menus becoming commonplace and templates popping up everywhere. Administrative jobs that used to take hours or days now get done in minutes.
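To make the template idea concrete, here is a hypothetical sketch of how a policy template collapses a long manual setup session into a one-line call; every field name and default below is invented, not taken from any vendor’s console.

# Invented storage-policy template: fill in a few parameters and the
# rest of the policy is stamped out automatically.
TEMPLATE = {
    "replication": "async",
    "snapshot_schedule": "hourly",
    "retention_days": 30,
    "allowed_groups": ["storage-admins"],
}

def make_policy(volume: str, **overrides) -> dict:
    # Stamp out a full policy, overriding only the fields that differ.
    policy = dict(TEMPLATE, volume=volume)
    policy.update(overrides)
    return policy

# One line replaces a manual, multi-screen configuration session.
print(make_policy("finance-lun-07", retention_days=90))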
On the operations side, data centers are being kept cooler, with less electrical draw, by automated variable-speed fans and pumps, which are replacing traditional CRAC (computer room air conditioning) and CRAH (computer room air handler) units whose fans run at a single speed. When a section of server racks is cool enough, the variable-speed fans automatically slow down to save energy; when the racks need more cooling, they speed up.
This all adds up, because a fan’s power draw varies roughly with the cube of its speed: a 10 percent reduction in fan speed yields approximately a 27 percent reduction in the fan’s electrical use, and a 20 percent reduction in speed yields electrical savings of approximately 49 percent.
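A quick back-of-the-envelope check of those numbers in Python, using the standard cube-law approximation from the fan affinity laws (a general engineering rule of thumb, not a vendor-supplied formula):

# Fan affinity law: electrical power scales with the cube of fan speed.
for reduction in (0.10, 0.20):
    speed_fraction = 1.0 - reduction
    power_fraction = speed_fraction ** 3          # cube law
    print(f"{reduction:.0%} slower -> ~{1.0 - power_fraction:.0%} less power")
# Prints: 10% slower -> ~27% less power; 20% slower -> ~49% less power.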
Automated Features Are Trickling Down from HPC Shops
These are the advantages high-performance computing shops have enjoyed for years. Now those features are finding their way into most enterprise systems, and even some small and midsize business systems.
“Systems are becoming a lot more intelligent, quicker to adapt to changes in the environment,” Willy Chiu, IBM’s vice president of High Performance On Demand Solutions, told me. “You start off with these traditional processes, which we call BSS (business support systems) and OSS (operations support systems). Those will be automated, and they will leverage these cloud [storage] computing infrastructures and will provide more links to your business processes.
“IT becomes a much more intelligent organization, smarter about the environment you’re in; therefore, you can leverage it to drive your businesses across this new environment.”
Strategic Automation in the Data Center
Bryan Doerr, CTO of Savvis, a hosted IT service provider, explained to eWEEK that his company has been all about automated data center infrastructure.
“Underpinning all cloud [storage and computing] delivery is automation,” Doerr said. “If you don’t have the automation associated with delivery of the infrastructure, there is no way you can keep up with the move/add changes associated with clouds. Furthering this thought, enterprises have no business trying to develop all that automation. Why? Why are you investing in that, buying the software, messing around with all that stuff?
“And by the way, the only way we get dynamic infrastructure, infrastructure that is totally responsive to changes in the way the application is performing in real time, if you get the feedback loop, the transaction rate, the provisioning environment, the only way you’re going to get that is with automation,” Doerr said. “That’s nirvana.”
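As a rough illustration of the feedback loop Doerr describes, here is a minimal Python sketch that sizes capacity from an observed transaction rate; the thresholds, headroom factor and provisioning stub are all invented and reflect no particular provider’s software.

import math

def provision(servers: int) -> None:
    # Stub standing in for a real provisioning API call.
    print(f"provisioning environment resized to {servers} servers")

def autoscale(current: int, tx_per_sec: float, per_server: float = 500.0) -> int:
    # Plan capacity from the observed transaction rate, with ~25% headroom.
    needed = max(1, math.ceil(tx_per_sec * 1.25 / per_server))
    if needed != current:
        provision(needed)
    return needed

servers = 4
for rate in (1800.0, 2600.0, 900.0):  # simulated feedback-loop readings
    servers = autoscale(servers, rate)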
Savvis provides services to about 4,000 enterprises that include managed hosting, co-location and network connectivity, supported by the company’s global data center and network infrastructure. The company says it delivers “IT infrastructure as a service” by combining cloud technology, a global network and 29 data centers in the United States, Europe and Asia. It also has automated management and provisioning systems and a best practices operations model.
Some specific examples of how automation is becoming more strategic in the data center:
– Earlier in 2008, Compellent Storage Center 4.0 became the first networked storage system to manage data inside the storage volume, so that it can automate tiered storage within every drive.
– Sun Microsystems’ open-source Lustre parallel file system, a major part of Sun’s Open Storage approach for enterprise customers, is gaining a reputation as one of the speediest storage management packages available, since it works alongside the super-fast ZFS (Zettabyte File System).
– Onaro, now a division of NetApp, makes SANscreen Capacity Manager 1.0 and SANscreen Provisioning Manager 1.0, which enable enterprises to obtain an automated view of networked storage and provisioning.
– Symantec’s CommandCentral 5.1, announced Dec. 16, became the first storage optimization suite to integrate traditional storage resource management with automated storage change management.
– IBM has used autonomic computing practices for years. Autonomic, or self-healing, computing enables systems to self-diagnose components on a regular basis and troubleshoot problems with little or no human intervention.
Autonomic computing is a self-management mechanism for a system or systems. Autonomic IT systems can make pre-programmed “decisions” for themselves, detecting problems and solving them very quickly to keep the data center operational. At its best, the process actually prevents problems from happening in the first place, through a combination of business and operational intelligence gained by constant collection of data. (A toy sketch of such a self-healing loop appears after this list.)
IBM is now including many of these features in its new System z mainframes, as well as in its WebSphere application servers.
– Intel is testing low-power, self-sustaining sensors that can gather and record data on weather and other environmental conditions, as well as larger sensors with transmitters that can help monitor and run data centers. This type of automation will help IT managers control conditions in their data centers automatically, freeing them to handle pressing issues other than power and cooling.
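Returning to the autonomic computing entry above: the pattern is usually described as a monitor-analyze-plan-execute loop. The toy Python loop below sketches that generic pattern; the sensor readings, rules and remedies are all made up for illustration and are not IBM’s implementation.

# Toy autonomic loop: monitor components, diagnose symptoms and apply a
# pre-programmed fix with no human in the loop. All values are invented.
SENSORS = {"controller_temp_c": 71, "disk_queue_depth": 3, "path_count": 1}

RULES = [  # (symptom check, pre-programmed remedy)
    (lambda s: s["controller_temp_c"] > 70, "raise fan speed on hot controller"),
    (lambda s: s["disk_queue_depth"] > 32, "rebalance I/O to idle spindles"),
    (lambda s: s["path_count"] < 2, "fail over to a redundant path"),
]

def autonomic_pass(sensors):
    # One monitor-analyze-plan-execute cycle: diagnose, then act.
    for check, remedy in RULES:
        if check(sensors):
            print("self-healing action:", remedy)

autonomic_pass(SENSORS)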