Managing data storage is more than making sure you have big-enough buckets for the bits. Several factors, including federal regulations and the continuing need to reduce costs, are forcing IT managers in a wide variety of industries to rethink their data storage and retention policies.
Data life-cycle management, or DLM, is the newest technology being offered as a solution to increasingly complex storage problems.
Simply put, DLM products help IT departments manage data throughout the course of the data's life span. DLM will be an important IT goal moving forward because it lets IT departments make better use of valuable storage resources while keeping data available, accessible and protected. An effective DLM solution also helps IT identify data usage patterns and automates the processes for moving, protecting and archiving data.
“Customers understand that there are tremendous benefits to implementing DLM solutions,” said Chris Wood, director of marketing and technical sales at Sun Microsystems Inc.'s Sun Network Storage unit. “They see a significant reduction of both the cost of storage acquisition and the subsequent cost to administer the storage by not keeping around superfluous, obsolete or low-priority data,” said Wood, in Newark, Calif. “They understand that the less core data they have to manage, the less they have to administrate, which results in a direct decrease in total cost of ownership.”
One of the biggest drivers for implementing a DLM solution is compliance with regulatory mandates. “There are, by some estimates, over 20,000 regulations worldwide that directly or indirectly impact the management of information,” said Howard Elias, EMC Corp.'s executive vice president of corporate marketing and new ventures, in Hopkinton, Mass. “Just figuring out what regulations exist and how they apply to a given customer's situation is often a substantial undertaking in and of itself, especially in regulated industries such as financial services and health care.”
The DLM products on the market today include older, repackaged technologies, such as HSM (hierarchical storage management) systems, and newer technologies, including file-system-level WORM devices and ATA-based nearline storage devices.
DLM Offers Graded Tiers of Storage
A common thread among DLM solutions is that they rely on graded tiers of storage and facilitate the movement of data from tier to tier. The logic behind storage tiers is that as data ages and is accessed less frequently, it should be moved to less expensive storage devices.
For example, primary-storage-tier devices, such as Fibre Channel storage systems, offer the fastest performance but also have the highest cost based on price per megabyte of storage—several times as much as simply archiving data on tapes. Moving stale data from the primary storage tier to a nearline storage device or a long-term archive, such as ATA-drive-based arrays, optical libraries or tape silos, can save a lot of money.
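The economics of tiering can be sketched with simple arithmetic. The per-gigabyte prices below are hypothetical placeholders, not vendor figures; the point is only the relative gap between tiers.

```python
# Hypothetical per-gigabyte prices; real figures vary widely by vendor and era.
TIER_COST_PER_GB = {
    "fibre_channel": 20.00,  # primary tier: fastest, most expensive
    "ata_nearline": 5.00,    # secondary tier: inexpensive online disk
    "tape_archive": 0.50,    # long-term archive: cheapest, slowest to access
}

def annual_savings(gb_moved: float, src: str, dst: str) -> float:
    """Rough storage-cost savings from moving data between tiers."""
    return gb_moved * (TIER_COST_PER_GB[src] - TIER_COST_PER_GB[dst])

# Moving 1 TB of stale data off primary Fibre Channel storage:
print(annual_savings(1024, "fibre_channel", "ata_nearline"))  # 15360.0
print(annual_savings(1024, "fibre_channel", "tape_archive"))  # 19968.0
```

Even with made-up prices, the shape of the calculation explains why vendors push tiering: savings scale linearly with the volume of stale data identified.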
While the concept of storage tiers is an old idea that's commonly associated with HSM products, DLM solutions include features that meet today's business needs.
“[DLM has helped customers] focus on and justify creating policies for the storage, retention and, ultimately, the disposition of all different types of electronic documents and unstructured data,” said Veritas Software Corp.'s Brenda Zawatski, vice president of product and solutions marketing, in Mountain View, Calif. “In addition, customers are seeing economic benefits by being able to leverage their existing investments in data management while creating a DLM solution to comply with regulations.”
HSM systems tend to be one-dimensional in the sense that data is moved based on elapsed time since last use, according to Storage Technology Corp.'s Mark Ward, vice president and general manager of Information Lifecycle Management Solutions, in Louisville, Colo. However, in today's business environment, information must be moved on a proactive basis—for example, to address quarter-end processing or retention for regulatory needs, Ward said.
Although HSM has been around for several years, it has never been popular in open-systems environments (in contrast to mainframe environments, where it is common).
“HSM solutions, particularly in open systems, have not gained traction in the marketplace as readily as you might expect,” said Karen Dutch, vice president of product management for Fujitsu Software Technology Corp., in Sunnyvale, Calif. “HSM solutions have some inherent problems in a multivendor environment. Management of the solution with symbolic links or stub files can be extremely time-consuming and is easily corruptible. Because of these limitations, there are serious concerns about the scalability of HSM systems.”
ATA: An Attractive Option
The recent emergence of low-cost ATA-based storage systems, however, has created an attractive secondary storage tier for IT managers who don't want to implement optical or tape nearline solutions.
According to Michael Marchi, senior director of enterprise marketing at Network Appliance Inc., NetApp customers are deploying ATA-based systems as a storage tier behind their primary storage. This strategy has yielded significant benefits, because ATA provides inexpensive, online, fast data access without the performance issues associated with tape.
“NetApp tiered-migration deployments have increased with the proliferation of nearline storage,” said Marchi, in Durham, N.C. “It is important to note that software, as associated with ATA storage systems, is critical in achieving the level of performance and reliability that customers demand. Simply purchasing a commodity ATA system without value-add software is a recipe for disaster.”
Another new concept is the use of WORM technologies to make hard-drive-based storage suitable for long-term archiving.
During interviews with storage vendors, eWEEK Labs was not surprised to find that those with tape solutions (including StorageTek and Hewlett-Packard Co.) were less enthusiastic about using hard drives for archiving than vendors that concentrate mostly on disks, including NetApp and EMC.
Some storage pundits and storage vendors who traditionally focus on disk-based storage have predicted the demise of tape for many years, but the market has consistently proved them wrong. Customer demand for tape solutions is strong, and HP, for one, does not see this changing in the near future. “Tape will continue to be an important media for customers,” said Rusty Smith, director of ILM for HP, in Houston.
eWEEK Labs believes that tape will be around for many years because it is a mature technology and has been proven as a removable storage medium.
Future Hurdles
The creation of open standards for DLM and ILM will be extremely important moving into the future.
Most DLM solutions today are primarily single-vendor solutions, a worrisome prospect for many IT managers: as long as no industrywide standards exist, the potential for vendor lock-in remains.
All the vendors we spoke with embraced the concept of open standards and expressed a willingness to partner with one another to add value to their DLM solutions. It will be interesting to see if this happens, as vendors try to balance the needs of their customers with their own desires to mold the market.
Considering that it has taken years for an acceptable level of SAN (storage area network) interoperability to become available and considering the relatively slow pace of storage management standards development, we expect it will be several years before DLM processes become standardized—if ever. However, we do expect to see partnerships formed among vendors.
IT managers should keep an eye on the Object-Based Storage Devices (OSD, formerly OBSD) Technical Working Group (www.snia.org/tech_activities/workgroups/osd). This group is designing standards for the next generation of storage devices, which will eventually be able to treat data as objects instead of merely as blocks and files. As object-based storage progresses, storage devices will be able to handle DLM tasks such as identifying data and automatically setting policies.
Also interesting are the application-level tools that allow IT managers to migrate old data out of databases, keeping database servers lean and mean. Embarcadero Technologies Inc. and Princeton Softech Inc. have interesting application-level products that will become increasingly important as the amount of data contained in databases continues to grow.
Implementation Guidelines
More than with most other technology implementations, eWEEK Labs recommends that IT managers know what they are getting into before they implement a DLM solution. A company's networking products, server hardware and software may change from year to year, but businesses maintain their storage systems for years or even decades.
IT managers might be tempted to implement something quickly to help deal with regulatory compliance, but it's never a good idea to implement a system without fully understanding all of the negative and positive business impacts.
DLM is not something you can acquire by purchasing a single product. To get the most out of DLM, IT managers need to know their business processes inside and out. By using storage resource management tools or checking application logs, IT managers can get a feel for the amount of stale data in the network and also get a realistic picture of data growth.
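Getting that feel for stale data can start with something as simple as a file-system sweep on access times. The sketch below is illustrative, not a substitute for a storage resource management tool; the `/data` path is a placeholder.

```python
import time
from pathlib import Path

def stale_files(root: str, days: int = 180):
    """Yield (path, size_in_bytes) for files not accessed in `days` days."""
    cutoff = time.time() - days * 86400
    for p in Path(root).rglob("*"):
        if p.is_file():
            st = p.stat()
            if st.st_atime < cutoff:
                yield p, st.st_size

# Example: total stale bytes on a volume (path is illustrative):
# total = sum(size for _, size in stale_files("/data"))
# print(f"{total / 2**30:.1f} GB not touched in six months")
```

One caveat: many volumes are mounted with access-time updates disabled for performance, in which case `st_atime` is unreliable and application logs are a better signal.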
The key to successful DLM implementation is balancing costs with business impact.
If an IT manager gets overzealous about migrating data from primary storage to slower storage tiers, the negative impact on users (long waits for data to be retrieved) could cost more than simply buying additional primary storage and migrating data less often.
For example, after initially observing performance problems, NASA's Advanced Supercomputing Division decided to keep smaller files on primary storage permanently instead of migrating them, since users didn't like to wait when retrieving smaller files. (To read our Labs On-Site about the NASA Advanced Supercomputing Division's DLM implementation, visit www.eWEEK.com/labslinks.)
In addition, IT managers should expect long implementation times when deploying DLM solutions. Even after all components of a DLM system have been configured, it can take weeks or even months to get all of the policies optimized.
IT departments should plan to conduct extensive trend analysis to see how the DLM components are performing and tune the systems accordingly.
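That tuning loop can be made concrete as a feedback rule: watch how often users recall migrated data, and loosen or tighten the migration age threshold accordingly. The thresholds, target rate and step size below are hypothetical, not recommendations.

```python
def tune_age_threshold(days: int, recall_rate: float,
                       target: float = 0.02, step: int = 30) -> int:
    """Adjust the migration age threshold from an observed recall rate.

    recall_rate: fraction of migrated files users recalled last period.
    Hypothetical policy: too many recalls means we migrated too eagerly,
    so migrate later; far too few means we can reclaim primary storage
    sooner.
    """
    if recall_rate > target:
        return days + step               # migrate later: fewer painful recalls
    if recall_rate < target / 2:
        return max(step, days - step)    # migrate sooner: reclaim space
    return days                          # within tolerance: leave policy alone

print(tune_age_threshold(180, 0.05))   # 210: users recalled too much data
print(tune_age_threshold(180, 0.005))  # 150: almost nothing was recalled
```

Running such a rule over weeks of trend data is one plausible shape for the months-long policy optimization described above.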
Security is another concern that must be addressed when implementing DLM solutions. Because these solutions typically create multiple copies of data on different tiers of storage, it's very important to ensure that each of the data repositories in a DLM implementation is secure.
For example, removable media such as tapes should be secured with encryption to prevent the theft of data from tapes stolen from a computer room or while in transit to an off-site facility.
Decru Inc. and NeoScale Systems Inc. are among the vendors that produce useful hardware-based encryption devices to protect the data on tape.
There is no question that DLM is difficult to implement, especially during tough economic times. But if it's done correctly, IT managers should be able to save money in the long run.
Senior Analyst Henry Baltazar can be contacted at henry_baltazar@ziffdavis.com