Halfway through calendar year 2007, trends in the data storage business are becoming more defined. Enterprise decision makers are now more aware of topics such as carbon footprints, green data centers, pulling less wattage “out of the wall,” and leaner, cleaner stored data, thanks to improving data deduplication methods.
Efficient data storage continues to be a key part of a well-run enterprise IT operation. Redundant data is the enemy: it is costly, wastes energy and slows storage I/O, traditionally the major performance bottleneck in storage systems.
All data roads eventually lead to some kind of storage, whether it be on 15K RPM Fibre Channel disks, a slower-running SATA (Serial ATA) drive, or a tape cartridge stored in an Iron Mountain vault.
Recent federal court e-discovery rules and commercial regulations are key drivers in forcing enterprises to re-examine their storage and data accessibility capacities, or else incur substantial risk in the case of litigation.
When eWEEK last looked at trends in December, we identified several that were beginning to make an impact. Those trends have indeed been borne out, among large enterprises and SMBs alike.
And there has been one relatively new trend that has come to the fore since then: thin provisioning.
Looking ahead to the second half of 2007, eWEEK sees the following trends continuing to gain momentum:
Deduplication and single-instance storage adoption.
There has been plenty of evidence recently that the concepts of deduplication and single-instance storage are winning acceptance in the market. When a company is able to go public based on a single technology, as Data Domain did on June 27 with deduplication as its main attraction, you know that trend has some traction.
Sparked by EMC’s October acquisition of market leader Avamar, other companies are buying or developing this functionality. Deduplication eliminates redundant data, down to sections of individual files, throughout a storage network, enabling the system to run faster and more cost-effectively. Quantum, Diligent, NetApp and others have joined the dedupe parade in recent months and have a clear advantage in the marketplace.
Avamar, whose nearly 500 customers include half of the Fortune 50, was founded by Jed Yueh, who told eWEEK that “data deduplication technology can transform archaic [digital tape] procedures, enabling automated, encrypted disaster recovery across existing wide area networks and accelerating the shift to disk as the de facto medium for data protection.”
Although the market often lumps deduplication solutions together, differences in implementation lead to significant differences in customer benefits.
“For instance, in real-world customer deployments, Avamar has seen as high as a 588:1 daily reduction in network traffic and data storage for backups, a rate of efficiency that dwarfs the competition,” Yueh said.
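The core idea behind deduplication — keep one physical copy of each repeated chunk of data and have every duplicate point to it — can be sketched in a few lines. This is only a minimal illustration, assuming fixed-size blocks and SHA-256 fingerprints; production systems such as Avamar’s use variable-size, content-defined chunking, and the function names here are hypothetical:

```python
import hashlib

def dedupe_store(streams, block_size=4096):
    """Store several data streams, keeping only one physical copy of
    each unique fixed-size block (a simplified deduplication scheme)."""
    store = {}    # SHA-256 digest -> block bytes (the single instance)
    recipes = []  # per-stream list of digests, used to reassemble it
    for data in streams:
        recipe = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)  # only the first copy is kept
            recipe.append(digest)
        recipes.append(recipe)
    return store, recipes

def restore(store, recipe):
    """Rebuild a stream from its recipe of block fingerprints."""
    return b"".join(store[d] for d in recipe)
```

Two nightly backups that differ only slightly would share almost all of their blocks, which is where reduction ratios like the one Yueh cites come from: only changed blocks consume new storage or network bandwidth.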
Thin provisioning.
Thin provisioning is a method of storage resource management and virtualization that lets IT administrators limit the allocation of actual physical storage to what applications immediately need. It enables the automatic addition of capacity on demand, up to preset limits, so that IT departments can avoid buying and managing excess disk storage.
EqualLogic, Hitachi Data Systems, EMC, NetApp, 3PAR and CommVault all offer thin provisioning for either SAN (storage area network) or iSCSI storage systems. This will become a key factor in the “green” data centers yet to be built.
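The allocate-on-write behavior described above can be shown with a toy model: the volume reports a large logical size but consumes physical blocks only when each one is first written, and it refuses new allocations past a preset cap. All class and method names here are hypothetical; real arrays manage extents and metadata far more elaborately:

```python
class ThinVolume:
    """A virtual volume that advertises a large logical size but
    allocates physical blocks lazily, up to a preset physical limit."""

    def __init__(self, logical_blocks, physical_limit, block_size=4096):
        self.logical_blocks = logical_blocks    # size the host sees
        self.physical_limit = physical_limit    # blocks actually purchased
        self.block_size = block_size
        self.blocks = {}  # logical block number -> backing bytearray

    def write(self, block_no, data):
        if not 0 <= block_no < self.logical_blocks:
            raise IndexError("block outside logical volume")
        if block_no not in self.blocks:
            if len(self.blocks) >= self.physical_limit:
                # in a real array this would trigger an "add capacity" alert
                raise RuntimeError("physical capacity exhausted")
            self.blocks[block_no] = bytearray(self.block_size)  # allocate on demand
        self.blocks[block_no][:len(data)] = data

    def allocated_bytes(self):
        """Physical storage actually consumed, regardless of logical size."""
        return len(self.blocks) * self.block_size
```

An application can thus be given a multi-terabyte logical volume while the data center buys only the disk it is actually using, which is why thin provisioning figures into plans for greener data centers.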
Enterprise features trickle down to SMBs.
Features such as deduplication, centralized data center automation and virtualization that weren’t available at affordable price points for small and midsize businesses a year ago continue to move into smaller packages for the midmarket.
All the major players (Hewlett-Packard, IBM, EMC, NetApp, Sun Microsystems, Quantum, CA and others) are fighting it out for attention here.
There has been increasing use of virtualization, both in systems and for files themselves. If a technology simplifies and centralizes control (and virtualization does this in spades), system administrators will jump for it. Gartner, Enterprise Strategy Group and IDC all point to virtualization as the longest-running trend in storage.
NAND flash memory evolution
NAND flash memory continues to evolve to higher capacities and is becoming more prevalent in larger devices, such as laptop and desktop computers.
Dell previewed its first hybrid solid-state NAND flash/disk drive-based laptop two months ago. Samsung was the first to debut a completely solid-state laptop last year; other computer makers are expected to follow suit later this year.
Even as fabricators learn to get more mileage out of flash chips, an adjunct to this trend is developing. Intel is busily developing PCM (phase change memory) chips. PCM is a nonvolatile memory that works well both for executing code and for storing large amounts of data, giving it a superset of the capabilities of flash memory and dynamic random access memory.
This means it can execute code at speed, store large amounts of data and sustain millions of read/write cycles. There is more to come on this front soon.
Storage hardware efficiency
Storage hardware, in both spinning disk and tape, continues to add capacity even as it drops in cost. This trend has been with us for about five years, according to IDC and Gartner/Dataquest, and doesn’t appear to be slowing anytime soon.