Suggestion No. 1: Leverage the Value of a Backup Assessment
A backup assessment by a knowledgeable consultant or an in-house resource can give you invaluable information about the efficiency and utilization of your current environment. A backup assessment is typically a low-cost, painless process in which you or your consultant runs reporting software that measures a variety of metrics in your backup environment. These metrics include data risk, backup and restore performance, resource utilization, power consumption and total cost of ownership.
The results of this detailed evaluation, along with recommendations for improvement, are then provided in a comprehensive report. Backup assessments can help you get the most from existing systems by finding “orphaned” storage and resources, identifying bottlenecks, reducing backup failure rates, cutting power and labor costs, calculating return on technology investment and developing meaningful metrics for measuring improvement.
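To make the metrics concrete, a minimal sketch of how two of them (failure rate and average throughput) can be derived from backup job records. The record format and field names here are invented for illustration; real assessment tools gather far more than this:

```python
# Sketch: computing assessment-style metrics from backup job records.
# The BackupJob record format is hypothetical, not any vendor's schema.
from dataclasses import dataclass

@dataclass
class BackupJob:
    client: str
    bytes_written: int   # bytes actually backed up
    succeeded: bool
    duration_s: float    # wall-clock time of the job

def assessment_metrics(jobs):
    total = len(jobs)
    failed = sum(1 for j in jobs if not j.succeeded)
    # Throughput only makes sense for jobs that completed.
    rates = [j.bytes_written / j.duration_s
             for j in jobs if j.succeeded and j.duration_s > 0]
    return {
        "failure_rate": failed / total if total else 0.0,
        "avg_throughput_mb_s": (sum(rates) / len(rates)) / 1e6 if rates else 0.0,
    }

jobs = [
    BackupJob("db01", 500_000_000, True, 600.0),
    BackupJob("web01", 80_000_000, False, 120.0),
]
print(assessment_metrics(jobs))
```

A real assessment would also correlate these numbers with capacity, power and cost data, but even this simple failure-rate calculation is the kind of baseline an assessment report starts from.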
Suggestion No. 2: Virtualize Your Backup
Many companies recognize the savings that can be achieved through virtual server technology, which enables multiple virtual machines to run side by side on the same physical machine. However, you may also need a virtual backup environment to deliver the performance and flexibility required to get the most benefit from these environments. Consolidating backups to a VTL (virtual tape library) reduces the use of costly primary storage and uses space and power more efficiently than physical tape.
Virtual server software removes many of the physical world’s limitations, enabling you to create servers easily and to allocate the appropriate level of processing and storage for each application run on each server. By running as many as 100 servers on a single physical host machine, you can improve utilization and reduce footprint in the data center.
Although you can back up your virtual servers without changing your physical backup environment, a more efficient method is to use the virtual server software to consolidate backups and then use a virtual tape environment to back them up.
With this method, the virtual server software backs up each virtual server as a stand-alone server. It takes a snapshot of each virtual server and mounts it to a backup proxy server. Standard backup software on the proxy server backs up all of the virtual servers in a single, consolidated backup to the VTL. This method provides the disk-based performance needed to store and restore snapshots on the VTL, instead of more costly primary storage. It also provides the flexibility to create virtual tape libraries as large as you may need for efficient management and to store more data in less space.
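The steps above can be sketched as a short script. Every function name here is a hypothetical placeholder, not a real hypervisor or backup API; in practice the hypervisor's snapshot facility and the backup vendor's tooling do this work:

```python
# Sketch of the consolidated virtual-machine backup flow described above.
# All helper functions are hypothetical placeholders, not a real API.

def take_snapshot(vm):
    """Ask the hypervisor for a point-in-time snapshot of one VM."""
    return f"{vm}-snapshot"

def mount_on_proxy(snapshot, proxy):
    """Mount the snapshot on the backup proxy server."""
    return f"/mnt/{proxy}/{snapshot}"

def backup_to_vtl(paths, vtl):
    """One consolidated backup job writes every mounted snapshot to the VTL."""
    return {"target": vtl, "items": list(paths)}

vms = ["vm01", "vm02", "vm03"]
mounted = [mount_on_proxy(take_snapshot(vm), "proxy01") for vm in vms]
job = backup_to_vtl(mounted, "vtl01")   # a single job covers all VMs
print(job["target"], len(job["items"]))
```

The key design point is that the backup software on the proxy sees ordinary mounted file systems, so one standard, consolidated job protects every virtual machine without installing agents in each guest.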
For example, at Sterling Testing, soon after we started our server virtualization program, we realized that our physical tape systems were not fast enough to keep up with our data growth. Our tape systems also lacked the management flexibility we needed to get the most value from our VMware environment.
After evaluating several virtual tape and disk-to-disk systems, Sterling chose a Sepaton S2100-ES2 VTL to protect more than 4.8TB of data on 200 virtual machines. VMware software is used to take snapshots of the virtual machines and consolidate them on a proxy server. We then use standard EMC NetWorker backup software to perform a single, consolidated backup of the snapshots to the Sepaton VTL.
The backup and restore performance of the Sepaton VTL is so fast that we can back up our snapshots directly to it, without going to more costly primary storage. It also lets us adjust our snapshot frequency and retention times for optimal protection and cost savings. Sterling retains one snapshot per day for 21 days, saving more than 100TB of costly, near-primary storage.
Suggestion No. 3: Compress and Deduplicate Data
Data deduplication, combined with hardware compression, can enable you to store as much as 50 times more data in the same footprint. Storing less data requires less physical space, power and cooling.
There are several deduplication technologies available, each using a different approach to reduce duplicate data. Published deduplication ratios alone are not a reliable basis for comparison: there is no industry standard for how these ratios are calculated, and many variables affect the deduplication efficiency of a given environment. The following criteria can help you choose the best technology for your specific environment:
1. Backup performance: Ensure data deduplication technology does not impede your ability to meet backup windows.
2. Restore performance: Evaluate the length of time required to restore data from a deduplicated backup.
3. Scalability: For enterprises with double-digit annual data growth, scalability is essential. Some deduplication technologies have limited scalability and require you to store data in multiple, separately managed “silos of storage,” an inherently inefficient methodology.
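To see how deduplication collapses repeated data into a single stored copy, here is a toy fixed-block sketch. Production systems use variable-length chunking, persistent indexes and compression, so treat this purely as an illustration of the mechanism, not of any vendor's implementation:

```python
# Toy fixed-block deduplication: store each unique block exactly once.
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Return (store of unique blocks, ordered refs that reconstruct the stream)."""
    store = {}   # block hash -> block bytes (one copy per unique block)
    refs = []    # ordered hashes; replaying them reconstructs the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)   # duplicates add a ref but no new block
        refs.append(h)
    return store, refs

# Two "backups" of the same data: the second copy adds no new blocks.
data = b"A" * 8192 + b"B" * 4096
store, refs = dedupe(data + data)
ratio = len(refs) * 4096 / sum(len(b) for b in store.values())
print(f"{ratio:.0f}:1")   # prints 3:1
```

Note how the criteria above map onto even this toy: hashing every block costs backup-time CPU (criterion 1), reconstruction must reassemble blocks from the store (criterion 2), and the index of hashes must itself scale with the data (criterion 3).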
Suggestion No. 4: Use Thin Provisioning in Your Backup Environment
The physical capacity of a tape or a virtual tape that is actually used is often much lower than its maximum capacity. In addition, some applications and virtual server software require you to allocate more capacity than you actually use.
Thin provisioning enables you to create as many virtual cartridges as you want, without needing the same amount of physical storage. Some virtual tape libraries have software that monitors the disk capacity that is actually used. When it reaches a user-defined threshold, it automatically alerts the administrator that additional storage is needed.
This real-time allocation function eliminates the need to overbuy capacity, saving significant power and floor space. It also gives you tremendous flexibility by allowing you to create virtual cartridge schemas without requiring the physical storage to be in place. Other value-added applications that use storage are free to use capacity from the same disk pool without requiring users to make any storage allocation decisions.
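The threshold alert described above amounts to a simple comparison of used versus physical capacity, independent of how much has been provisioned to virtual cartridges. A sketch, with the numbers and the alert mechanism invented for illustration:

```python
# Sketch of a thin-provisioning capacity check. The figures and the
# alerting approach are illustrative, not from any particular VTL product.

def check_capacity(used_gb: float, physical_gb: float, threshold: float = 0.8):
    """Return an alert message when used capacity crosses the threshold, else None."""
    utilization = used_gb / physical_gb
    if utilization >= threshold:
        return f"ALERT: {utilization:.0%} of physical capacity used; add storage"
    return None

provisioned_gb = 50_000   # total size promised to all virtual cartridges
physical_gb = 10_000      # real disk behind the shared pool
used_gb = 8_500           # blocks actually written so far

# Only written blocks consume physical capacity; the 50TB of virtual
# cartridges can exist long before 50TB of disk does.
print(check_capacity(used_gb, physical_gb))
```

The point of the design is in the gap between `provisioned_gb` and `physical_gb`: capacity is purchased against actual usage and the alert threshold, not against the cartridge schema you defined up front.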