Backup, Disaster Recovery System Cost Justification: 10 Convincing Tips

By Chris Preimesberger  |  Posted 2012-07-01

Your Company's Data Is Valuable

Data is the essence of any organization. However, backup and DR/COOP (continuity of operations) investments can be difficult to sell internally because their benefits aren't readily apparent on the bottom line. As with any insurance policy, these technologies matter most when the unpredictable happens: a hurricane, tornado or catastrophic fire strikes; a building floods; or human error wipes out data. IT administrators should show executives how often such disaster scenarios actually occur. This can be done by calculating the cost of the DR systems needed to protect the data alongside the potential savings over time compared with inaction.

Determine the Cost of Downtime

Work out what an hour of downtime costs your business in lost revenue. If you don't have the time or expertise to run an internal study, use published industry-sector figures from analyst firms such as Gartner or IDC. Then ask whether your business can afford not to have a solid DR strategy: a DR plan in place is almost always cheaper than an extended outage.
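
The comparison above can be sketched as simple arithmetic. All figures here are illustrative assumptions; substitute your own internal numbers or published analyst-firm estimates.

```python
# Back-of-the-envelope downtime cost vs. DR investment.
# All dollar figures and outage hours below are assumed examples.

def downtime_cost(cost_per_hour, expected_outage_hours_per_year):
    """Expected annual loss from outages without a DR plan."""
    return cost_per_hour * expected_outage_hours_per_year

def dr_break_even_years(dr_investment, annual_dr_cost, annual_loss_avoided):
    """Years until the up-front DR investment pays for itself."""
    net_savings = annual_loss_avoided - annual_dr_cost
    if net_savings <= 0:
        return float("inf")  # DR costs more than the losses it prevents
    return dr_investment / net_savings

# Assumed example: $50K/hour downtime, 8 hours of outage exposure per year,
# $150K up-front DR spend plus $40K/year to operate.
loss = downtime_cost(50_000, 8)   # $400,000/year at risk
years = dr_break_even_years(150_000, 40_000, loss)
print(f"Annual loss avoided: ${loss:,}")
print(f"Break-even in {years:.2f} years")
```

With these assumed inputs, the DR spend pays for itself in well under a year; the same calculation with your own numbers is the core of the internal pitch.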

Understand the Real Need for Tape

Don't disregard storage tape: it may no longer get the job done for recovery, but it still works well for archiving. Enhancing data protection isn't about eliminating tape. For daily protection, companies should back up to a source-side deduplication system with automated replication to off-site storage; the target site can then automate output to encrypted physical tape according to your archive policy. Archiving to tape at the DR target eliminates tape movement, reduces media requirements, and lets the organization cut archive costs while automating its archive policy.

Deduplication Offers Great Savings

Deduplication systems can significantly improve recovery-time objectives (RTOs) and cut backup and replication costs for most applications. To increase savings further, add a dedupe appliance off-site and electronically vault the space-optimized data across the wide-area network (WAN) to your DR site. Electronically vaulting deduplicated data eliminates tape transport and recall costs, and can remove 80 to 90 percent of array-based replication licenses. Because the deduped data is already on disk, recovery should be fast enough for the bulk of your applications.
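
A rough sketch of what deduplication does to replication volume and transfer time, assuming an illustrative daily change rate, link speed, and the 10x-20x reduction ratios commonly claimed by dedupe vendors:

```python
# Effect of deduplication on WAN replication volume and time.
# Daily change rate, dedupe ratios, and link speed are assumed examples.

def replicated_gb(daily_change_gb, dedupe_ratio):
    """Data actually sent over the WAN after deduplication."""
    return daily_change_gb / dedupe_ratio

def transfer_hours(gb, wan_mbps):
    """Hours to replicate a given volume over a WAN link."""
    return (gb * 8 * 1024) / (wan_mbps * 3600)  # GB -> megabits / link rate

daily_change = 500          # GB changed per day (assumed)
for ratio in (1, 10, 20):   # 1 = no dedupe; 10-20x is a common vendor claim
    sent = replicated_gb(daily_change, ratio)
    hrs = transfer_hours(sent, 100)  # 100 Mbps link (assumed)
    print(f"{ratio:>2}x dedupe: send {sent:6.1f} GB, ~{hrs:.1f} h at 100 Mbps")
```

At these assumed rates, a full-volume replication that would take half a day fits into roughly an hour once deduplicated, which is where the replication-license and bandwidth savings come from.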

Optimize WAN Bandwidth

Measure your WAN bandwidth against peak loads to reduce risk. Pay particular attention to critical batch-processing periods, because these are the times of highest risk and your WAN needs to be up to the challenge. Bandwidth may be the most expensive part of a DR plan, but it's necessary for handling data changes during high-risk, peak-load events. Use deduplication and encryption wherever possible to reduce cost and risk.
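
Sizing the link means working backward from how much changed data must replicate within a window, with headroom for batch peaks. The change volume, window, and peak multiplier below are assumed examples:

```python
# Sizing WAN bandwidth for peak replication load. Figures are assumptions.

def required_mbps(change_gb, window_hours, peak_multiplier=1.0):
    """Minimum link speed to replicate change_gb within window_hours."""
    megabits = change_gb * 8 * 1024          # GB -> megabits
    return megabits * peak_multiplier / (window_hours * 3600)

normal = required_mbps(200, 8)                    # typical day (assumed)
batch = required_mbps(200, 8, peak_multiplier=3)  # month-end batch run (assumed)
print(f"Normal day: {normal:.0f} Mbps; batch peak: {batch:.0f} Mbps")
```

The point of the multiplier is that a link sized only for the average day will fall behind exactly when the risk is highest.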

Make Sure Your DR Site Can Handle Peak Workloads

The DR site must be able to handle peak workloads, even if only in a degraded fashion. Mission-critical applications need to be available in a timeframe that supports continuity of operations with minimal impact on the business. If you're looking to a cloud provider for this function, make sure your contract spells out all your requirements in writing, with penalties for nonconformance.

Shorten Recovery Times, Backup Windows

Use a formula to predict your recovery times; this is especially helpful for companies considering a step toward better data protection. Organizations using a traditional, tape-based data protection approach should keep in mind that recovering data typically takes twice as long as backing it up. By contrast, many continuous data protection (CDP) and snapshot-based solutions can recover data in seconds. Most snapshot solutions can also act as the backup source for the applications they protect, which in some cases moves the backup stream off the server and the network and onto the storage-area network (SAN), where it belongs. Server-less, LAN-free backup is a good thing.
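
The tape rule of thumb above (recovery takes roughly twice as long as backup) can be expressed directly. The data volume and stream throughput are assumed examples, not measurements:

```python
# Rule-of-thumb recovery estimate for tape-based backup: recovery takes
# roughly twice as long as the backup itself. Throughput is an assumption.

def backup_hours(data_tb, throughput_mb_s):
    """Hours to stream data_tb terabytes at throughput_mb_s MB/s."""
    return (data_tb * 1024 * 1024) / (throughput_mb_s * 3600)

def tape_recovery_hours(data_tb, throughput_mb_s):
    """Recovery ~= 2x backup time for traditional tape (per the rule above)."""
    return 2 * backup_hours(data_tb, throughput_mb_s)

b = backup_hours(10, 150)          # 10 TB at 150 MB/s (assumed)
r = tape_recovery_hours(10, 150)
print(f"Backup: {b:.1f} h, tape recovery: {r:.1f} h")
```

Running this for your own volumes makes the gap concrete: a restore measured in days under the tape rule versus seconds for a snapshot or CDP mount.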

Plan for Fast, Intelligent Recovery

Disaster recovery and business continuity can involve many complex steps that require unique skills and in-depth knowledge of the operational aspects of many different applications and data sets. Application servers and databases have many interdependencies, so they must be recovered in the correct sequence. Take the time to document and automate recovery tasks as a series of steps that retrieve crucial data and bring systems back online in the right order. Done right, this gives your organization an intelligent, efficient approach to recovery, even if the individuals who manage the applications are no longer available.
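
One way to keep that recovery sequence documented and automated is a topological sort over a hand-maintained dependency map. The systems and dependencies below are hypothetical examples:

```python
# Sketch of dependency-ordered recovery using a topological sort, assuming a
# hand-maintained map of which systems must be up before each application.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency map: each key lists what it depends on.
dependencies = {
    "database": ["storage", "network"],
    "app_server": ["database"],
    "web_frontend": ["app_server"],
    "storage": [],
    "network": [],
}

# static_order() yields each system only after all of its dependencies.
order = list(TopologicalSorter(dependencies).static_order())
print("Recovery order:", " -> ".join(order))
```

Keeping the map in version control means the recovery sequence survives even if the people who built it move on, which is exactly the failure mode the paragraph above warns about.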

Snapshots, Continuous Data Protection Eliminate Data Loss

Snapshots and continuous data protection (CDP) systems let your business eliminate the lengthy traditional backup process and cut recovery time to a matter of seconds. Zero data loss and rapid recovery should be everyone's goal, and the systems to deliver them are available today at the right price.

Virtualizing Storage, Servers Reduces Infrastructure Costs

Implementing server virtualization enables server consolidation. Combining it with storage virtualization makes recovery far simpler, turns disk storage into a commodity, allows data replication between unlike storage arrays, supports application-transparent technology refreshes with zero downtime, and can dramatically reduce infrastructure costs at both the primary and DR data centers. An intelligent abstraction layer for your storage (which is where your data lives) is also the gateway to the cloud.
