Data center managers and CTOs already know that system downtime can be very expensive for an enterprise, but they may not know the full extent of that expense when servers, networking and storage suffer a major outage.
New industry research from Emerson Network Power, released this week at the Uptime Institute Symposium in Santa Clara, Calif., reported that businesses lose an average of roughly $5,600 per minute in an outage. At that rate, losses exceed $300,000 per hour, which is not something to dismiss lightly.
The report, entitled “Understanding the Cost of Data Center Downtime: An Analysis of the Financial Impact of Infrastructure Vulnerability,” was based on a recent Ponemon Institute study, “Calculating the Cost of Data Center Outages.” The research analyzed costs at 41 data centers in varying industry segments; the data centers studied were a minimum of 2,500 square feet, so as to identify the true bottom-line costs of data center downtime.
Emerson used this study to analyze the direct, indirect and opportunity costs of data center outages, which encompass far more than lost customer sales. Factors include damage to mission-critical data, the impact of downtime on organizational productivity, legal and regulatory repercussions, and lost confidence and trust among key stakeholders.
Highlights include the following:
- The average cost of data center downtime across industries was approximately $5,600 per minute.
- The average reported incident length was 90 minutes, resulting in an average cost per incident of approximately $505,500.
- For a total data center outage, which had an average recovery time of 134 minutes, average costs were approximately $680,000.
- For a partial data center outage, which averaged 59 minutes in length, average costs were approximately $258,000.
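A quick consistency check on the report's figures: dividing each average per-incident cost by the average incident length recovers the implied per-minute rate for each outage type. A minimal sketch (figures from the report; the function name is my own):

```python
def per_minute_rate(total_cost: float, minutes: float) -> float:
    """Implied cost per minute of downtime for an outage of a given cost and length."""
    return total_cost / minutes

# Average incident: $505,500 over 90 minutes -> ~$5,617/min,
# matching the ~$5,600-per-minute cross-industry average
print(f"${per_minute_rate(505_500, 90):,.0f}/min")

# Total outage: $680,000 over 134 minutes -> ~$5,075/min
print(f"${per_minute_rate(680_000, 134):,.0f}/min")

# Partial outage: $258,000 over 59 minutes -> ~$4,373/min
print(f"${per_minute_rate(258_000, 59):,.0f}/min")
```

Note that total and partial outages imply somewhat lower per-minute rates than the headline average, so the per-minute figure is a cross-industry mean rather than a rate that applies uniformly to every outage type.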
Downtime can be even more costly for enterprises whose revenue models depend on the data center's ability to deliver IT and networking services to customers, such as telecommunications service providers and e-commerce companies. The report cited the highest cost of a single event at about $1 million (more than $11,000 per minute).
“With the increase in reliance on IT systems to support business-critical applications, a single downtime event now has the potential to significantly impact the profitability, and in extreme cases the viability, of an enterprise,” said Larry Ponemon, chairman and founder of the Ponemon Institute, a research center focused on privacy, data protection and information security policy.
A 2010 Ponemon Institute study, also commissioned by Emerson, surveyed more than 450 U.S.-based data center professionals and focused on the root causes and frequency of data center downtime.
Respondents experienced an average of 2.5 complete data center outages during the preceding two years, the report said. Partial data center outages, those limited to certain racks, occurred an average of 6.8 times in the same time frame. Device-level outages, those limited to individual servers, were the most frequent at an estimated 11.3.
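Combining those survey frequencies with the cost study's per-incident averages gives a rough sense of a typical respondent's two-year downtime exposure. This back-of-the-envelope combination is my own illustration, not a figure from either report, and it omits device-level outages, whose average cost the study does not break out:

```python
# Average outage counts per respondent over two years (2010 Ponemon survey)
complete_outages = 2.5   # full data center outages
partial_outages = 6.8    # rack-level (partial) outages

# Average cost per incident (Emerson/Ponemon cost study)
COST_COMPLETE = 680_000  # total data center outage
COST_PARTIAL = 258_000   # partial data center outage

# Expected two-year exposure, excluding device-level outages
exposure = complete_outages * COST_COMPLETE + partial_outages * COST_PARTIAL
print(f"${exposure:,.0f}")  # roughly $3.45 million over two years
```

Even as a crude estimate, this shows why the report frames downtime as a bottom-line issue rather than purely an operational one.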
Respondents of that survey most frequently cited uninterruptible power supply battery failure (65 percent), exceeding UPS capacity (53 percent), accidental emergency power off/human error (51 percent), and UPS equipment failure (49 percent) as the causes of unplanned data center outages.
The complete Emerson-Ponemon Institute report is available from Emerson Network Power.