Last summer's major power blackout proved a wake-up call for IT managers to examine business continuity plans. Analysts say that some have taken advantage of new data replication solutions and technologies that can span geographically distinct locations to mitigate the effects of a blackout or other calamity.
Still, a year after the six-state outage, analysts and storage vendors suggest that a good number of CIOs remain reluctant to spend their budgets on the possibility of disaster.
A study by a joint U.S.-Canadian commission found that uncorrected voltage fluctuations, along with a failure to trim trees that ended up falling into power lines, were to blame for the Aug. 14, 2003, outage. Damages totaled roughly $10 billion, according to reports.
The blackout, and even more so the Sept. 11 terrorist attacks, brought home the importance of disaster recovery planning to some enterprises, analysts said.
“Our perspective with our client base is that they've developed a lot of interest [in disaster recovery],” said Maneesh Mehra, the Houston-based manager of PricewaterhouseCoopers' advisory services group.
“At a lot of organizations, systems operators, people plugged into the grid, received a wake-up call,” he said. “The infrastructure focus had not been there…what we're looking for are systemic, long-term changes in control room organizations.”
New York City's financial district is at the center of a “perfect storm” of market forces driving business continuity solutions, analysts said. Regulatory activity prompted by insider trading scandals, the threat of terrorism and, to a lesser extent, health care regulation are all factors, said Peter Gerr, an analyst with the Enterprise Strategy Group, in Milford, Mass. Still, the terrorist attack had a far greater effect on the city than the blackout did, he estimated.
Outside the bustle of Manhattan, however, corporate attitudes toward disaster recovery seem to be moving along more slowly.
“I don't think [the blackout] did much,” said Brad Wenzel, who runs a small storage integration company in Stillwater, Minn., with a client list that includes Boeing Co. and Honeywell International Inc. He said customers have plans in place to off-load their data to another location but have not quite made the jump to ensuring it's available.
“If it doesn't hit close to home, it's not important,” Wenzel said. “That's human nature. The problem is high availability. Providing five nines of uptime is extremely expensive, and $100 million or so is a lot of money to spend. Nobody likes to spend money on [insurance].”
Still, the market is moving, albeit slowly.
In a Harris Poll commissioned by SunGard Availability Services, of Wayne, Pa., C-level executives at Fortune 1000 firms, or their designates, were polled on disaster recovery and information availability. About 60 percent of those polled said they plan to increase spending on information availability, and those who plan to increase spending expect an increase of 25.5 percent in 2004.
PricewaterhouseCoopers' Mehra said the increased spending is an indication that IT is being brought to the table in making data availability decisions, a strategy he's encouraging his clients to pursue.
Replication Is the Hot New Continuity Topic
Specifically, the experiences of the terrorist attacks and the blackout have cast a new light on remote replication services, Gerr said. Previously, a customer wanting to invest in remote replication had to choose among three top vendors: EMC Corp., Hitachi Ltd. and IBM. Each product worked only on its own infrastructure, and licenses were needed for the equipment on both ends of the intervening data lines.
“Now, a combination of new technologies has cracked the golden castle of remote replication and made disaster recovery not the black art it used to be,” Gerr said.
Companies like XOsoft Inc. and Topio Inc. provide software that allows companies to mix and match virtually any software and hardware combination, he said, through the use of agents. Smaller businesses also have the option of going with products like NSI Software Inc.'s Double-Take.
In addition, some companies are choosing to back up their data at a second remote site that's geographically separate, said Mike Marchi, senior director of marketing for ILM, compliance and data protection solutions at Network Appliance Inc., of Sunnyvale, Calif. Marchi said some backup facilities in New Jersey, the traditional location for New York firms to back up their data, are now being supplemented with tertiary sites in Minnesota or Arizona.
However, synchronous mirroring of data is cost-effective mostly within a range of about 60 miles, analysts said.
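The distance limit stems from round-trip latency: in synchronous mirroring, the primary array cannot acknowledge a write until the remote copy confirms it, so every write pays the full round trip to the mirror site. A rough, illustrative back-of-the-envelope calculation (assuming signals travel at roughly two-thirds the speed of light in optical fiber, and ignoring switching and protocol overhead) shows how quickly that penalty grows with distance:

```python
# Illustrative estimate of the per-write delay synchronous mirroring adds.
# Assumes ~2/3 the speed of light in optical fiber and ignores equipment
# and protocol overhead, so real-world latency would be higher.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # roughly 2/3 of c, expressed in km per ms
KM_PER_MILE = 1.609

def sync_write_penalty_ms(distance_km: float) -> float:
    """Round-trip propagation delay the primary waits on for each write."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

for miles in (60, 300, 1000):
    penalty = sync_write_penalty_ms(miles * KM_PER_MILE)
    print(f"{miles:5d} miles: about {penalty:.2f} ms added per synchronous write")
```

At 60 miles the added delay is under a millisecond, tolerable for most transactional workloads; at hundreds of miles it reaches many milliseconds per write, which is why sites farther apart typically fall back to asynchronous replication instead.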
“What I find interesting is that in Gartner surveys, disaster recovery spending has not gone up,” said Donna Scott, a vice president in Gartner Inc.'s research division in Stamford, Conn. “They say spending budgets have not gone up, but I know they've gone up. Where they've gone up is replicating data over distance. That snapshot, that environment, becomes part of the production budget. It doesn't get put into the [disaster recovery] budget.”
Typically, the tertiary site houses a smaller, mission-critical store of data: what the most critical applications need to keep the business up and running. If the primary backup facility goes down, a business can hobble along on the data stored at the third site for a week or so, Marchi said, instead of losing all of the data permanently. While he wouldn't provide exact figures for the number of companies pursuing this strategy, Marchi said his company's replication software has a 40 percent attach rate.
In an informal survey Gartner's Scott conducted of the audience at a recent seminar, some 70 percent said they now outsource this storage function.
More storage customers are taking information off-site and not just replicating it to near-line storage, Marchi said.
“Most of the customers are doing it for the most important information, just for things that affect the business,” he said. “We do a lot of that. With e-mail, Microsoft Exchange, it's very important to have a third replica; a primary site for the near-line storage, and a second site for a set of directories.”
Other records stored on near-line storage include generic Word documents and even instant-messaging logs, he added.
Some firms expand the planning for continuity beyond data replication to include an entire redundant infrastructure.
This can be seen in the strategy of Paul Bell, manager of network architecture and engineering for a New York-based securities firm. The firm maintains a 500-seat facility in Queens, complete with equipment that is periodically powered on and tested in case of an emergency.
After the Sept. 11 attacks, he said, most of the company simply moved to the New York City borough and kept working. The Queens facility contains enough near-line storage to hold the company's data, Bell said. Tape backups with snapshot and monthly archive data are physically transported to another facility a few hours away.
The company's ongoing disaster recovery program is continuously updated, Bell said, and it is signed off at levels above even the CIO and CISO.
“Senior trading management have to sign off on it,” he said. “Their revenue stream depends on it, so these people have to be involved in the decision.”
Although some areas of the country have placed a lower priority on data availability and disaster recovery, the heart of the U.S. financial services industry has been tested twice and survived, analysts suggest. However, a true disaster would be a nationwide incident that would take the rest of the country's plans off the shelf, Gerr said.