Real-Time Storage Meets Growing Data Demands

 
 
By Gary Bolles  |  Posted 2002-08-19
In the face of information overload, companies have a critical need to improve the way they store and move data—or risk costly consequences.

Every day at the Aviation Weather Center in Kansas City, Mo., a team of 60 meteorologists sifts through a torrent of weather data from thousands of sources, such as satellites, radar ground stations, weather balloons, ships, pilots, even offshore buoys. The group's nine different forecast desks run the information through a variety of computers, including one of the world's fastest supercomputers, generating model after model of dense graphics detailing weather conditions covering two-thirds of the globe, from tornadoes in Kansas to floods in central Pakistan. All in all, says Clinton Wallace, an IT specialist at the center, AWC's two aging Hewlett-Packard K-class servers, which move all the data from one place to another, have to juggle 13 gigabytes of data a day—roughly equivalent to about four and a half million pages of text in the nondigital world.

When they were functioning, that is. The servers used to crash frequently. "We're just beating them to death," admits Wallace. "We're driving them at 100 miles an hour, all day long." Outages used to last up to an hour, leaving forecasters unable to generate advisories for thunderstorms, ice, turbulence and visibility for thousands of pilots crisscrossing the planet.
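For scale, here is a rough, hypothetical sketch of the arithmetic behind that 13-gigabyte comparison; the bytes-per-page figure is an assumption, not something from the article:

# Back-of-envelope conversion of 13 GB/day into printed pages,
# assuming roughly 3,000 bytes (about 500 words) of plain text per page.
BYTES_PER_GB = 10**9        # decimal gigabytes, as storage vendors count them
BYTES_PER_PAGE = 3_000      # assumption: ~3 KB of text per page
daily_bytes = 13 * BYTES_PER_GB
pages = daily_bytes / BYTES_PER_PAGE
print(f"{pages / 1e6:.1f} million pages per day")   # prints about 4.3 million

With a slightly smaller page-size assumption the result lands near the article's "four and a half million pages"; the point is simply that a modest-sounding daily gigabyte count translates into an enormous volume of raw text.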

Aviation Weather's data storms aren't exceptions in today's data-saturated world. With the amount of information to be warehoused in the digital economy growing at 30 percent a year, according to Gartner Inc., companies and organizations everywhere will need to create a better and cheaper way to handle the data glut, or risk costly—even life-threatening—information failures.
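To see what a 30 percent annual growth rate implies, here is a small illustrative calculation; the rate is Gartner's, but the time horizons are arbitrary:

import math
# At 30% annual growth, stored data roughly doubles every 2.6 years
# and grows almost 14-fold over a decade.
growth_rate = 0.30
doubling_years = math.log(2) / math.log(1 + growth_rate)
ten_year_factor = (1 + growth_rate) ** 10
print(f"doubling time: {doubling_years:.1f} years")      # ~2.6 years
print(f"growth over 10 years: {ten_year_factor:.1f}x")   # ~13.8x

In other words, an IT shop that stands still will be managing twice the data in well under three years.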

Companies "should have secure access to information at any time, over any distance, no matter where the information is kept, no matter the type of computing platform—all at the fastest possible speed," says Dan Tanner, a senior storage analyst for the Aberdeen Group, a research firm. "Thats the Holy Grail. That should be the strategic aim [of IT]. If you can do that, your business will run smoothly."

Storing and accessing critical business data wasn't always a worry. In the days of mainframe computers, storage was centralized, and users always knew where it was—on the big iron—even if they couldn't always get at it. But with the advent of cheaper and more accessible networked computers in the 1980s and 1990s, IT departments opted to phase out their companies' more reliable, enterprise-class storage systems. Cheap and accessible networked devices also let companies put computing and storage horsepower where they thought it belonged—in a workgroup, for instance, or a hosting company's data center. But the downside is complexity. "When you vastly increase storage, and you vastly complicate the network," says Tanner, "managing the storage and movement of data becomes steeply more difficult."

Storage Everywhere

Strategic Storage (chart)
Corporate strategy should always drive every major IT architecture decision. The infographic suggests a way to think about how a particular storage design affects strategy, in a company where increased nimbleness and employee productivity are essential to delivering value to customers. To ensure rapid access to data, IT must build a networking infrastructure efficient enough to deliver the necessary speed. The data must also be widely available, which typically means a strategy that blends centralization, reliability and scalability. These factors, in turn, are supported by an increasing trend toward standardization, which allows IT to "virtualize" storage hardware, making it possible to save and access data wherever necessary in the networking infrastructure.
Part of the problem stems from the fact that distributed computers merged all the major pieces of the computing puzzle into one box, coupling storage to other components, from applications to operating systems to processing. These puzzle pieces aren't easily uncoupled, making it difficult to isolate storage and maximize its effectiveness. Says Tanner: "Now the data center, instead of a mainframe, might have a bunch of open systems. But these computers won't share the information very well if they all have their own storage."

And that complexity can be costly. For IT, managing distributed storage means heavy spending on hardware and staff time. According to Mike Kahn, chairman and cofounder of the Clipper Group, a Wellesley, Mass.-based consulting firm, the cost of labor can be as high as seven to eight times the cost of the hardware itself. That's why most IT shops, especially amid the current economic downturn, are trying to do more with less. The challenge, says Kahn, is: "How can I manage two to four times more storage, and do it better?"
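A hypothetical sketch of what that labor multiplier means for total cost; the dollar figure is invented for illustration, and only the seven-to-eight-times ratio comes from Kahn:

# Illustrative total-cost-of-storage calculation.
hardware_cost = 100_000        # assumed annual hardware spend (hypothetical)
labor_multiplier = 7.5         # midpoint of Kahn's 7x to 8x labor-to-hardware range
labor_cost = hardware_cost * labor_multiplier
total = hardware_cost + labor_cost
print(f"labor share of total storage cost: {labor_cost / total:.0%}")   # ~88%

If nearly 90 cents of every storage dollar goes to people rather than hardware, the leverage in "doing more with less" lies in simplifying management, not in buying cheaper disks.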

Also contributing to the storage problem: Few business executives have any interest in the technology their company is using to store data. Few understand the link between a company's storage strategy and its ability to use data, in real time, in the course of cutting costs, boosting profits—or predicting hurricanes. "The most critical thing we need for forecast operations is data," says the Aviation Weather Center's Wallace, and that means "having your data available when people need it."

But the trade-off between cost and complexity doesn't have to be a bad one. In an effort to keep costs down while achieving the reliability of the mainframes of old, many companies are rethinking their digital storage networks.



 
 
 
 
Gary A. Bolles is the Editorial Director for Ziff Davis Media's Custom Conference Group. He is responsible for directing the group's editorial efforts, ensuring the quality of the content it delivers, and moderating and speaking at client events. A frequent lecturer and keynote speaker on a variety of technology topics, he has hosted more than 50 events in the past year alone.

Bolles was the founding Editor-in-Chief of Interactive Week, developing its unique vision; the founding Editorial Director of Sm@rt Reseller magazine, creating the publication from initial research; and the founding Editorial Director of Yahoo! Internet Life, managing its successful launch. Bolles was also the Editor-in-Chief of Network Computing Magazine, and for one year was the host of 'Working the Web' for TechTV, covering a wide variety of technology-related topics. Until recently, he was a contributing editor to CIO Insight, writing on a broad range of technology subjects, and assisting in the coordination of the publication's research efforts.

Bolles is the former Chief Operating Officer of Evolve Software, Inc., and the former VP of Marketing for Network Products Corporation. He has served as a marketing consultant to a variety of organizations, and has advised a number of software startup companies in arenas such as online marketing and data mining.
 
 
 
 
 
 
 
