New TCO Approach
For example, server and storage vendors will increase capacity on demand. But most don't offer to decrease capacity to lower costs when business conditions change, according to Efstathiou. And the capacity of hosted applications often can be increased or decreased only within specific time periods, not in response to dynamic business conditions.

Efstathiou estimates that it will be 10 years before the computer industry develops the utility computing model to the greatest degree of TCO efficiency and market acceptance. But in the meantime, IT needs to change the way it analyzes TCO for utility computing. The traditional assumptions won't work, Efstathiou said.

The Yankee Group recommends that enterprises adopt a process-oriented approach to TCO analysis, saying today's IT is able to deliver business value down to the process level. "New technologies such as Web services and utility computing are making it possible to refine TCO analysis and deliver IT services down to the process level," according to the report. To get the best TCO, enterprises need to work with multiple sources for services and equipment.

Efstathiou also recommends that enterprises carefully integrate their business-process planning roadmap with IT's planning, so the organization will be able to implement utility computing systems at the most opportune times. This will be important, he said, because technology vendors will have to change their pricing strategy to respond to rapidly changing industry conditions, or get out of the market.

Right now, IBM and Hewlett-Packard Co. are the leaders in supporting utility computing systems, Efstathiou said. While Sun Microsystems Inc. initially was an equally strong player in the field, the company has lost some of its momentum in the area, distracted by persistent business-growth problems, he said. Along with IBM/Tivoli and HP on the software side, Veritas Inc., Computer Associates International Inc., Microsoft Corp. and Tibco Software Inc.
are among the vendors that are well-positioned to remain major players in the field, he added.

Eric Stouffer, program director for IBM's on-demand solutions in Austin, Texas, said he believes that the "work is well on its way to create the flexible infrastructure" that enterprises will need to implement utility computing. In fact, Stouffer suggested that building a more flexible infrastructure is an important goal in itself because it can provide a lot of short-term benefits.

"There is still a lot of learning to be done" about the most cost-efficient way to build a utility computing infrastructure, Stouffer said. But he said enterprise customers can work with IBM today to acquire server and storage capacity on demand. For example, customers can order fully configured and populated blade servers and Shark storage units on demand, paying for the additional capacity as they actually deploy it, he said.

"It's not that they have created the total utility computing infrastructure, but they are working on it a little at a time," Stouffer said. "It might take five to 10 years for utility computing to be a broadly accepted way of acquiring computer capacity, storage capacity and software services," Stouffer said, adding that he hopes an estimate of a full 10 years would prove pessimistic.
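The pay-as-deployed model Stouffer describes lends itself to a back-of-the-envelope TCO comparison against traditional peak provisioning. The sketch below is purely illustrative: the function names, demand pattern and prices are invented for the example and do not reflect any vendor's actual rates.

```python
# Hypothetical illustration (all figures invented): comparing the cost of
# provisioning for peak demand up front vs. paying for capacity on demand.

def fixed_tco(peak_units: int, unit_cost: float,
              months: int, upkeep_per_unit: float) -> float:
    """Buy peak capacity on day one; pay monthly upkeep on all of it."""
    return peak_units * unit_cost + peak_units * upkeep_per_unit * months

def on_demand_tco(monthly_demand: list, unit_rate: float) -> float:
    """Pay a per-unit monthly rate only for capacity actually deployed."""
    return sum(units * unit_rate for units in monthly_demand)

# Demand ramps up, spikes for two months, then falls back -- the dynamic
# pattern that fixed capacity planning handles poorly.
demand = [10, 12, 15, 40, 40, 15, 12, 10, 10, 10, 10, 10]

fixed = fixed_tco(peak_units=40, unit_cost=1000.0,
                  months=12, upkeep_per_unit=20.0)
flexible = on_demand_tco(demand, unit_rate=60.0)

print(f"fixed (provision for peak):  ${fixed:,.0f}")     # $49,600
print(f"on demand (pay as deployed): ${flexible:,.0f}")  # $11,640
```

With these invented numbers the on-demand model wins easily because demand spikes briefly; a flat, always-at-peak workload would tip the comparison the other way, which is exactly why a per-process TCO analysis, rather than a one-size-fits-all assumption, matters.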