Enterprise IT managers are becoming more sophisticated planners and negotiators when it comes to implementing utility computing systems capable of responding in a cost-effective way to dynamic business conditions, according to a report by market research firm The Yankee Group.
The report on the TCO (total cost of ownership) for utility computing says IT managers need to obtain better analytic tools to support purchasing decisions in a dynamic business environment.
Since the dawn of commercial IT, corporate managers have built new information-processing applications on the assumption that they would run unchanged for as long as the business exists, said Andy Efstathiou, a business and IT services analyst at The Yankee Group and the report's author.
But today's IT managers must be able to plan for IT systems that can adjust to changing business conditions, Efstathiou said. This means they will have to account for changes in the "duration, volumes and prices in business processes" that are automated by IT systems, he said.
This is particularly true since corporations want their IT organizations to evolve from cost centers to a “profit or service center utility,” Efstathiou said.
Utility computing has the potential to allow IT managers to substantially reduce the cost of providing information resources to support steadily changing business conditions, he observed.
But most utility computing services on the market are just starting to make a transition to a true on-demand model, where resource consumption can rapidly adjust to changing business conditions, he said.
Part of the problem is that each vendor defines utility computing in a way that conforms to its own marketing strategy. Efstathiou defines it as a way to dynamically provision IT resources as on-demand services. But the industry hasn't yet made utility computing resilient enough to respond to dynamic customer demands.
Efstathiou likened it to ordering a drink at a bar. “People understand how to pay by the drink. But nobody knows what the definition of a drink is,” he said.
New TCO Approach
For example, server and storage vendors will increase capacity on demand. But most don't offer to decrease capacity to lower costs when business conditions change, according to Efstathiou. And the capacity of hosted applications often can be increased or decreased only within specific time periods, not in response to dynamic business conditions.
Efstathiou estimates that it will be 10 years before the computer industry develops the utility computing model to the greatest degree of TCO efficiency and market acceptance.
But in the meantime, IT needs to change the way it analyzes TCO for utility computing. The traditional assumptions won't work, Efstathiou said.
The Yankee Group recommends that enterprises adopt a process-oriented approach to TCO analysis, saying today's IT is able to deliver business value down to the process level.
“New technologies such as Web services and utility computing are making it possible to refine TCO analysis and deliver IT services down to the process level,” according to the report. To get the best TCO, enterprises need to work with multiple sources for services and equipment.
Efstathiou is also recommending that enterprises carefully integrate their business process planning road map with IT's planning, so the organization will be able to implement utility computing systems at the most opportune times.
This will be important, he said, because technology vendors will have to change their pricing strategy to respond to rapidly changing industry conditions—or get out of the market.
Right now, IBM and Hewlett-Packard Co. are the leaders in supporting utility computing systems, Efstathiou said. While Sun Microsystems Inc. initially was an equally strong player in the field, the company has lost some of its momentum in the area due to distraction by persistent business-growth problems, he said.
Along with IBM/Tivoli and HP on the software side, Veritas Inc., Computer Associates International Inc., Microsoft Corp. and Tibco Software Inc. are among the vendors that are well-positioned to remain major players in the field, he added.
Eric Stouffer, program director for IBM's on-demand solutions in Austin, Texas, said he believes that the "work is well on its way to create the flexible infrastructure" that enterprises will need to implement utility computing.
In fact, Stouffer suggested that building a more flexible infrastructure is an important general goal in itself because it can provide a lot of short-term benefits.
"There is still a lot of learning to be done" about the most cost-efficient way to build a utility computing infrastructure, Stouffer said. But he said enterprise customers can work with IBM today to acquire server and storage capacity on demand. For example, customers can order fully configured and populated blade servers and Shark storage units on demand, paying for the additional capacity as they actually deploy it, he said.
"It's not that they have created the total utility computing infrastructure, but they are working on it a little at a time," Stouffer said.
“It might take five to 10 years for utility computing to be a broadly accepted way of acquiring computer capacity, storage capacity and software services,” Stouffer said, adding that he hopes an estimate of a full 10 years would prove pessimistic.
Mobil Travel Guide
A number of companies, including Park Ridge, Ill.-based Mobil Travel Guide, are already using IBM's on-demand service to reduce costs and build more flexibility into their IT infrastructure.
Mobil Travel Guide is working with IBM's on-demand e-business services to run its Web operations entirely on Linux-based services that are hosted and managed by IBM.
Mobil Travel Guide pays only for the processing, storage and networking capacity it needs and can scale its virtual infrastructure to meet demand spikes. Mobil estimates that it will save about 25 percent in overall maintenance and software costs from the on-demand service.
By driving down the cost of its IT infrastructure—in turn allowing it to develop new revenue streams—the company is now able to customize its database of hotels, restaurants and activities to deliver targeted travel recommendations and access to the lowest rates.
Utility computing, and in particular software as a service, is catching on because it allows enterprises to implement new business applications “at a much lower risk, at a much lower cost and therefore allows them to achieve a much higher ROI [return on investment]” far sooner than if they had built a custom system, said Stephen Savignano, CEO of Ketera Technologies Inc. of Santa Clara, Calif.
Ketera provides on-demand applications for enterprise spend analysis, sourcing, contract management, e-procurement and payment. Ketera is one of the partners that IBM works with in its own utility computing marketing program, Savignano said.
“I think having on-demand software-service offerings allows IT executives more choices” in their technology investment decisions, Savignano said. “They will be able to deploy their scarce internal resources into projects that are at the core of their business,” he said.
"In fact, these days, it doesn't make much sense for enterprises to build or implement custom applications for any business process that isn't at the core," he said.
"In our view, spend management and e-procurement are not among those business processes where unique innovation inside the company is going to provide competitive advantage," Savignano said. This is what gives Ketera a strong business opportunity to sell its on-demand service to enterprises, he said.