Widespread use of the Internet is helping break down organizational silos and supporting the move to utility computing, he said. But it has also forced the company's IT groups to confront a "cleanup and integration challenge" to bring existing applications into line with the utility computing vision.
But before doing this, IT and company management have to know "the cost of repositioning those services and the business value of repositioning them," Terbush said.
Jon S. Stumpf, vice president and senior technology officer at New York-based American International Group, an insurance and financial services company, echoed the views of other IT executives that successfully implementing utility computing isn't a matter of developing new technology.
"Companies today already have the technology required to offer IT as a utility," Stumpf said. But many companies don't implement utility computing because they don't have their business processes documented or defined well enough to support it, he said.
For utility computing to work, the IT organization has to be able to react nimbly to clients' demands, Stumpf said. "The time to deliver [processing] capacity is short—it must be near-real time or in real time" after it's requested, or the potential benefits will quickly be lost, he said.
The processing capacity has to come in discrete units that are isolated from other resource consumption and won't affect other resource increments, Stumpf said. When a business unit no longer needs that processing capacity, it should be returned in the same increments and in the same time frame in which it was delivered, he said.
Another essential factor, Stumpf said, is that capacity has to be readily measured in terms of terabytes of storage or number of processors. That way, it can more easily be priced, so business units can budget and pay for the services rendered.
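The model Stumpf describes—capacity allocated in discrete, isolated increments, returned the same way, and metered in measurable units so business units can be billed—can be sketched as a simple chargeback ledger. This is an illustrative assumption, not AIG's actual system; all class names, resource types, and prices here are hypothetical.

```python
# Hypothetical chargeback ledger for the utility computing model described
# above: discrete capacity increments, metered per unit so business units
# can budget and pay for services rendered. Names and rates are illustrative.

PRICE_PER_UNIT_HOUR = {"storage_tb": 0.50, "processor": 2.00}  # assumed rates

class CapacityLedger:
    def __init__(self):
        # (business_unit, resource) -> total unit-hours consumed
        self.usage_hours = {}

    def allocate(self, unit, resource, increments, hours):
        # Record discrete increments for a fixed period; each allocation
        # is tracked separately, so it doesn't affect other increments.
        key = (unit, resource)
        self.usage_hours[key] = self.usage_hours.get(key, 0) + increments * hours

    def bill(self, unit):
        # Price usage per measured unit (TB of storage, processor count),
        # so the business unit can budget for what it actually consumed.
        return sum(hours * PRICE_PER_UNIT_HOUR[res]
                   for (bu, res), hours in self.usage_hours.items()
                   if bu == unit)

ledger = CapacityLedger()
ledger.allocate("claims", "processor", increments=4, hours=10)   # 4 CPUs, 10 h
ledger.allocate("claims", "storage_tb", increments=2, hours=10)  # 2 TB, 10 h
print(ledger.bill("claims"))  # 4*10*2.00 + 2*10*0.50 = 90.0
```

Returning capacity would simply subtract the same increments from the ledger, mirroring the symmetry Stumpf calls for.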