SAN JOSE, Calif.—Necessity, practicality and hard-nosed business principles are defining utility computing for the pioneering organizations that are actually delivering computing capacity on demand to operating units.
A handful of these organizations gathered here at International Data Corp.'s IDC Enterprise Forum on Thursday to discuss the capabilities and implementation options for on-demand computing.
The common theme among the organizations that told their stories at the forum was not that they set out with the clear intention of implementing a utility computing program, but that they ended up solving their business problems using the principles of utility computing—with shared, virtualized resources that clients paid for according to what they used.
The organizations that presented included Automated Data Processing Inc., American International Group Inc., eBay Inc. and the chief information officer of a Florida state attorney's office.
For global Internet auction portal eBay, an on-demand IT model was essential for the company to keep up with the explosive growth in the number of registered users and the number of auctions that its system manages every day, said Mark Hydar, enterprise systems manager at eBay, based in San Jose, Calif.
"We got to where we did about $1 billion, and we asked what are we going to do to get to $10 billion," Hydar said. The utility computing model was a natural approach for eBay because it "gives us the flexibility, the scalability that we wanted to be able to do that without increasing our headcount" in a way that would be anywhere near proportional to the company's business growth, he said.
He noted that eBay's business quadrupled in the three years after its growth started taking off in 2000, and that was before eBay started its aggressive expansion in international markets.
Utility computing allows eBay to readily add computing resources, and to shift application resources around as different business sectors grow in various parts of the world, he said. On-demand computing fits "our business needs, which are always availability, speed and economics—which means better, faster, cheaper."
Utility computing enabled the state attorney's office for the 15th Judicial Circuit in West Palm Beach, Fla., to provide computing services for a wide range of law enforcement and county government offices.
"Our office, which was just a prosecutor's office, became a service bureau for 37 law enforcement offices in southern Florida," said Dan Zinn, CIO for the state attorney's office. His office also provides computer resources for 67 social service offices, courts and county business departments, he said.
He was able to do this by using standard hardware, operating systems, databases and Internet access to give these diverse offices access to computing power, he said. The on-demand model allowed Zinn's office to "replace five disparate systems with one system."
Winning Management Support
The key to making it work, Zinn said, is to examine each client's needs and decide "where does on-demand computing make sense." The basic policy Zinn followed was to "identify the points of information need and availability, communicate that information to the stakeholders, and let the stakeholders generate the demand."
Zinn's office also was able to win the trust of separate law enforcement agencies to the point that they were willing to support the development of consolidated databases to assist with organized crime investigations by tracking cases, leads and suspects.
One of the key advantages was that utility computing allows Zinn's office to readily move computing resources around to meet changing demands, he said. "If we need to load balance, we can move those products around," he said.
Another factor in successfully implementing utility computing was determining which services can be outsourced and which should remain in-house, Zinn said. Services that usually can be outsourced are those that require costly professional services, such as application programming, database administration, and complex application migrations and upgrades, he said.
Government agencies, he noted, usually don't have the funds to retain highly skilled programmers or data administrators, so it's best to outsource these services.
Services that shouldn't be outsourced include system security, customer service and application architecture provisioning, he said.
ADP Inc. is implementing utility computing as a way to consolidate computing resources and to "break down the walls" that exist within a large global computer services company that has a "very siloed organizational and operational structure," said Randy Terbush, vice president and chief technology officer of infrastructure architecture and strategy at ADP.
ADP's definition of on demand involves managing computing resources to gain "the ability to repurpose computing resources based on changing demands of carefully measured business processes," Terbush said. Given ADP's business structure, this has to be a gradual process to win management support in an organization that has many separate operating units in a decentralized structure.
The company provides a wide array of services, including 62 employer services, 21 brokerage services, 11 dealer services and five claims services spread over the United States, Europe, Asia, Australia and Brazil, he said.
Wide use of the Internet is helping break down the organizational silos and support the move to utility computing, he said. But it has also forced the company's IT groups to confront a "cleanup and integration challenge" to bring existing applications into line with the utility computing vision.
But before doing this, IT and company management have to know "the cost of repositioning those services and the business value of repositioning them," Terbush said.
Jon S. Stumpf, vice president and senior technology officer at New York-based American International Group, an insurance and financial services company, echoed the views of other IT executives that successfully implementing utility computing isn't a matter of developing new technology.
"Companies today already have the technology required to offer IT as a utility," Stumpf said. But most companies don't implement utility computing because they don't have their business processes documented or defined well enough to support it, he said.
For utility computing to work, the IT organization has to be able to react to clients' demands nimbly, Stumpf said. "The time to deliver [processing] capacity is short—it must be near-real time or in real time" after it's requested, or the potential benefits will quickly be lost, he said.
The processing capacity has to be delivered in discrete units that are isolated from other resource consumption and won't affect other resource increments, Stumpf said. When a business unit no longer needs that processing capacity, it should be returned in the same increments and in the same time frame in which it was delivered, he said.
Another essential factor, Stumpf said, is that capacity has to be readily measured in terms of terabytes of storage or number of processors. That way, it can more easily be priced, so business units can budget and pay for the services rendered.
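The metering-and-chargeback model Stumpf describes can be sketched in a few lines of code. The resource names, rates and usage figures below are hypothetical illustrations, not numbers from AIG; the point is simply that once capacity is measured in discrete units, pricing and budgeting become straightforward arithmetic.

```python
# Hypothetical chargeback sketch: capacity is metered in discrete units
# (terabytes of storage, number of processors) and billed at a flat
# per-unit monthly rate, so business units can budget for what they use.

RATES = {                # assumed example rates, in dollars per unit per month
    "storage_tb": 150,
    "processors": 400,
}

def monthly_chargeback(usage):
    """Return the monthly bill for a business unit's metered resource usage."""
    return sum(RATES[resource] * units for resource, units in usage.items())

# Example: a business unit holding 12 TB of storage and 8 processors
claims_unit = {"storage_tb": 12, "processors": 8}
print(monthly_chargeback(claims_unit))  # 12*150 + 8*400 = 5000
```

Because usage is tallied per unit, returning capacity in the same increments it was delivered (as Stumpf recommends) simply removes those units from the next month's bill.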