Utilities as models of business efficiency? I don't think so. I tend to associate utilities—especially electric companies—with bright yellow, extended-cab pickup trucks emblazoned with a Mass. Electric logo parked in front of coffee shops. That and promises of nuclear power too cheap to meter and some long-ago high-school physics class where I learned that about 30 percent of the power generated is lost getting it to the house. Then there was that business about electrocuting the elephant. I'll get back to that in a minute.
So computer vendors will have to forgive me for not getting as breathless as they do when they're talking about the new world of computing based on the utility model. IBM is the most vocal champion of this with its on-demand computing initiative.
Hewlett-Packard, meanwhile, claims to be the latest advocate, as well as the first proponent, of the concept. Recently, HP announced the name “Adaptive Enterprise” to package the idea. In both cases, users are urged to plug into the computing cloud not just to exchange data but also to tap into applications. A similar notion is behind Sun's N1 scheme. The vendors all assert that utility computing is stable, safe and efficient. While we can't claim first usage, this publication (then PC Week) carried an article on computing fabrics in 1998. That article was written by Erick and Linda von Schweber, who called themselves the Infomaniacs and were well-remembered at Comdex's Spencer Katt parties for their glittering robes and out-there predictions. When I talked to them earlier this month at their new digs in San Francisco, they were still well-grounded in fabrics, grids and the utility computing business.
While the fabrics are still being woven, in Infomaniac speak, the days when utility computing will be easy and safe are still about a year and a half away. That's not too far off for planning purposes, but it's not reality yet, either. According to Erick von Schweber, the payoff in utility computing will come when functionalities can be combined while the methodologies used to build those functions remain constant.
That view was echoed by John Jordan, principal in the office of the CTO for Cap Gemini Ernst & Young's Americas region. “We're suggesting clients work on architectural fundamentals so their code base, network, application semantics and other elements will be ready for Web services, grids or utilities,” said Jordan.
Of course, if you ask someone in the pay-as-you-go utility computing business, they will tell you that the subscription model, with far fewer fixed costs, rapid deployment and manageable expenses, is the wave of the computing future. George Kadifa, CEO of Corio, said the utility model “turns the IT cost structure from large fixed cost to a variable and predictable cost structure.”
And the people actually wrestling with the utility concept? “The funny thing about the term utility computing is that I don't know of anyone that has ever said they like their utility company,” said Darby Group President Carl Ashkin. One CIO, a year into an outsourced utility computing project and requesting not to be identified, told me, “We were running a pretty tight organization, and, perhaps as a result, we have experienced none—repeat, none—of the efficiencies promised from centralization.”
Another CIO, also asking to remain anonymous, said he is not ready to start looking at utility computing until additional safeguards are in place. “We don't discuss utility computing here yet because utility computing implies shared-CPU computing,” the CIO said. “Co-processing with strangers is still too premature a technology.”
Which gets me back to the elephant. Back at the turn of the 20th century, Thomas Edison and George Westinghouse battled for control of the U.S. electrical infrastructure. Edison promoted his direct-current technology as safer than Westinghouse's high-voltage alternating current. Overcome with competitive fever and apparently losing control of his considerable senses, Edison took to holding public demonstrations of the inherent dangers of AC by electrocuting animals. A grim, final demonstration came in 1903, when Edison's company zapped a 3-ton pachyderm named Topsy at Coney Island. AC won anyway.
Now, while I don't think data can down an elephant, I do think the utility vendors need to build their case for utility computing carefully, on testing and case examples, before their claims reach the level of data delivery too cheap to meter.
More Opinion from Eric Lundquist: