Utility Model Is at Work Today

By Peter Coffee  |  Posted 2004-02-16

When medical imagery data created explosive growth in demand for digital storage, Project Manager Midori Kawahara, of the British Columbia Cancer Agency, needed prompt and reliable access to those files for doctors and patients throughout that Canadian province. In the subsequent year and a half, 5 terabytes of online capacity—about 1.7 terabytes of data with double-redundant replication—has expanded to more than 14 terabytes and is growing at 50GB per week.
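The arithmetic behind those figures checks out: storing roughly 1.7 terabytes of unique data with double-redundant replication (three copies in all) accounts for the 5-terabyte online footprint. A quick back-of-the-envelope sketch, assuming the decimal-gigabyte convention storage vendors use:

```python
# Capacity figures from the article, checked with triple-copy
# ("double-redundant") replication arithmetic.
TB = 1000  # gigabytes per terabyte (decimal convention, an assumption)

raw_data_gb = 1.7 * TB          # unique data held by the agency
copies = 3                      # original plus two redundant replicas
online_gb = raw_data_gb * copies
print(online_gb / TB)           # about 5.1 TB, matching the ~5 TB cited

# Projecting the stated growth of 50GB per week from today's 14TB:
current_gb = 14 * TB
one_year_gb = current_gb + 52 * 50
print(one_year_gb / TB)         # about 16.6 TB a year out
```

At that pace, incremental capacity adjustment is not a luxury but a weekly necessity.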

But the storage system, Kawahara said, is "invisible."

The agency relies on a utility storage provider, Vancouver-based Bycast Inc., to serve the storage needs of the agency's four major centers.

Bycast's is one of many utility computing offerings that are not tomorrow's new thing but are already meeting enterprise needs—not with the modified supercomputing applications that readily move to grids or other utility platforms but at the foundations of mainstream IT functions.

Printing, for example, is an utterly boring component of mainstream IT that frustrates administrators and users with cumbersome and costly redundancy and management of consumables.

IBM is attempting to recast enterprise thinking in this area with Output Management Services, a program that unifies printing, copying, scanning and faxing equipment and support under a single monthly bill. More than a hardware management program, the offering aims to identify business process changes that can reduce the need for redundant steps, such as printing a form that will then have to be scanned after completion.

The virtualization of approaches such as Bycast's and the simplifications offered by IBM are preludes to the final utility step of automation.

Scripts and standard notations, such as Data Center Markup Language (www.dcml.org), enable companies such as MetiLinx Inc. to use hands-off, multiattribute measurements of system workload as triggers for planned responses.

"When one of your servers goes down," said MetiLinx Executive Director Larry Ketchersid, "the fire drill that follows is all manual. If you have a utility environment, a server gets provisioned, and you can figure out what's wrong more at your leisure."
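The idea Ketchersid describes can be sketched in a few lines: multiattribute measurements are compared against thresholds, and a breach triggers a planned response instead of a fire drill. The metric names, threshold values and the provision_server() hook below are illustrative assumptions, not MetiLinx's or DCML's actual interfaces:

```python
# Hypothetical thresholds for a multiattribute workload trigger.
THRESHOLDS = {"cpu_pct": 90.0, "mem_pct": 85.0, "error_rate": 0.05}

def needs_failover(metrics: dict) -> bool:
    """Trigger when any measured attribute breaches its threshold."""
    return any(metrics.get(name, 0.0) > limit
               for name, limit in THRESHOLDS.items())

def provision_server(pool: list) -> str:
    """Planned response: pull a standby server from the utility pool."""
    return pool.pop() if pool else "none-available"

standby_pool = ["standby-2", "standby-1"]
reading = {"cpu_pct": 97.2, "mem_pct": 40.0, "error_rate": 0.01}
if needs_failover(reading):
    replacement = provision_server(standby_pool)
    print("provisioned", replacement)  # diagnose the failed box later
```

The point is the inversion of workflow: the automated response comes first, and human diagnosis follows at leisure.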

That automation opportunity is rapidly expanding, moreover, into entire application platforms that are provisioned in utility fashion. The Web services platform offered by Salesforce.com Inc. is expanding from a portfolio of hosted applications into a tool kit that readily extends custom applications using the Sforce tools that are already incorporated into development products such as Borland Software Corp.'s JBuilder X.

What makes a utility offering compelling

  • Incremental capacity adjustment
  • Superior visibility of resource usage and cost
  • Reduction of budget and technical risk through pay-per-use pricing
Although they might seem quite diverse, the offerings that are described here have three important traits in common: They offer ease of incremental expansion, they improve the visibility of IT costs to enterprise decision makers, and they protect the enterprise from technology risk by enabling buyers to pay for results instead of placing bets on IT products. These are the benefits that are making utility computing models the new focus of enterprise IT plans.

But buyers should not expect to get these benefits merely by opening a package or signing an agreement. "Process is crucial to operating a pool of resources in a way that improves service and lowers cost," said Bill Mooz, senior director of utility computing at Sun Microsystems Inc. "You need processes to operate a data center and to turn your utility into a business service that's generating value."

And if that process development task sounds time-consuming, it is—but no more so than any other choice of whether to build, buy or rent a needed resource.

Nor is there any rush. "The technologies underlying utility computing are in a state of evolution that's going to last for a little while," said Mooz. "I would not recommend to anyone that they switch their entire environment to a utility model overnight, but if you approach it selectively and intelligently, I think you can get high returns today."

Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.
