BURLINGAME, Calif.—Utility computing holds promise because of its basis in lower-cost hardware and software, but enterprises require better ways to manage and provision their computing resources before they can fully take advantage of the model, said panelists at a Linux conference here on Wednesday.
To a large extent, enterprises have moved into utility computing by accident as they have expanded their use of x86-based systems and the open-source Linux operating system, according to a panel at the OSDL Enterprise Linux Summit. As more applications and business processes rely on that architecture, demands on enterprise data centers to manage disparate systems have increased dramatically.
“The way Linux systems found the route into the enterprise was purely by accident. They backed into it,” said Akmal Khan, president and chief operating officer of San Francisco-based Levanta Inc.
“Nobody planned an enterprise data center with Linux in mind, and one piece of advice is to start thinking of it that way.”
Khan said that the proliferation of the utility computing model puts pressure on data centers, which are struggling not only to manage the upkeep of the systems but also to find enough space to house them. His company provides management and deployment software for Linux systems.
Part of the challenge comes from a fundamental shift in the operating system. Where data centers had relied on Unix running on mainframes or other centralized hardware that scaled up, they are now focused on Linux running on servers whose numbers are continually expanding, or scaling out, Khan said.
“The phenomenon that we all have been seeing over the past few years is that Linux is not Unix,” Khan said. “There’s a big difference.”
Another challenge to the progress of utility computing is defining the model. Market research firm IDC classifies utility computing as a combination of six layers of software and categorizes it into some 21 functional software markets, said Dan Kusnetzky, an IDC vice president.
The notion of utility computing, also called on-demand computing, covers a swath of concepts from grid computing to buying applications and computing resources as a service, panelists said.
“You have 14 buzzwords, and if you ask 10 people what grid or utility computing is, then you’ll get 10 answers,” said Rob Gingell, executive vice president and chief technology officer of Cassatt Corp. “It’s more a confluence of trends.”
The trends include the move to commodity storage, networking gear, and computing hardware, and the cannibalization of Unix by the growth of its open-source cousin, Linux, Gingell said. Cassatt, of San Jose, also provides enterprises with software and services for managing utility computing.
Within enterprises, the utility computing model will vary widely, said Carl Kesselman, a member of the Globus Alliance and chief scientist at Univa Corp. The ultimate value of it, though, will come from the underlying hardware and operating system becoming more invisible—or an “abstraction”—and enterprises using utility computing to support new business processes, he said.
“We should not expect or anticipate homogeneity or uniformity with respect to the operating system or the policy, or with respect to the way things are deployed,” Kesselman said.
But enterprises are seeking a more unified view of the utility computing world, Khan said. Established companies are trying to manage a wide array of hardware and OS configurations that have grown over time.
Because of those complexities, enterprises are taking longer than startup companies to embrace utility computing. New companies are at the forefront of utility computing now, but Khan said he expects enterprises to require another few years to migrate toward it.