SAN JOSE, Calif.–Self-configuring, self-managing data centers that use industry standards to oversee products from different vendors are still years away from reality, according to many of the 140 top researchers and IT vendor representatives gathered this week by IBM to generate support for its so-called autonomic computing initiative.
Researchers from leading vendors such as IBM and Microsoft Corp. and universities such as Cornell and the University of California-Berkeley, meeting at IBM's Almaden Research Center here, described scores of projects to build and test technologies that would allow large distributed systems to configure, monitor, optimize and heal themselves with little or no intervention from human administrators.
Officials from IBM and other key vendors, however, acknowledged they've only begun to define a new set of industry standards that would be required for hands-off management of large, distributed, heterogeneous systems.
“It will take quite a few years to get to [autonomic computing],” said Robert Morris, director of IBM's Almaden Research Center. “Right now we're not quite sure what standards groups to use to define the standards. It could be we'll have to work with selected peers or partners first.” So far, IBM hasn't identified partners with whom it's working to develop autonomic computing products and standards.
IBM Vice President for Server Group Technology and Strategy Irving Wladawsky-Berger said the company is keeping close tabs on standards evolving from the technical computing arena that could be used to enable autonomic computing in commercial environments. The Globus Project, a consortium of researchers from universities and national laboratories building technical computing grids, has begun to define what it calls the Open Grid Services Architecture, a collection of XML definitions and other tools that would allow systems to self-manage and interoperate over the Internet. The OGSA work has yet to be applied to commercial environments, however.
While the ultimate goal of autonomic computing is years off, vendors like Microsoft and IBM have made some progress rolling out products that make individual computing elements more self-managing. The latest release of Microsoft's SQL Server database product, SQL Server 7, for example, includes an Index Tuning Wizard and Analysis feature that performs “what-if” analysis using sample queries to automate database design decisions. Microsoft is working on improving the tuning wizard by refining exactly which statistics to collect and analyze, said Surajit Chaudhuri, senior researcher at Microsoft and leader of the company's Data Management, Exploration and Mining group.
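The “what-if” approach described above can be sketched in miniature: estimate the cost of a sample workload under each candidate index configuration and keep the cheapest. The workload, candidate indexes, and cost model below are all invented for illustration and are not Microsoft's actual algorithm.

```python
# Toy sketch of workload-driven "what-if" index selection. The cost
# model and data are hypothetical.
from itertools import combinations

# Sample workload: (query id, columns in the WHERE clause, frequency)
WORKLOAD = [
    ("q1", {"customer_id"}, 50),
    ("q2", {"order_date"}, 30),
    ("q3", {"customer_id", "order_date"}, 20),
]

CANDIDATE_INDEXES = [
    {"customer_id"},
    {"order_date"},
    {"customer_id", "order_date"},
]

def query_cost(where_cols, indexes):
    """Toy cost model: an index covering the predicate makes the query
    cheap (1 unit); otherwise assume a full table scan (100 units)."""
    return 1 if any(where_cols <= idx for idx in indexes) else 100

def workload_cost(indexes):
    return sum(freq * query_cost(cols, indexes)
               for _, cols, freq in WORKLOAD)

def best_configuration(max_indexes=2):
    """Enumerate index subsets up to a budget; keep the cheapest."""
    best = (workload_cost([]), [])
    for k in range(1, max_indexes + 1):
        for combo in combinations(CANDIDATE_INDEXES, k):
            cost = workload_cost(list(combo))
            if cost < best[0]:
                best = (cost, list(combo))
    return best

cost, config = best_configuration()
print(cost, config)
```

Real tuning tools replace the toy cost model with the optimizer's own cost estimates, which is what makes the analysis “what-if”: no index is actually built while alternatives are compared.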
IBM has made similar advances through its year-old eLiza Project, intended in part to spread many automated management features from its zSeries mainframes to other IBM server lines. IBM's iSeries servers, for example, have inherited from the mainframe world Chipkill memory technology, which provides hot-spare memory that lets systems heal automatically after a failure.
High-Level Nervous System
The autonomic computing vision at IBM and many academic research institutions, however, goes far beyond simply giving individual computing elements such as servers or routers the ability to manage themselves. The idea is, essentially, to create a high-level nervous system that would ultimately allow enterprises to define business policies and objectives and have entire computing environments manage themselves to deliver the necessary performance, availability, security or other outcomes.
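That policy-to-outcome loop can be sketched as a simple monitor-analyze-plan-execute cycle: an administrator states an objective, and the system adjusts itself until the objective is met. The server pool, load model, and response-time threshold below are invented for illustration.

```python
# Minimal sketch of a policy-driven autonomic control loop. All names
# and numbers are hypothetical.

POLICY = {"max_response_ms": 200}  # the business-level objective

class ServerPool:
    def __init__(self, servers=2):
        self.servers = servers

    def measured_response_ms(self, load):
        # Toy model: response time falls as servers are added.
        return load / self.servers

def control_loop(pool, load, policy, max_steps=10):
    """Monitor latency; scale out until the policy is satisfied."""
    for _ in range(max_steps):
        latency = pool.measured_response_ms(load)   # Monitor
        if latency <= policy["max_response_ms"]:    # Analyze
            return latency
        pool.servers += 1                           # Plan/Execute

pool = ServerPool(servers=2)
final = control_loop(pool, load=1000, policy=POLICY)
print(pool.servers, final)
```

The point of the “nervous system” vision is that the loop spans the whole environment, not one box: the same policy would drive servers, storage and network elements together.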
“Vendors like IBM have put some of the self-managing piece parts together, but they haven't been able to put together the high-level management layer of software that implements the total autonomic thing,” said James Cassell, group vice president at Dataquest Research in Tierra Verde, Fla.
Delivering that comprehensive level of self-management is becoming increasingly critical to IT vendors and their enterprise customers, however, speakers at the IBM-sponsored Almaden Institute conference said. That's because, as demand for computing grows, enterprises are being forced to build increasingly complex systems, composed of many more servers, storage devices and other elements. Enterprises are even expected to borrow the concept of computing grids–networks of systems that dynamically share capacity over Internet technologies.
As computing environments become more complex, however, enterprises need more people and more money to manage them. It's the growing management cost–more than any inability to sustain Moore's Law–that will ultimately limit IT innovation, said John Hennessy, a co-founder of MIPS Computer Systems Inc. and now president of Stanford University. “If we can't solve that problem, we will run increasingly into performance walls.”
While industry standards and products to support the grand vision of autonomic computing seem to be years off, many researchers are attacking the problem. At UC Berkeley, for example, the Telegraph project is developing concepts that would allow SQL queries to be re-optimized automatically on the fly, even in mid-execution.
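The mid-execution idea can be illustrated with a small sketch: predicates are reordered while a query runs, based on their observed selectivity, so the most selective filters run first. The data, filters, and reorder interval below are invented and greatly simplify the actual research.

```python
# Illustrative sketch of run-time query re-optimization: filters track
# their own selectivity and are re-sorted as rows flow through.

class AdaptiveFilter:
    def __init__(self, name, predicate):
        self.name = name
        self.predicate = predicate
        self.seen = 0
        self.passed = 0

    def __call__(self, row):
        self.seen += 1
        ok = self.predicate(row)
        self.passed += ok
        return ok

    @property
    def selectivity(self):
        # Fraction of rows that survive this filter (lower = run first).
        return self.passed / self.seen if self.seen else 1.0

def adaptive_scan(rows, filters, reorder_every=100):
    """Apply filters, re-sorting them by observed selectivity mid-query."""
    out = []
    for i, row in enumerate(rows, 1):
        if all(f(row) for f in filters):
            out.append(row)
        if i % reorder_every == 0:   # re-optimize while the query runs
            filters.sort(key=lambda f: f.selectivity)
    return out

rows = [{"a": i % 10, "b": i % 2} for i in range(1000)]
filters = [AdaptiveFilter("b_even", lambda r: r["b"] == 0),
           AdaptiveFilter("a_small", lambda r: r["a"] < 3)]
result = adaptive_scan(rows, filters)
print(len(result))
```

A static optimizer would fix the filter order before execution; the adaptive version keeps correcting it as the data's actual characteristics become visible.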
A project at Columbia University, called Kinesthetics eXtreme, would place Java-based agents or probes into legacy systems and tie them into a high-level set of policies and rules to allow for automated self-management of entire systems.
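The probes-plus-rules pattern the Columbia project describes can be sketched simply: probes report metrics from legacy components, and a central rule set maps conditions to repair actions. The metric names, thresholds, and actions below are hypothetical, not the project's actual design (and the sketch is in Python rather than Java for brevity).

```python
# Hedged sketch of probe readings evaluated against self-management
# rules. All names and thresholds are invented.

RULES = [
    # (metric reported by a probe, trigger condition, repair action)
    ("queue_depth", lambda v: v > 100, "restart_consumer"),
    ("heap_used_pct", lambda v: v > 90, "trigger_gc"),
]

def evaluate(probe_readings):
    """Return the repair actions fired by the current probe readings."""
    actions = []
    for metric, predicate, action in RULES:
        value = probe_readings.get(metric)
        if value is not None and predicate(value):
            actions.append(action)
    return actions

print(evaluate({"queue_depth": 250, "heap_used_pct": 40}))
```

The appeal for legacy systems is that the probes are bolted on from outside: the monitored application needs no modification to participate in the management loop.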
Even as IBM this week called for industrywide cooperation to develop autonomic computing standards and technologies, officials at one rival–Hewlett-Packard Co.–made it clear that competitive pressure to deliver on the concept has already begun. Officials at HP, which is about to begin shipping a second version of its Utility Data Center products, said the company probably won't make UDC APIs and other specifications publicly available until later this year.
Although HP Director of Always on Internet Infrastructure Solutions Nick van der Zweep said the company ultimately plans to conform to OGSA or other evolving standards, the company believes it can exploit an advantage by being the only enterprise IT vendor shipping a product that delivers on autonomic computing concepts today.
“We have a product today that executes on the vision,” said van der Zweep. “IBM is very much talking about the same ideas, but is unable to point to the kind of deliverables that we have.”