Red Hat Chief Technology Officer Brian Stevens has escalated the debate over whether the open-source Xen virtualization technology is ready for prime time, saying Novell was being irresponsible and risked damaging enterprises' first experiences with Xen.
The Xen technology lets users run multiple operating systems as guest virtual machines on the same hardware, allowing for the better utilization of resources.
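To make the "guest virtual machines" idea concrete: on a Xen host, the `xm list` management command prints one row per running guest domain, showing its name, memory allocation and state. A minimal Python sketch that parses an illustrative (made-up, not captured from a live hypervisor) copy of that output:

```python
# Hypothetical sample of `xm list` output; the domain names and
# values here are invented for illustration.
SAMPLE = """\
Name                ID  Mem(MiB)  VCPUs  State   Time(s)
Domain-0             0       512      2  r-----    123.4
sles10-guest         1      1024      1  -b----     45.6
rhel5-test           2      2048      2  -b----     12.0
"""

def parse_domains(listing: str):
    """Parse an `xm list`-style table into a list of dicts,
    one per guest domain, keyed by the column headers."""
    lines = listing.strip().splitlines()
    header = lines[0].split()
    domains = []
    for line in lines[1:]:
        fields = line.split()
        domains.append(dict(zip(header, fields)))
    return domains

if __name__ == "__main__":
    for dom in parse_domains(SAMPLE):
        print(dom["Name"], dom["Mem(MiB)"], dom["State"])
```

Domain-0 in the listing is the privileged management domain that Xen itself boots; every other row is a guest operating system sharing the same hardware.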
Novell baked Xen into its current SLES 10 (SUSE Linux Enterprise Server) product, which shipped in July. Red Hat, based in Raleigh, N.C., is including Xen in its upcoming RHEL 5 (Red Hat Enterprise Linux) release slated to ship late this year or early 2007.
“What makes us most nervous is putting a bad taste in someone's mouth around the Xen technology, which we think is business-transforming. We should not screw this thing up and put a cloud around Xen,” Stevens told eWEEK Aug. 16 in an interview.
“I would much rather a customer had a solid experience with Xen. I think they [Novell] are being cavalier. We know what we need to be enterprise-ready and we already have a checklist of everything we need for that. They [Novell] have decided it's more important to be first. That's fine and maybe makes sense for them,” he said.
For his part, Novell CTO Jeff Jaffe told eWEEK in an interview at the LinuxWorld Conference & Expo in San Francisco that the company had done an enormous amount of testing and firmly believed the Xen technology was ready.
“Could it be that Red Hat is embarrassed about the fact that they are six months late? This is the most transparent ploy and contradicts their own press release in March where they said Xen was ready. It's totally a joke,” he said.
But Stevens said his comments, and others from Red Hat executives on this issue, were not meant to attack Novell, but rather to be open with those Red Hat customers who, after hearing Novell's claims that Xen was ready for enterprise use, questioned why Red Hat was not already delivering it.
When asked by eWEEK whether he had run SLES 10 and seen its virtualization experience, Stevens said he had not. All Red Hat needed to know, he added, were the issues its team was currently fixing in the upstream XenSource code base, from data corruption to “everything else. Do we want to bring that out to the market? Absolutely not. We want to work and drive robustness,” he said.
Asked if some of these problems could be related to integrating Xen into Red Hat's operating system rather than Xen itself, Stevens said that was “absolutely not” the case. “The other stuff is pretty much done; all of the manager APIs, the installers, the console work is all done,” he said.
“This is about getting SMP [symmetric multiprocessing] support to work, and we just redid a lot of the network transmission and we got about a 5x improvement,” he said.
Red Hat is going to bring virtualization to millions of servers in a pervasive way, and that means ongoing hard work to meet the broad criteria of enterprise readiness, Stevens said.
Novell's Jaffe agreed that creating virtualization technology “is not trivial,” noting that Xen is an open-source project that came out of XenSource. Many of Novell's engineers, he said, participated in the Xen project.
“There is a lot of industry momentum and support around the Xen technologies, from the chip manufacturers to the system-level vendors, the Linux distributors and even Microsoft, with its recently announced relationship with XenSource,” he said.
Red Hat's Stevens said that while Novell's engineering team was doing great work, its management team had made a conscious decision to bring Xen out first. “Maybe there's an opportunity there, but we don't want to be first, we want to be right,” he said.
Red Hat Enterprise Linux 5 will not ship without Xen, and the company will delay its release if that technology is not yet ready, Stevens said.
“I still recommend VMware. Right now, VMware is rock-solid, it's robust and it doesn't matter what the application is. I am not going to go out in a cavalier way and try and displace VMware with something that is not ready,” he said.
IBM, based in Armonk, N.Y., agrees with Novell that Xen is technologically ready, but says it is going to exercise caution about its use at this early stage.
Kevin Leahy, the director for virtualization at IBM, which has been contributing to and helping with the development of Xen, said the technology is ready, but the question is whether it is proven.
“So that's what you come down to. There's lots of ready technology, but what surrounds technology is practice and skills and services and capability and support. Those are the things that make it enterprise-ready as opposed to technology-ready,” he said.
The distinction lies in whether practitioners and service providers have learned how to use and deploy the technology, whether there is support available and if there are proof-points that it could scale to whatever the requirements are, Leahy said.
“If we thought it wasn't ready at all, we would not have said we are going to provide support for SLES 10. But we are also going to be cautious in how we recommend people use it. The technology is ready. Now we need to start doing some real projects to help people do those kinds of things and establish the necessary discipline, practice, procedures and processes,” he said.
If an enterprise wanted to deploy Xen to 10,000 users tomorrow for a mission-critical application, “I don't think it would be a technology issue at all, but there would be huge issues about whether we could say we have the experience and knowledge and skills to do that. I wouldn't recommend that, as we do not yet have the experience to do that,” Leahy said.
Others in the virtualization market see it differently. Mike Grandinetti, the chief marketing officer at Virtual Iron Software, based in Lowell, Mass., told eWEEK that Xen will never be enterprise-ready on its own; it is the value each vendor adds that makes the difference between a solution being enterprise-ready and unstable.
“Initially, Novell and Red Hat thought that they could just take finished code from the Xen open-source project and wrap it into their Linux offerings. They never intended to invest any significant development effort [in working] on the virtualization services layer or the virtual infrastructure management layer,” he said.
Those vendors now understand how badly they underestimated what the Xen project was and was not, he said. “It was a project, not a product. There's a big difference.”
Novell deserves some credit, Grandinetti said, as it took the initiative, invested time in the project and came to market first with a solution that includes Xen. But current user feedback suggests it took a snapshot of the project too early, when the code was not quite ready for use.
“And Red Hat gets no credit at all. All they've done is complain and waffle back and forth on the readiness of Xen. Our approach has been different. We set out from the beginning to build an enterprise-ready virtualization solution on top of Xen, and we brought a core competency in architecture and software development to the task,” Grandinetti said.
Virtual Iron worked closely with the Xen community and companies like Advanced Micro Devices, Hewlett-Packard, IBM and Intel to build its own virtualization services and management layer on top of the Xen hypervisor, he said.
“Then we tested it thoroughly to make sure it was enterprise-ready. This was a tremendous development effort, and it's what really makes the difference between a Xen-based virtualization solution that's enterprise-ready and those that are not,” he said.