SAN ANTONIO—At first glance, there’s not much to see at the months-old Open Compute Lab here at the University of Texas at San Antonio. Off the main area is a windowless room populated by a few rows of cubicles, with a whiteboard stretching across one wall, colorfully decorated with various sketches, designs and notes.
Off that room is a smaller one featuring two racks of systems running through their tests. The main area itself is a wide-open space with its share of desks cluttered with various systems, and standing in the middle are two partially populated racks.
To the first-time visitor, it has a feel of a new apartment where most of the boxes have yet to be unpacked.
But what Paul Rad sees is potential, the beginnings of an operation that promises huge benefits for tech vendors, end users and university researchers alike.
“Come back a few months from now. This place will look a lot different,” Rad, director of the University of Texas at San Antonio (UTSA) Open Compute certification lab and vice president of research at Rackspace, told eWEEK during a recent visit to the lab. “There is a lot happening here.”
Facebook started the Open Compute Project (OCP) in 2011, kicking off a consortium that now counts such names as Intel, Advanced Micro Devices (AMD), Broadcom, Mellanox Technologies and Rackspace as members. Like other Web-based companies running huge, hyperscale data centers, Facebook was finding that off-the-shelf servers and other systems were too costly and not nearly power-efficient enough for its needs. Instead, the social networking giant built its own systems and software, resulting in 38 percent greater efficiency and 24 percent lower costs in its data center operations.
Facebook then open-sourced its hardware designs through the project in hopes of establishing standards for highly efficient data center and IT hardware. The consortium has since branched out beyond servers to cover storage, networking and hardware management.
At the Open Compute Summit in January, Frank Frankovsky, OCP president and chairman, announced the opening of two certification labs—one in Taiwan and the other at UTSA’s Cloud and Big Data Laboratory—which are tasked with testing systems for quality assurance and ensuring they meet OCP specifications. The lab already has certified AMD’s Open System 3.0 “Roadrunner” and Intel’s Decathlete server boards. In addition, officials with original design manufacturer (ODM) Quanta said in January that three of the vendor’s servers had been certified by the lab.
The testing and certification that the labs do are crucial to driving adoption of the open-source systems coming out of the OCP, according to Matt Kimball, senior strategic marketing manager at AMD. IT professionals are increasingly realizing that many of the servers running in their data centers are commodity products, and they are intrigued by the energy efficiency and cost savings the open systems promise.
However, they also understand that the premium they’re paying now for servers from Hewlett-Packard and Dell buys high levels of reliability and availability, and that if a server breaks, they know whom to call, Kimball told eWEEK.
“HP might have me pay a premium, Dell might have me pay a premium,” he said. “But there’s a reason for it.”
The UTSA certification lab plays the role of quality assurance—the lab’s seal of approval means a system has been rigorously tested and meets the OCP’s high standards for such metrics as reliability and interoperability, Kimball said.
“It’s really about peace of mind,” he said.
The interest in open hardware also illustrates an industrywide trend of customers looking for more customized data center solutions that address their particular needs, Kimball said.
“The concept of ‘one size fits all’ is going out the window,” he said. “People are starting to look at customized solutions as not just software optimized for hardware, but hardware optimized for software.”
Rad and Daniel Smolenski, senior network analyst in the UTSA College of Sciences and manager of the OCP lab, said the hardware is tested in stages, from the system level through the rack level and then in the school’s large cloud data center. And it won’t just be servers, they said. For example, the lab recently received a storage appliance from Quanta that was built according to Open Network Install Environment (ONIE) standards.
However, viewing the lab simply as a testing site is short-sighted, Rad said. The lab sits at a key intersection of vendors, end users, and UTSA faculty and staff, and it will have a major influence on all of those constituents, he said. The plan is to continue building out the cloud infrastructure at the university and then let the various stakeholders leverage the cloud for their needs, many of which overlap.
For example, tech vendors have a place to test their equipment, which in turn gives students at the school high-level, hands-on experience with these systems. Meanwhile, end users can use the lab and its cloud infrastructure to run proofs of concept (the lab currently is working with three large financial services firms on their tests) to see how these open systems will work in their environments and run their applications. Being able to use the UTSA cloud saves these businesses the time and expense of setting up their own test environments. And again, it’s the UTSA students who do much of the work around the testing, Rad said.
As UTSA’s work in cloud and big data grows, the expectation is that the university will become known as a “cloud school,” attracting students and researchers who will bring even more ideas to the environment. Rad calls it a “pinwheel,” where the lab’s significance in the field continues to grow.
But he also wants it to be more than just a test and certification facility. Rad sees it as a place for driving innovation in the areas of cloud computing and big data. In addition, open source is about collaborating with others, and the certification lab gives people a place to do that.
“To build gravity [around an effort], you need a place for people to go,” Rad said. “We are that place.”
Collaboration is a key part of any open-source movement, including the OCP. That includes collaboration among tech vendors that usually compete with one another, as well as among end users. It’s a culture these businesses need to embrace as new computing paradigms—from the cloud and virtualization to big data and mobility—place new demands on both vendors and end users.
For example, collaboration among businesses could result in significant cost savings when it comes to IT, Rad said. He spoke about the idea of “community clouds” among companies in the same fields, from financial services to health care. Businesses like Facebook and Google, because of their massive size and the huge amounts of IT they buy, can influence the supply chain and the prices suppliers charge, making it more cost-effective for them to build their own systems than to buy off-the-shelf servers.
Individually, end users—even the largest financial services firms—don’t have those kinds of economies of scale. However, if they joined together as a single entity when buying IT, they could carry the same weight as those larger Web companies, which could mean lower costs from suppliers, he said.
“They’re not Facebook, not alone,” Rad said. “But five of them coming together, they could present a problem as big as Facebook presents to the supply chain.”
For many industries, about 80 percent of what’s in a business’s data center is essentially the same as what’s in its competitors’ facilities. Agreeing on that common 80 percent could give these companies significant influence over pricing: they can work together on the 80 percent they share, then compete on the 20 percent that is unique to each of them.
UTSA, the Massachusetts Institute of Technology and the University of Notre Dame are working on a demonstration of a community cloud, Rad said.
The certification lab will be a key feature May 7-8, when the university hosts the OCP’s first Open BigCloud Symposium. Discussion at the event will focus on big data, cloud computing, Open Compute hardware, OpenStack software and software-defined networking.
In addition, the event will give the lab the opportunity to talk about its growth and its future. Rad said he will announce that in the six months since the lab started taking shape, its cloud infrastructure has grown rapidly to 6,600 compute cores. And just as the lab has signed on with the OCP to work on open-source hardware, Rad will discuss the work the organization is doing with the Apache CloudStack project and its open-source cloud orchestration and management software.