Open Compute Test Lab Head Sees Big Future for Facility

 
 
By Jeff Burt  |  Posted 2014-05-07

The Texas-based certification lab will become a center for open hardware innovation, the UTSA lab's director says.

SAN ANTONIO—At first glance, there's not much to see at the months-old Open Compute Lab here at the University of Texas at San Antonio. Off the main area is a windowless room holding a few rows of cubicles; a whiteboard covered with colorful sketches, designs and notes stretches across one wall.

Off that room is a smaller one featuring two racks of systems running through their tests. The main area itself is a wide-open space with its share of desks cluttered with various systems, and standing in the middle are two partially populated racks.

To the first-time visitor, it has a feel of a new apartment where most of the boxes have yet to be unpacked.

But what Paul Rad sees is potential, the beginnings of an operation that promises huge benefits for tech vendors, end users and university researchers alike.

"Come back a few months from now. This place will look a lot different," Rad, director of the University of Texas at San Antonio UTSA Open Compute certification lab and vice president of research at Rackspace, told eWEEK during a recent visit to the lab. "There is a lot happening here."

Facebook started the Open Compute Project (OCP) in 2011, kicking off a consortium that now counts such names as Intel, Advanced Micro Devices, Broadcom, Mellanox Technologies and Rackspace as members. Like other Web-based companies running huge, hyperscale data centers, Facebook found that off-the-shelf servers and other systems were too costly and not nearly power-efficient enough for its needs. Instead, the social networking giant built its own systems and software, resulting in 38 percent greater energy efficiency and 24 percent lower costs in its data center operations.

Facebook then open-sourced its hardware designs, creating the Open Compute Project in hopes of creating standards for highly efficient data center and IT hardware. The consortium has since branched out beyond servers to cover storage, networking and hardware management.

At the Open Compute Summit in January, Frank Frankovsky, OCP president and chairman, announced the opening of two certification labs, one in Taiwan and the other at UTSA's Cloud and Big Data Laboratory. The labs are tasked with testing systems for quality assurance and ensuring they meet OCP specifications. The UTSA lab already has certified AMD's Open System 3.0 "Roadrunner" and Intel's Decathlete server boards. In addition, officials with original design manufacturer (ODM) Quanta said in January that the vendor had had three servers certified by the lab.

The labs' testing and certification of systems is crucial to driving adoption of the open-source designs coming out of the OCP, according to Matt Kimball, senior strategic marketing manager at AMD. IT professionals are increasingly realizing that many of the servers running in their data centers are commodity products, and they are intrigued by the energy efficiency and cost savings the open systems promise.

However, they also understand that the extra money they're paying now for servers from Hewlett-Packard and Dell buys high levels of reliability and availability, and that if a server breaks, they know whom to call, Kimball told eWEEK.

"HP might have me pay a premium, Dell might have me pay a premium," he said. "But there's a reason for it."



 
 
 
 
 
 
 
 
 
 
 
 
 

Submit a Comment

Loading Comments...
 
Manage your Newsletters: Login   Register My Newsletters























 
 
 
 
 
 
 
 
 
 
 
Rocket Fuel