Facebook's First Open Compute Project Data Center Is Now Complete

By Chris Preimesberger  |  Posted 2012-08-17

A Modest Sign of the IT Times

A modest sign with Facebook's simple logo at the corner of a large parcel outside Prineville, in north central Oregon, is the only indication that the huge social network has a presence in the area.

Putting the Power Usage Effectiveness Out There for All to See

Most data centers don't give tours, much less post a real-time power usage effectiveness (PUE) meter right in the lobby, but Facebook has done both. PUE is a standard measure of a data center's energy efficiency: it is calculated by dividing the total amount of power entering the facility by the power actually used by the IT equipment inside it. The lower the number, the better; a PUE of 1.0 is the theoretical minimum, and the U.S. Environmental Protection Agency considers 1.5 a best practice. On the day of the visit, Facebook's data center was running at 1.11, and it typically averages between 1.05 and 1.18.
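
The meter's arithmetic is straightforward. The figures below are hypothetical, chosen only to reproduce the 1.11 reading posted that day; they are not Facebook's actual measurements:

    # Hypothetical readings, for illustration only; not Facebook's actual figures.
    total_facility_kw = 2220.0   # everything entering the building: servers, cooling, lighting, losses
    it_equipment_kw = 2000.0     # power consumed by the IT gear itself

    pue = total_facility_kw / it_equipment_kw
    print(f"PUE = {pue:.2f}")    # prints "PUE = 1.11", the number shown in the lobby that day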

Comprehensive Air Filtration for Cooling

Data centers, especially massive transactional facilities such as Facebook's, which handles billions of Web and mobile transactions per minute, get very hot with all those servers working at the same time. With its first custom-designed and built data center in Prineville, Ore., Facebook took special care to both filter and humidify the air coming into the facility, and to ensure that the hot exhaust doesn't find its way back inside. Site director Ken Patchett shows journalists an air chamber where outside air is pumped through cooled-water walls and special filters before it enters the actual server rooms.

Inexpensive Filters

Patchett shows how light and easy to remove the many air filters are. This is not a common method of moving air through a data center: most conventional facilities use large rooftop air-conditioning units to pull in air from above and push it through ducts into hot server aisles. In Facebook's design, the filtered outside air moves laterally through the server rooms and maintains steady temperatures throughout. Intelligence is built into the system: as it gets hotter outside, the system works harder inside; when it's cooler outside, it winds down to save power.
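
Facebook has not published its control logic, but the behavior Patchett describes amounts to a simple feedback rule. The sketch below is purely illustrative; every threshold and name in it is invented:

    def fan_speed_percent(outside_temp_f):
        """Toy rule: run the supply fans harder as the outside air warms up."""
        if outside_temp_f <= 60.0:
            return 30.0              # cool outside: coast and save power
        if outside_temp_f >= 80.0:
            return 100.0             # hot outside: full airflow (plus evaporative cooling)
        # scale linearly between the two invented set points
        return 30.0 + 70.0 * (outside_temp_f - 60.0) / 20.0

    print(fan_speed_percent(93))     # 100.0 on a 93-degree day like Aug. 16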

Solar Pitches In

Facebook is very conscious of its carbon footprint and uses solar power to augment the hydroelectric power drawn from the Columbia River, about 80 miles north. At the rear right is Facebook's second data center building, which is still under construction. That facility will focus solely on data storage, including all photos, videos, documents, cached Web pages, user information, logs and everything else.

Roof With a View

A view from the roof of the Facebook data center (with the second building at the left rear) in Prineville, Ore., shows the type of environment chosen for this $210 million investment, although considerably more capital has gone into the project since that figure was released last spring. In summer the air is mostly arid and temperatures can get hot; it was 93 degrees on Aug. 16. Conversely, winters can get very cold. Yet the data center's automated controls handle the huge variance in temperature easily.

Store-Bought Servers

Half the servers in the data center are what Facebook calls "store-bought" servers from Rackable, Dell, HP, IBM and several other vendors. These are gradually being phased out as they reach the end of their normal three- to five-year life spans, having spun their disks 24/7, mostly at top speed.

Custom Facebook MemCache Racks

Director Ken Patchett shows the racks of custom-designed Facebook MemCache servers that are optimized to move Web content.  You won't find video cards or anything extraneous inside these machines.

Close Look at a 1Gb Server

These custom-built 1Gb servers are stripped down to the bare essentials for what Facebook needs to do: move tons of Web-based data fast, and to the right place, for however many of its 900 million users are on the service at any one time. Officials did not hazard a guess at how many of these machines are firing away in this one data center. There are 14 server aisles, which Facebook calls data halls. Each half-aisle holds approximately 20 to 25 racks of servers, with an average of 22 19-inch-wide servers per rack. Do the math and it comes out to roughly 15,400 servers, or about 7,700 on each side of the main central aisle.
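
The back-of-the-envelope arithmetic behind that estimate, taking the upper end of the quoted rack count, works out as follows (a rough estimate, not an official tally):

    aisles = 14                  # the "data halls"
    half_aisles = aisles * 2
    racks_per_half_aisle = 25    # upper end of the 20-to-25 range quoted on the tour
    servers_per_rack = 22        # average of 19-inch-wide servers per rack

    total = half_aisles * racks_per_half_aisle * servers_per_rack
    print(total)                 # 15,400 servers in all
    print(total // 2)            # 7,700 on each side of the main central aisle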

Close Look at a 10Gb Server

These custom-built 10Gb servers are stripped down to the bare essentials, just like their 1Gb predecessors, only they pack a lot more I/O power and are smaller. Facebook engineers are moving the custom-built machines into the data center racks as the older machines complete their three- to five-year lifecycles.

High-Security Section

Anything involving financial or personal information, such as credit card data, receipts and purchase orders, is kept on servers inside this high-security cage. Not even the data center manager can gain entry to this area easily. Once someone is inside, he or she is videotaped and monitored at all times. "We take our users' trust very seriously," Patchett said.

At Work in the Aisles

A Facebook technician uses one of the company's custom-designed work carts to bring tools and new parts to a server that has fallen out of commission. Every server is connected to all the others, so when a disk or any other component burns out or fails for any reason, the I/O handled by that unit simply shifts to another one at the moment of the outage.
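
Facebook did not detail the mechanism, but the behavior described (traffic shifting instantly away from a failed unit) resembles ordinary failover across replicas. A minimal hypothetical sketch, with invented class and server names:

    class ServerDown(Exception):
        """Raised when a unit's disk or another component has failed."""

    class Replica:
        def __init__(self, name, healthy=True):
            self.name, self.healthy = name, healthy
        def read(self, key):
            if not self.healthy:
                raise ServerDown(self.name)
            return f"{key} served by {self.name}"

    def fetch(key, replicas):
        """Skip any replica that is out of commission and use the next one."""
        for server in replicas:
            try:
                return server.read(key)
            except ServerDown:
                continue                 # I/O simply moves to another unit
        raise RuntimeError("all replicas unavailable")

    pool = [Replica("rack7-u12", healthy=False), Replica("rack9-u03")]
    print(fetch("profile:12345", pool))  # answered by the surviving unit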

Lighting Up the Big Like Button

If you push in this 3-foot-wide Like button at the Prineville data center, it lights up and stays lit for a while before, being energy-conscious, turning itself off. It was a big hit at the facility's grand opening a few months ago.

Workplace Setting

At the moment, 64 full-time employees run the Facebook data center in Prineville. The construction project created about 250 jobs over two and a half years and took roughly 1 million man-hours of labor to complete.

Whimsicality

Strangely, these little gnomes show up in unexpected places throughout the data center, Patchett said.  Here they seem to be having a staff meeting.
