Small Group of Journalists Takes First Tour

 
 
By Chris Preimesberger  |  Posted 2012-08-19
Facebook on Aug. 16 invited a small group of journalists to tour the new facility. Go here to see a 16-page slideshow of highlights from that tour.

Most data centers don't give tours, much less post a real-time PUE meter right in the lobby, but Facebook does both. Power usage effectiveness (PUE) is a standard measurement of a data center's energy efficiency. It is determined by dividing the total amount of power entering the data center by the power actually used by the IT equipment -- the servers, storage and networking gear -- inside it. The lower the number, the better; a PUE of 1.0, which would mean every watt goes to computing, is the lowest possible.

The U.S. Environmental Protection Agency considers a PUE of 1.5 a best practice; Facebook's data center on this day was running at 1.11 and averages between 1.05 and 1.18.
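To make the arithmetic concrete, here is a minimal sketch of the PUE calculation in Python, using hypothetical meter readings rather than Facebook's actual figures:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,110 kW entering the building,
# 1,000 kW consumed by the servers, storage and network gear.
print(round(pue(1110.0, 1000.0), 2))  # 1.11 -- roughly what the lobby meter showed
# At the EPA's 1.5 best-practice mark, the same IT load would mean 1,500 kW entering the building.
```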

Air Filtration System Unique

Data centers, especially super-high-transaction facilities such as Facebook's that handle billions of Web and mobile transactions per minute, get very hot with all those servers working at the same time. Facebook took special care both to filter and humidify the air coming into the facility and to make sure the hot exhaust doesn't find its way back inside.

The airflow control process is also unconventional. The company eschewed the rooftop or off-to-the-side air-conditioning units that conventional data centers most often deploy, seeing them as far too energy-hungry. Instead, chief designer Jay Park and his group devised a vacuum-like setup that pulls in outside air along an entire 300-foot-long wall and then forces it laterally through a second wall containing purified-water sprays -- a subprocess that amounts to a huge mister. The air that comes through that wall is as cool and comfortable as a high-end air-conditioning system in a department store.

Finally, the cool, newly humidified air is forced through a wall of hundreds of lightweight paper filters before being drawn down into the data center -- not through standard ducts, but through 13 wells, each 6 feet square and 14 feet deep, placed around the always-warm data halls containing some 15,000 servers that crank away 24/7.

This air moves down through the server rooms and maintains steady temperatures throughout. Intelligence is built into the system: As it gets hotter outside, the system works harder to keep everything cool inside; when it's cooler outside, the system winds down to save power. This is what contributes so much to the data center's low PUE.
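To illustrate that behavior, here is a rough, hypothetical sketch of outdoor-temperature-driven fan control -- not Facebook's actual control software, and the setpoints are assumptions:

```python
def fan_speed_percent(outdoor_temp_f: float) -> float:
    """Hypothetical proportional control: ramp fan speed between a low and a high outdoor setpoint."""
    LOW_F, HIGH_F = 55.0, 95.0          # assumed setpoints, not Facebook's real values
    MIN_SPEED, MAX_SPEED = 20.0, 100.0
    if outdoor_temp_f <= LOW_F:
        return MIN_SPEED                # cool outside: wind down and save power
    if outdoor_temp_f >= HIGH_F:
        return MAX_SPEED                # hot outside: work as hard as possible
    frac = (outdoor_temp_f - LOW_F) / (HIGH_F - LOW_F)
    return MIN_SPEED + frac * (MAX_SPEED - MIN_SPEED)

print(fan_speed_percent(93.0))  # 96.0 -- near full tilt on the 93-degree day of the tour
```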

Conscious of Carbon Footprint

Facebook, very conscious of its carbon footprint, uses solar power to augment hydroelectric power drawn from the Columbia River, about 80 miles north. Solar accounts for only a small percentage at this point, but the company plans to increase the sun's contribution over time.

The Prineville environment was carefully chosen for this $210 million investment (although a good deal more capital has gone into the project since that number was released last spring). In summer, the climate is arid and temperatures can get hot -- it was 93 degrees on this day in August. Conversely, winters can get very cold, yet the data center is agile enough to handle the huge variance in temperature easily with its automated controls.

Half the servers in the data center are what Facebook calls "store-bought" servers from Rackable, Dell, HP, IBM and several other vendors. These are slowly being phased out as they reach the end of their normal three-to-five-year lifespans of spinning disks 24/7 -- mostly at top speed.

The other half of the server farm consists of custom-designed MemCache servers built expressly for moving Web content. You won't find video cards or anything else extraneous inside these machines. They're not for everybody, Facebook Director of Site Operations Tom Furlong said, but they work perfectly for Facebook's purposes.

Custom-Built Servers

Facebook is using custom-built 1Gb and 10Gb servers stripped down to the bare essentials of what Facebook needs to do: move tons of Web-based data fast, and to the right place, for as many of its 900 million users as want to use the service at any one time.

Officials did not hazard a guess as to how many of these are firing away in this one data center. There are 14 server aisles; Facebook calls them data halls. Each half-aisle has approximately 20 to 25 racks of servers, with an average of 22 19-inch-wide servers per rack. Do the math and it comes out to about 15,400 servers -- 7,700 on each side of the main central aisle.
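As a back-of-the-envelope check on that figure -- a sketch using the rough numbers quoted above, not an official count:

```python
# Rough tally from the figures cited on the tour.
aisles = 14                  # the "data halls"
half_aisles = aisles * 2     # each aisle has two halves
racks_per_half_aisle = 25    # upper end of the 20-to-25 range
servers_per_rack = 22        # average cited per rack

total = half_aisles * racks_per_half_aisle * servers_per_rack
print(total, total // 2)     # 15400 total, 7700 on each side of the central aisle
```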

Anything to do with financial or personal information -- credit card numbers, receipts, purchase records and so on -- is kept on servers inside a high-security cage. Not even the data center manager can get into this area easily. Once someone is inside, he or she is videotaped and monitored at all times. "We take our users' trust very seriously," Data Center Director Ken Patchett said.

Facebook technicians use custom-designed work carts to bring tools and new parts to any server that falls out of commission. Every server is connected to all the others, so that when a disk or any other component burns out or stops for any reason, the I/O running through that unit simply fails over to another one at the moment of the outage.

At the moment, 64 full-time employees run the Facebook data center in Prineville. About 250 construction jobs over a span of two-and-a-half years added up to roughly 1 million man-hours of work on the massive project.

Prineville is but the first major data center project for the Menlo Park, Calif.-based company. Others currently are being built in Forest City, N.C., and in Lulea, Sweden, using the same principles as Prineville.

Second Facility in the Offing

Facebook is in the process of constructing a second data center building next door to its original Prineville facility. The second location will focus on storage of data only -- including all photos, videos, documents, cached Web pages, user information, logs -- everything. That one's probably a year away from completion.

Furlong, one of the designers of the Prineville facility, told eWEEK that in addition to the center helping to carry out the mission of Facebook, "we wanted to build the most efficient system possible. It's the combination of the data center, the server and the software that makes it happen. And now we know how to do this in various regions of the world so that they all work together."

Patchett summed up Facebook's -- and its newest data center's -- mission in this way: "Anywhere you are in the world, at any given time, no matter what day or what hour it is, there's always somebody waking up or going to sleep. And when they get on and they go to Facebook.com and they want to interact with their friends and see what's going on in the world, you've got to be available."

Chris Preimesberger is Editor of Features and Analysis for eWEEK. Twitter: @editingwhiz