1. A Modest Sign of the IT Times
2. Putting the Power Usage Effectiveness Out There for All to See
Most data centers don’t give tours, much less post a real-time power usage effectiveness (PUE) meter right in the lobby, but Facebook has done both. PUE is a standard measurement of a data center’s energy efficiency. It is calculated by dividing the total power entering the facility by the power actually consumed by the IT equipment inside it. The lower the number, the better; a PUE of 1.0 is the lowest possible. The U.S. Environmental Protection Agency considers a PUE of 1.5 a best practice. On this occasion, Facebook’s data center was running at 1.11; it averages between 1.05 and 1.18.
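The arithmetic behind the lobby meter is simple. As a quick sketch, with made-up kilowatt figures chosen only to illustrate the ratio (they are not Facebook’s actual readings):

    # PUE = total power entering the facility / power consumed by the IT equipment.
    # The kilowatt figures below are illustrative only, not Facebook's actual readings.

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Return the power usage effectiveness ratio."""
        return total_facility_kw / it_equipment_kw

    # Example: 5,550 kW entering the building, 5,000 kW reaching the servers.
    print(round(pue(5550, 5000), 2))  # 1.11, the number on the lobby meter that day
    # A PUE of 1.0 would mean every watt goes to IT gear; the EPA's
    # best-practice benchmark is 1.5.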
3. Comprehensive Air Filtration for Cooling
Data centers, especially massive transactional facilities such as Facebook’s, which handles billions of Web and mobile transactions per minute, get very hot with all those servers working at the same time. With its first custom-designed and custom-built data center in Prineville, Ore., Facebook took special care to both filter and humidify the air coming into the facility, and to ensure that the hot exhaust doesn’t find its way back inside. Site director Ken Patchett shows journalists an air chamber where outside air is pumped through walls of cooled water and special filters before it enters the actual server rooms.
4. Inexpensive Filters
Patchett shows how light and easy to remove the many air filters are. This is not a common method of moving air through a data center; most conventional facilities use large rooftop air-conditioning units to pull air in from above and push it through ducts into hot server aisles. Here, the air moves laterally through the server rooms, maintaining steady temperatures throughout. Intelligence is built into the system: as it gets hotter outside, the system works harder inside, and when it’s cooler outside, the system winds down to save power.
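As a very rough sketch of that idea, the control amounts to scaling airflow with outside temperature. The thresholds and the simple linear ramp below are hypothetical, not Facebook’s actual building-management logic:

    # Hypothetical sketch of outside-temperature-driven cooling control.
    # Setpoints and the linear ramp are illustrative, not Facebook's actual logic.

    MIN_FAN_PCT = 20      # idle airflow when outside air is cool
    MAX_FAN_PCT = 100     # full airflow on the hottest days
    COOL_SETPOINT_F = 65
    HOT_SETPOINT_F = 95

    def fan_speed_pct(outside_temp_f: float) -> float:
        """Scale fan speed linearly with outside temperature between two setpoints."""
        if outside_temp_f <= COOL_SETPOINT_F:
            return MIN_FAN_PCT
        if outside_temp_f >= HOT_SETPOINT_F:
            return MAX_FAN_PCT
        span = HOT_SETPOINT_F - COOL_SETPOINT_F
        fraction = (outside_temp_f - COOL_SETPOINT_F) / span
        return MIN_FAN_PCT + (MAX_FAN_PCT - MIN_FAN_PCT) * fraction

    print(fan_speed_pct(93))   # near full speed on a 93-degree August day
    print(fan_speed_pct(40))   # winds down to the minimum in cooler weather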
5. Solar Pitches In
Facebook is very conscious of its carbon footprint and uses solar power to augment the power drawn from the Columbia River, about 80 miles to the north. At the rear right is Facebook’s second data center building, which is still under construction. That facility will focus solely on data storage: all photos, videos, documents, cached Web pages, user information, logs, everything.
6. Roof With a View
A view from the roof of the Facebook data center in Prineville, Ore. (the second building is visible at the left rear) shows the kind of environment chosen for this $210 million investment, although considerably more capital has gone into the project since that figure was released last spring. In summer the air is mostly arid and temperatures can get hot; it was 93 degrees on Aug. 16. Conversely, winters can be very cold. Yet the data center’s automated controls let it handle that huge variance in temperature easily.
7. Store-Bought Servers
8. Custom Facebook MemCache Racks
9. Close Look at a 1Gb Server
These custom-built 1Gb servers are stripped down to the bare essentials for what Facebook needs to do: move tons of Web-based data fast, and to the right place, for however many of its 900 million users are on the service at any one time. Officials would not hazard a guess as to how many of these are firing away in this one data center. There are 14 server aisles, which Facebook calls data halls. Each half-aisle holds approximately 20 to 25 racks of servers, with an average of 22 19-inch-wide servers per rack. Do the math and it comes out to roughly 15,400 servers, about 7,700 on each side of the main central aisle.
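For reference, here is that back-of-the-envelope math. The 25-racks-per-half-aisle figure is the high end of the stated range and is what reproduces the quoted total:

    # Back-of-the-envelope server count, using the figures quoted on the tour.
    aisles = 14                    # the "data halls"
    half_aisles_per_aisle = 2      # one half-aisle on each side of the central aisle
    racks_per_half_aisle = 25      # high end of the stated 20-25 range
    servers_per_rack = 22          # average quoted on the tour

    servers_per_side = aisles * racks_per_half_aisle * servers_per_rack
    total_servers = servers_per_side * half_aisles_per_aisle

    print(servers_per_side)   # 7700 per side of the main central aisle
    print(total_servers)      # 15400 in all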
10. Close Look at a 10Gb Server
These custom-built 10Gb servers are stripped down to bare essentials, just like their 1Gb predecessors, only they pack a lot more I/O power into a smaller footprint. Facebook engineers are moving the custom-built machines into the data center racks as the older machines complete their life cycles of three to five years.
11. High-Security Section
Anything to do with financial or personal information—credit card data, receipts, purchase orders and so on—is maintained on servers behind this high-security cage. Not even the data center manager can get into this area easily. Once inside, a person is on video and monitored at all times. “We take our users’ trust very seriously,” Patchett said.
12. At Work in the Aisles
A Facebook technician uses one of the company’s custom-designed work carts to bring tools and replacement parts to a server that has gone out of commission. Every server is networked with the others, so when a disk or any other component burns out or fails for any reason, the I/O running through that unit simply shifts to another machine the moment the outage occurs.
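Loosely sketched, the effect is like a router that simply skips unhealthy machines. The node names and health flags below are hypothetical, since Facebook has not published how its rerouting actually works:

    # Loose illustration of steering I/O away from a failed machine.
    # Node names and health flags are hypothetical, not Facebook's actual system.
    import random

    healthy = {"server-a": True, "server-b": True, "server-c": True}

    def route_request(payload: str) -> str:
        """Send the request to any machine still marked healthy."""
        candidates = [name for name, ok in healthy.items() if ok]
        if not candidates:
            raise RuntimeError("no healthy servers available")
        return payload + " handled by " + random.choice(candidates)

    # A disk dies on server-b: mark it unhealthy and traffic flows to the others.
    healthy["server-b"] = False
    print(route_request("GET /photo/123"))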
13. Lighting Up the Big Like Button
14. Workplace Setting
15. Whimsicality
Strangely, these little gnomes show up in unexpected places throughout the data center, Patchett said. Here they seem to be having a staff meeting.