Google Explains How Green Is Good for Its Data Centers

Green has been very, very good to Google. That's the theme of Google's new Web site on data center efficiency. On the site, Google explains how the cost of the IT infrastructure that powers its search engine and other applications is lower than the industry average. Google uses bare-bones parts in its servers and recycles water to meet its green IT goals.

Google is very coy about what exactly makes up its infrastructure. For example, ask how many servers the company has or whose gear it's running, and you'll get sly smiles.
Google doesn't want us to know how many commodity servers it runs in its dozens of data centers around the world. But now that green is officially the new "good," the company has created a Web site that details, in general terms and even with some basic formulas, how its servers and data centers are more power-efficient than those run by others.
On the site, the search giant details how it has reduced the energy its data centers need, to the point where, Google claims, "in the time it takes to do a Google search, your own personal computer will use more energy than we will use to answer your query."
It's not clear what formula proves that, but Google provides a convincing five-step plan for data center efficiency here. In short, Google claims its servers are more efficient because they use better voltage regulators, avoid graphics chips and use fans wisely to cool machinery.

To see pictures of Google's Lenoir Data Center, click here.

Speaking of cooling, Google also uses evaporative cooling to chill its gear, and it recycles that water to avoid drawing extra drinking water. By 2010, Google said, recycled water will meet 80 percent of the company's total water consumption.
Google uses the PUE (Power Usage Effectiveness) metric, and it's here that some industry watchers are scrutinizing Google for any trickery. PUE is the ratio of the total power consumed by a data center to the power consumed by the IT equipment in the facility.
A PUE of 2.0 means that for every watt of IT power, an additional watt is consumed to cool and distribute power to the IT equipment. Clearly, the ideal PUE is 1.0, but that seems unattainable. Google claims six of its data centers average out to a PUE of 1.2, which is phenomenal.
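The PUE arithmetic described above is simple enough to sketch in a few lines of code. The kilowatt figures below are hypothetical, chosen only to illustrate the 2.0 and 1.2 ratios mentioned in the article:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    Total facility power includes the IT load plus the overhead spent on
    cooling and power distribution, so PUE is always >= 1.0.
    """
    return total_facility_kw / it_equipment_kw


# A PUE of 2.0: every watt of IT load needs a matching watt of overhead.
# (Hypothetical facility drawing 1,000 kW total, 500 kW of it for IT gear.)
print(pue(1000, 500))  # → 2.0

# The 1.2 average Google reports for six of its data centers corresponds to
# only 0.2 W of overhead per IT watt, e.g. 600 kW total on a 500 kW IT load.
print(pue(600, 500))   # → 1.2
```

In other words, at a PUE of 1.2 the non-IT overhead is one-sixth of the total draw, versus half the total draw at a PUE of 2.0.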