11 Best Practices to Save on Data Center Power Draw

 
 
By Chris Preimesberger  |  Posted 2008-11-13
Data center managers could save millions of kilowatt hours annually by implementing 11 best practices, says research firm Gartner, and most of these projects can be completed with little or no budget or effort.


Here is Gartner's list of green IT best practices for data center managers:

Plug holes in the raised floor: Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10 percent of the energy used for data center cooling.

Install blanking panels: Any unused position in a rack should be covered with a blanking panel, which manages airflow by preventing the hot air leaving one piece of equipment from entering the cold-air intake of other equipment in the same rack. When the panels are used effectively, supply-air temperatures are lowered by as much as 22 degrees Fahrenheit, greatly reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data center.

Coordinate CRAC units: Older CRAC (computer room air-conditioning) units operate independently with respect to cooling and dehumidifying the air, so one unit may humidify while a neighboring unit dehumidifies. These units should be tied together with newer control technology so that their efforts are coordinated, or humidification should be removed from them altogether and handled by a single, newer dedicated unit.

Improve underfloor airflow: Older data centers typically have constrained space underneath the raised floor that is used not only to distribute cold air but also to route data and power cables. Many older data centers have accumulated such a tangle of these cables that airflow is restricted, so the underfloor should be cleaned out to improve airflow.

Implement hot aisles and cold aisles: In traditional data centers, racks were set up in what is sometimes referred to as "classroom style," where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past 10 years demonstrate that organizing rows into hot aisles and cold aisles is better for controlling the flow of air in the data center.

Install sensors: A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be manually collected and transferred into a spreadsheet, where it can be further analyzed. Even this minimal investment in instrumentation can provide great insight into the dynamics of possible data center temperature problems and can provide a method for analyzing the results of improvements made to data center cooling.
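
As a rough illustration of that spreadsheet-style analysis, the Python sketch below reads a hypothetical CSV export of sensor readings (a file named rack_temps.csv with columns timestamp, sensor_id and temp_f, both assumptions for this example), summarizes each sensor and flags peaks above an illustrative 80 F threshold.

    # Minimal sketch: summarize exported sensor readings to spot possible hot spots.
    # Assumes a hypothetical CSV export ("rack_temps.csv") with columns:
    # timestamp, sensor_id, temp_f.
    import csv
    from collections import defaultdict
    from statistics import mean

    readings = defaultdict(list)
    with open("rack_temps.csv", newline="") as f:
        for row in csv.DictReader(f):
            readings[row["sensor_id"]].append(float(row["temp_f"]))

    for sensor, temps in sorted(readings.items()):
        avg, peak = mean(temps), max(temps)
        flag = "  <-- check for hot spot" if peak > 80.0 else ""  # illustrative threshold only
        print(f"{sensor}: avg {avg:.1f} F, peak {peak:.1f} F{flag}")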

Implement cold-aisle or hot-aisle containment: Once a data center has been organized around hot aisles and cold aisles, dramatically improved separation of cold supply air and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will have the single largest payback of any of these energy efficiency best practices.

Raise the temperature in the data center: Many data centers are run colder than an efficient standard. ASHRAE (the American Society of Heating, Refrigerating, and Air-Conditioning Engineers) has increased the top end of allowable supply-side air temperatures from 77 to 80 degrees Fahrenheit. Not all data centers should be run at the top end of this temperature range, but a step-by-step increase, even to the 75 to 76 F range, would have a beneficial effect on data center electrical use.

Install variable-speed fans and pumps: Traditional CRAC and CRAH (computer room air handler) units contain fans that run at a single speed. Emerging best practices suggest that variable-speed fans be used whenever possible. A reduction of 10 percent in fan speed yields an approximately 27 percent reduction in the fan's electrical use, and a 20 percent reduction in speed yields electrical savings of approximately 49 percent.
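
Those percentages follow from the fan affinity laws, under which a fan's electrical power varies roughly with the cube of its speed. A minimal Python calculation reproduces the figures:

    # Fan affinity law: fan electrical power scales roughly with the cube of fan speed.
    def fan_power_savings(speed_reduction):
        """Fractional power saving for a given fractional reduction in fan speed."""
        return 1 - (1 - speed_reduction) ** 3

    for cut in (0.10, 0.20):
        print(f"{cut:.0%} slower fan -> roughly {fan_power_savings(cut):.0%} less power")
    # Prints ~27% for a 10% speed cut and ~49% for a 20% cut, matching the figures above.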

Exploit "free cooling": Free cooling is the general term for any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economization and water-side economization. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.

Design new data centers using modular cooling: Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centers. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more energy-efficient data center cooling strategy.

The entire report ($195) can be found on the Gartner Web site: "How to Save a Million Kilowatt Hours in Your Data Center."

 
 
 
 
Chris Preimesberger was named Editor-in-Chief of Features & Analysis at eWEEK in November 2011. Previously he served eWEEK as Senior Writer, covering a range of IT sectors that include data center systems, cloud computing, storage, virtualization, green IT, e-discovery and IT governance. His blog, Storage Station, is considered a go-to information source. Chris won a national Folio Award for magazine writing in November 2011 for a cover story on Salesforce.com and CEO-founder Marc Benioff, and he has served as a judge for the SIIA Codie Awards since 2005. In previous IT journalism, Chris was a founding editor of both IT Manager's Journal and DevX.com and was managing editor of Software Development magazine. His diverse resume also includes: sportswriter for the Los Angeles Daily News, covering NCAA and NBA basketball, television critic for the Palo Alto Times Tribune, and Sports Information Director at Stanford University. He has served as a correspondent for The Associated Press, covering Stanford and NCAA tournament basketball, since 1983. He has covered a number of major events, including the 1984 Democratic National Convention, a Presidential press conference at the White House in 1993, the Emmy Awards (three times), two Rose Bowls, the Fiesta Bowl, several NCAA men's and women's basketball tournaments, a Formula One Grand Prix auto race, a heavyweight boxing championship bout (Ali vs. Spinks, 1978), and the 1985 Super Bowl. A 1975 graduate of Pepperdine University in Malibu, Calif., Chris has won more than a dozen regional and national awards for his work. He and his wife, Rebecca, have four children and reside in Redwood City, Calif. Follow on Twitter: editingwhiz
 
 
 
 
 
 
 
