Here is Gartner’s list of green IT best practices for data center managers:
Plug holes in the raised floor: Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10 percent of the energy used for data center cooling.
Install blanking panels: Any unused position in a rack should be covered with a blanking panel, which manages airflow by preventing the hot air leaving one piece of equipment from recirculating into the cold-air intake of other equipment in the same rack. When panels are used effectively, supply air temperatures can be lowered by as much as 22 degrees Fahrenheit, greatly reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data center.
Coordinate CRAC units: Older CRAC (computer room air-conditioning) units cool and dehumidify the air independently of one another, so one unit can end up humidifying while a neighboring unit dehumidifies. These units should be tied together with newer control technology so that their efforts are coordinated, or humidification duties should be removed from them altogether and handled by a single dedicated unit.
Improve underfloor airflow: Older data centers typically have constrained space underneath the raised floor that is used not only for distributing cold air but also as a path for data and power cables. Many old data centers have accumulated such a tangle of cables that airflow is restricted; the underfloor should be cleaned out to improve it.
Implement hot aisles and cold aisles: In traditional data centers, racks were set up in what is sometimes referred to as “classroom style,” with all intakes facing a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, raising the cold-air-supply temperature in uneven and sometimes unpredictable ways. Rack layout practices adopted over the past 10 years show that organizing rows into alternating hot aisles and cold aisles gives far better control over airflow in the data center.
Install sensors: A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be manually collected and transferred into a spreadsheet, where it can be further analyzed. Even this minimal investment in instrumentation can provide great insight into the dynamics of possible data center temperature problems and can provide a method for analyzing the results of improvements made to data center cooling.
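Even spreadsheet-level analysis of this kind can be sketched in a few lines. The snippet below is a minimal illustration, not part of the Gartner report: it assumes readings have been collected as rows with hypothetical `sensor_id` and `temp_f` fields (the shape `csv.DictReader` would produce) and summarizes average and peak temperature per sensor location to flag possible hot spots.

```python
from statistics import mean

def summarize_readings(rows):
    """Group manually collected readings by sensor and report avg/peak temps."""
    by_sensor = {}
    for row in rows:
        by_sensor.setdefault(row["sensor_id"], []).append(float(row["temp_f"]))
    return {
        sensor: {"avg_f": round(mean(temps), 1), "max_f": max(temps)}
        for sensor, temps in by_sensor.items()
    }

# Illustrative readings; field names and values are invented for the sketch.
readings = [
    {"sensor_id": "rack-A1", "temp_f": "71.2"},
    {"sensor_id": "rack-A1", "temp_f": "78.8"},
    {"sensor_id": "rack-B4", "temp_f": "88.4"},  # possible hot spot
]
print(summarize_readings(readings))
```

A persistently high `max_f` for one sensor location is the kind of signal that would justify the airflow fixes described above.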
Implement cold-aisle or hot-aisle containment: Once a data center has been organized around hot aisles and cold aisles, dramatically improved separation of cold supply air and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will have the single largest payback of any of these energy efficiency best practices.
Raise the temperature in the data center: Many data centers are run colder than efficiency requires. ASHRAE (the American Society of Heating, Refrigerating, and Air-Conditioning Engineers) has raised the top end of its allowable supply-side air temperature range from 77 to 80 degrees Fahrenheit. Not every data center should be run at the top of this range, but a step-by-step increase, even to the 75 to 76 F range, would measurably reduce data center electricity use.
Install variable-speed fans and pumps: Traditional CRAC and CRAH (computer room air handler) units contain fans that run at a single speed. Emerging best practice is to use variable-speed fans wherever possible, because fan power varies roughly with the cube of fan speed: a 10 percent reduction in fan speed yields an approximately 27 percent reduction in the fan’s electrical use, and a 20 percent reduction in speed yields electrical savings of approximately 49 percent.
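The savings figures above follow directly from the cube law. A quick arithmetic check (the helper function here is illustrative, not from the report):

```python
def fan_power_savings(speed_reduction):
    """Fractional electrical savings, assuming fan power ~ speed cubed."""
    remaining_speed = 1.0 - speed_reduction
    return 1.0 - remaining_speed ** 3

print(round(fan_power_savings(0.10) * 100))  # 10% slower -> 27% savings
print(round(fan_power_savings(0.20) * 100))  # 20% slower -> 49% savings
```

The nonlinearity is the whole point: modest speed reductions during periods of low IT load buy disproportionately large electrical savings.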
Exploit “free cooling”: Free cooling is the general term for any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economization and water-side economization. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.
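The available hours can be estimated from hourly outdoor temperature data for a site. The sketch below is a simplified illustration only: the 65 F dry-bulb threshold is an invented assumption, not a Gartner figure, and real air-side economizer limits also depend on humidity.

```python
import math

def free_cooling_hours(hourly_temps_f, threshold_f=65.0):
    """Count hours cool enough for air-side economization (toy model)."""
    return sum(1 for t in hourly_temps_f if t <= threshold_f)

# Toy year: 8,760 hourly temperatures from a crude sinusoidal climate model.
temps = [55 + 25 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]
print(free_cooling_hours(temps))
```

Running a model like this against real weather data for candidate sites is one way to compare how many of the "100 to more than 8,000" free-cooling hours a given climate could actually deliver.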
Design new data centers using modular cooling: Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centers. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more energy-efficient data center cooling strategy.
The entire report ($195) can be found on the Gartner Web site: “How to Save a Million Kilowatt Hours in Your Data Center.”