How to Keep Things Cool in Your Data Center: 11 Tips for Summer
Tip #5: If possible, try to evenly distribute the heat load across every rack to avoid "hot spots." Remember to check the temperature in the racks at the top, middle and bottom before you move any servers, then relocate the warmest servers to a cooler area and use blanking panels to fill any gaps.

Tip #6: If you have an overhead ducted system, make sure that the cool air outlets are over the front of the racks and the return ducts are over the rear of the racks. I have seen sites where the ceiling vents and returns are poorly located; the room is very hot, yet the capacity of the cooling system has not been exceeded. Some basic ductwork may help to resolve this problem.

Tip #7: Consider adding temporary "roll-in" type cooling units only if you can exhaust the heat into an external area. Running the exhaust ducts into a ceiling plenum that returns to the CRAC does not work; the roll-in unit's hot exhaust must be ducted into an area outside of the controlled space.

Tip #8: When the room is not occupied, turn off the lights. This can save 1 to 3 percent of the electrical and heat load, which, in a marginal cooling situation, may lower the temperature 1 to 2 degrees.

Tip #9: Check whether any equipment is still plugged in and powered up but is no longer in production. This is a fairly common occurrence and has an easy fix: Just shut it off!

Tip #10: If you still see 80 degrees Fahrenheit in the cold aisle, don't panic! While this is hotter than the proverbial 68-70 degree Fahrenheit data center "standard" (and you may not enjoy working in the room), it may not be as bad for the servers as you think. If the highest reading is below 85 degrees Fahrenheit under the worst conditions, your servers are relatively safe. Most modern servers have a higher operating temperature rating, so they can safely operate above 80 degrees Fahrenheit and even up to 90 degrees Fahrenheit. While not ideal, this is within most manufacturers' published specifications; check with your server and other equipment vendors. Many servers have internal temperature monitors that are accessible via the management software (a simple monitoring sketch follows these tips). Even ASHRAE has begun to change its recommendations (TC 9.9) to allow equipment operation at higher temperatures and to improve energy efficiency.

Tip #11: If all else fails, have a fall-back plan to shut down the least critical systems so that the more critical servers can remain operational, and locate those critical systems in the coolest area. A planned shutdown is better than having random systems unexpectedly shutting down from overheating.
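To put Tip #10 into practice, the short script below is one way to read a server's on-board temperature sensors and flag a host that is running hot. It is only a rough sketch: rather than going through the vendor's management software, it reads the sensors that a Linux host exposes through the standard /sys/class/hwmon interface, and the 85-degree alarm point is simply the worst-case figure discussed above, not any vendor's specification.

#!/usr/bin/env python3
"""Report the warmest on-board temperature sensor on a Linux server.

A rough illustration only: it assumes the host exposes sensors through
the standard /sys/class/hwmon interface, and the 85 F alarm point is the
worst-case figure from Tip #10, not a vendor specification. Always check
your own manufacturer's published limits.
"""
import glob
import sys

ALARM_F = 85.0  # worst-case ceiling discussed in Tip #10


def read_sensors():
    """Yield (label, degrees_f) for every readable hwmon temperature input."""
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
        try:
            with open(path) as f:
                celsius = int(f.read().strip()) / 1000.0  # values are millidegrees C
        except (OSError, ValueError):
            continue  # sensor not readable; skip it
        try:
            with open(path.replace("_input", "_label")) as f:
                label = f.read().strip()
        except OSError:
            label = path  # no label file; fall back to the sensor path
        yield label, celsius * 9 / 5 + 32


def main():
    readings = sorted(read_sensors(), key=lambda r: r[1], reverse=True)
    if not readings:
        print("no hwmon temperature sensors found")
        return 2
    for label, deg_f in readings:
        print(f"{label:20s} {deg_f:5.1f} F")
    hottest = readings[0][1]
    if hottest > ALARM_F:
        print(f"WARNING: hottest sensor is {hottest:.1f} F -- consider shedding load (Tip #11)")
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())

Run across the fleet, the same check can also feed the fall-back plan in Tip #11 by showing which hosts are running hottest and which loads should be shed first.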
While there is no true quick fix when your heat load genuinely exceeds your cooling system's capacity, sometimes just improving the airflow may increase the overall efficiency 5 to 20 percent. This may get you through the hottest days until you can upgrade your cooling systems. In any event, it will lower your energy cost, which is always a good thing.

Julius Neudorfer is the Director of Network Services and a founder of North American Access Technologies, Inc. Since 1987, Julius has been involved in designing data and voice networks and data center infrastructure. He personally holds a patent for a network-based facsimile PBX system. Julius is also the primary designer of the NAAT Mobile Emergency Data Center. Over the last 20 years, Julius has designed and overseen the implementation of many advanced integrated network solutions for clients. He can be reached at email@example.com.