As the dog days of summer are upon us, many small and midsize firms see their data center’s cooling systems reaching, and sometimes exceeding, their limits. This is especially true for rooms that are not served by large, dedicated chilled-water systems with extra capacity. Many IT departments are sweating out the summer, hoping that servers will not suddenly crash from over-temperature shutdowns.
In many cases, when the heat load of the equipment does not drastically exceed the capacity of the cooling system, optimizing the airflow can improve the situation until a new or additional cooling system is installed. A number of other things can also help. Here are 11 tips, tricks and techniques that may not solve the long-term problem, but may help enough to get you through the summer.
Tip #1: Take temperature measurements at the front of the servers. This is where the servers draw in cool air, so it is the only measurement that really matters. Take readings at the top, middle and bottom of the front of the racks (assuming you have a hot aisle/cold aisle layout). If the bottom areas of the racks are cooler, and you have the space, try to move servers to the coolest areas.
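If your servers have baseboard management controllers, you can supplement handheld readings with each machine’s own inlet-temperature sensor. The following is a minimal sketch that polls those sensors with the open-source ipmitool utility; the hostnames, credentials and “inlet” label match are assumptions to adapt, since vendors name their sensors differently.

# Sketch: poll server inlet temperatures over IPMI. Assumes ipmitool is
# installed and each BMC is reachable; hostnames and credentials below are
# placeholders, and the sensor name varies by vendor.
import subprocess

HOSTS = ["rack1-top", "rack1-mid", "rack1-bot"]  # hypothetical BMC hostnames

def inlet_temp(host, user="admin", password="changeme"):
    """Return the inlet temperature (degrees C) reported by the host's BMC."""
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
         "sdr", "type", "Temperature"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        # A typical line: "Inlet Temp | 04h | ok | 7.1 | 24 degrees C"
        if "inlet" in line.lower():
            return float(line.split("|")[-1].split()[0])
    return None  # no inlet sensor found under that name

for host in HOSTS:
    print(host, inlet_temp(host))

Logging these readings over a hot afternoon will quickly show which racks deserve attention first.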
Tip #2: Make sure that you block off any open, unused space in the front of the racks by using blanking panels. This will prevent hot air from the rear recirculating into the front of the racks.
Tip #3: If you have a raised floor, make sure that the floor grates (or perforated tiles) are located in front of the hottest racks. If necessary, rearrange them or change to different floor grates so the airflow matches the heat load. Be careful not to locate floor grates too close to the computer room air conditioners (CRACs). This will “short circuit” the cool airflow immediately back into the CRACs and rob the rest of the room or row of sufficient cool air.
Tip #4: Check for floor openings inside the cabinets. Cable openings in the floor let air escape the raised-floor plenum where it is not needed, reducing the cold air available at the vents in the cold aisles. Use brush-type air-containment collar kits (floor grommets) to minimize this problem.
Tip #5: If possible, try to evenly distribute the heat loads across every rack to avoid “hot spots.” Remember to check the temperature in the racks at the top, middle and bottom before you move the servers, and relocate only the warmest servers to a cooler area. Then use blanking panels to fill any gaps.
Tip #6: If you have an overhead ducted system, make sure that the cool air outlets are over the front of the racks and the return ducts are over the rear of the racks. I have seen sites where the ceiling vents and returns are poorly located; the room is very hot, yet the capacity of the cooling system has not been exceeded. Some basic duct work may help to resolve this problem.
Tip #7: Consider adding temporary “roll-in” cooling units, but only if you can exhaust the heat into an external area. Running the exhaust ducts into a ceiling plenum that returns to the CRAC does not work; the heat exhaust of the roll-in unit must be ducted to an area outside of the controlled space.
Tip #8: When the room is not occupied, turn off the lights. This can save 1 to 3 percent of the electrical and heat load, which, in a marginal cooling situation, may lower the temperature 1 to 2 degrees Fahrenheit.
Tip #9: Check to see if there is any equipment that is still plugged in and powered up but is no longer in production. This is a fairly common occurrence and has an easy fix: Just shut it off!
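If you keep an asset inventory, a short script can surface likely offenders before you go hunting rack by rack. This sketch is illustrative only: the inventory.csv file, its hostname and status columns, and the Linux ping flags are all assumptions to adapt to your environment.

# Sketch: flag hosts marked "retired" in an inventory file that still answer
# pings -- likely candidates for Tip #9's easy fix. The CSV layout
# (hostname,status) is an assumed convention, not a standard.
import csv
import subprocess

with open("inventory.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects columns: hostname,status
        if row["status"].strip().lower() != "retired":
            continue
        alive = subprocess.run(
            ["ping", "-c", "1", "-W", "1", row["hostname"]],  # Linux ping flags
            stdout=subprocess.DEVNULL,
        ).returncode == 0
        if alive:
            print(row["hostname"], "is marked retired but still responds")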
Tip #10: If you still see 80 degrees Fahrenheit in the cold aisle, don’t panic! Yes, while this is hotter than the proverbial 68-70 degrees Fahrenheit data center “standard” (and you may not enjoy working in the room), it may not be as bad for the servers as you think. If the highest reading is below 85 degrees Fahrenheit (under the worst conditions), your servers are relatively safe. Most modern servers have a higher operating temperature rating, so they can safely operate at over 80 degrees Fahrenheit and even up to 90 degrees Fahrenheit.
While not ideal, it is within most manufacturers’ published specifications. Check with your server and other equipment vendors. Many servers have internal temperature monitors that are accessible via the management software. Even ASHRAE has begun to change its recommendations (TC 9.9) to allow equipment operation at higher temperatures and to improve energy efficiency.
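As one illustration of those internal monitors, the sketch below lists a Linux server’s onboard temperature sensors using the third-party psutil library; vendor tools such as Dell’s iDRAC or HPE’s iLO expose similar readings through their own interfaces. Sensor labels vary by hardware, so compare only inlet or ambient sensors against room-temperature limits; CPU and core sensors normally run far hotter than the surrounding air.

# Sketch: list a Linux server's onboard temperature sensors with psutil
# (pip install psutil). Labels vary by hardware, so this simply reports
# everything and leaves the comparison against vendor limits up to you.
import psutil

for chip, readings in psutil.sensors_temperatures().items():
    for r in readings:
        label = r.label or "temp"
        high = f" (high threshold {r.high:.0f} C)" if r.high else ""
        print(f"{chip}/{label}: {r.current:.1f} C{high}")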
Tip #11: If all else fails, have a fall-back plan to shut down the least critical systems. This way, the more critical servers can remain operational. Locate the critical systems in the coolest area. This is better than having random systems unexpectedly shutting down from overheating.
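A fall-back plan works best when it is written down, or better, scripted in advance. Here is a rough sketch of the idea: when a trusted cold-aisle reading crosses a threshold, shut down the least critical hosts first. The hostnames, the 90-degree threshold and the get_cold_aisle_temp() helper are all placeholders to replace with your own sensors and priorities.

# Sketch of Tip #11: shed the least critical heat load first when the
# cold aisle gets too hot. Every name here is a placeholder to adapt.
import subprocess

SHUTDOWN_ORDER = ["test-box-1", "staging-db", "build-server"]  # least critical first
THRESHOLD_F = 90  # set this from your vendors' published limits

def get_cold_aisle_temp():
    """Placeholder: return the current cold-aisle temperature in degrees F
    from whatever sensor you trust (IPMI inlet temp, a networked probe, etc.)."""
    raise NotImplementedError

def shed_load():
    for host in SHUTDOWN_ORDER:
        if get_cold_aisle_temp() < THRESHOLD_F:
            return  # temperature recovered; stop shedding load
        print("Shutting down", host, "to reduce heat load")
        subprocess.run(["ssh", host, "sudo", "shutdown", "-h", "now"])

Shutting systems down in a chosen order beats letting thermal protection pick the order for you.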
While there is no true quick fix when your heat load far exceeds your cooling system’s capacity, sometimes just improving the airflow can increase the overall efficiency 5 to 20 percent. That may get you through the hottest days until you can upgrade your cooling systems. In any event, it will lower your energy cost, which is always a good thing.