By now you’ve heard so much about green IT that you’re likely tired of it. You already know that your operation can’t afford to replace its data center. In fact, your budget can’t support anything new that doesn’t have an immediate payoff. On the other hand, a dramatic cut in energy costs would help your budget.
But what about all those interesting green IT solutions you keep hearing about? You know that you can buy servers that are much more efficient than the machines you’ve had in your data center for the last couple of years. You also know you can spend money on licenses so you can virtualize your environment and make better use of your servers. But even though all that gear will pay dividends in energy savings down the road, you don’t have the money to spend on it.
Fortunately, there’s a lot you can do to cut your energy consumption for maximum impact and with minimal attention and effort. Even better, some of the most effective areas of energy savings don’t involve expensive new servers and upgraded cooling.
“The No. 1 thing to look at is the cooling side,” said Kevin Brown, vice president of global data center solutions for Schneider Electric. “Look at your airflow patterns and look at whether you’re getting efficient distribution of air. Look under the raised floor.”
Brown said that airflow blockages and inefficiencies are perhaps the biggest waste of energy in data centers. In data centers with raised floors, this can mean problems like missing tiles or wrongly placed air vents. But it can also mean that there’s junk under the floor that’s keeping the cooling system from working properly. To wit: “I found a Christmas tree under a data center floor,” Brown said.
A major goal, added Brown, is to eliminate as much cooling as possible. “If they do a really good job cleaning up their airflow, they might be able to start turning off their CRACs [computer room air conditioners]. That’s the low-hanging fruit,” he said.
Making sure your AC is fully loaded is one way to save significant energy in the data center, said Daniel Golding, vice president and research director of Tier 1 Research. “If you have two running at 50 percent, it’s better to have one at 100 percent,” he said.
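Golding’s point about fully loading your AC comes down to fixed overhead: fans and compressors draw power even at low load, so two half-loaded units waste more than one fully loaded one. The sketch below illustrates the idea with a made-up part-load curve; real CRAC units publish their own efficiency curves, and the 30 percent fixed / 70 percent proportional split here is purely an assumption.

```python
# Illustrative comparison of two CRAC units at half load vs. one at full
# load, delivering the same total cooling. The part-load power model is an
# assumed placeholder, not vendor data.

def power_draw_kw(rated_kw, load_fraction):
    # Assumed shape: 30% of rated power is fixed overhead (fans, controls),
    # the remaining 70% scales with the cooling load.
    return rated_kw * (0.3 + 0.7 * load_fraction)

two_units = 2 * power_draw_kw(50, 0.5)   # two 50 kW units, each at 50%
one_unit = power_draw_kw(50, 1.0)        # one 50 kW unit at 100%
print(two_units, one_unit)  # roughly 65 kW vs. 50 kW for the same cooling
```

Under this assumed curve, consolidating onto one unit saves the second unit’s fixed overhead, which is exactly the saving Golding describes.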
Golding added that most data centers are overcooled. “Simply raise the temperature of your data center,” he said. “The idea that you need 60-degree data centers is completely inaccurate. The idea that you need it below the level of human comfort is inaccurate; 80 to 85 degrees Fahrenheit is fine.”
While the idea of a warm data center may be anathema to some IT managers, the fact is that modern equipment is designed to work in an environment with input air temperatures as high as 90 degrees Fahrenheit.
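A rough way to size the opportunity: a commonly cited rule of thumb is that each degree Fahrenheit you raise the cooling setpoint trims a few percent off cooling energy. The figure below uses 4 percent per degree as an illustrative assumption; actual savings depend on your chiller plant, climate and airflow, so treat this as a back-of-envelope sketch, not a prediction.

```python
# Back-of-envelope estimate of cooling energy saved by raising the setpoint.
# The 4%-per-degree-F factor is an assumed rule of thumb, not a guarantee.

def cooling_savings_estimate(current_f, target_f, annual_cooling_kwh,
                             pct_per_degree=0.04):
    """Rough annual kWh saved by raising the supply-air temperature."""
    degrees_raised = max(0, target_f - current_f)
    fraction_saved = min(1.0, degrees_raised * pct_per_degree)
    return annual_cooling_kwh * fraction_saved

# Example: raising a 65 F room to 78 F, with 500,000 kWh/yr of cooling load.
saved = cooling_savings_estimate(65, 78, 500_000)
print(f"Estimated savings: {saved:,.0f} kWh/yr")  # 13 degrees at 4% each
```

Even if the true factor is half the assumed one, a double-digit percentage cut in cooling energy from a thermostat change is hard to beat on payback.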
Golding also suggested using outside air where possible so you don’t even have to run the AC. He pointed out that there’s no science to support the widely held belief that outside air, with its pollution and dirt, will damage computer equipment “unless maybe you’re in China or L.A.”
Lex Coors isn’t so sure he agrees with Golding about simply using outside air instead of air conditioning, but he agrees on a lot of other things. Coors, vice president of data center technology for Amsterdam-based Interxion, runs co-located data centers for companies around the world. He said that some of the most effective steps are also some of the easiest: “closing the air leaks in tiles, walls, doors, putting blanking plates in cabinets.”
Coors also said that a dramatic improvement can be gained by installing hot aisle and cold aisle containment: keeping hot and cold air from mixing, which would otherwise reduce cooling efficiency.
Consolidating Operations on Newer Servers
Of course, there are other things you can do. Brown suggests that you can probably save a lot of energy, and thus money, by consolidating your operations on your newer servers using a virtualized environment and simply shutting off your old servers.
Julius Neudorfer, CTO of North American Access Technologies, agrees, and says that a combination of picking the right servers for the work and the right power and cooling solution for those servers will yield major energy savings without having to replace much, if any, of your data center.
Rather than buying new equipment, Neudorfer suggests focusing on more efficient use of what’s already there. He said the greatest return is on the cooling side, and for that, a data center needs a strategy for containment and keeping hot and cold air from mixing.
Neudorfer said that products are available to allow containment to be retrofitted in existing data centers, including solutions from APC, Rittal and Knuerr. “You can do it rack by rack without any major interruption of service,” he said. Neudorfer noted that in many cases, retrofitting your data center can simply mean placing a transparent cover over the hot or cold aisles between racks.
Along with cover plates on the racks themselves, this will provide an easy, do-it-yourself containment solution. Neudorfer also suggests moving the servers to the bottom of the racks, since the coldest air is at the bottom.
Neudorfer and Coors both recommend finding ways to use what they call “free cooling.” This can mean using your existing ventilation system to bring in outside air, as Golding suggests, or using outside air to reject heat from your HVAC system rather than running a compressor, as most AC systems do. Either way, you avoid running the air conditioning compressor, perhaps the single biggest energy user in the data center.
But still, what about all those servers? Kosten Metreweli suggests making sure you actually need them before you even worry about moving them or virtualizing them. He said the biggest problem faced by many large data centers is “not knowing what’s in your data center.”
Metreweli, vice president of Tideway, said that when he audits a data center, it’s not unusual to find that data center inventories are wrong. “If we compare what we’ve found with what the customer has recorded, they’ve only documented what they have 50 to 75 percent of the time. They have servers sitting there that no one knows what they do.”
Metreweli said that it’s common to find servers that support applications or databases that are no longer in use, and that simply shutting these servers down will go unnoticed.
He said he also finds switches, routers and other network devices that are no longer being used but that are still running, still using energy and still requiring cooling.
Equally important, Metreweli noted, is that many data centers have relatively new servers that are badly underused and could support virtualization, while other, much older servers are still running and supporting relatively little work. He said that these older servers could be taken out of operation and their work shifted to newer platforms that still have plenty of capacity.
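The consolidation math Metreweli describes is straightforward to sketch. The figures below are hypothetical: server utilization numbers, relative capacities and the 60 percent headroom target are all illustrative assumptions, not measurements from any real audit.

```python
import math

# Hypothetical sizing sketch: how many underused newer hosts could absorb
# the load of older servers slated for retirement. All numbers are
# illustrative assumptions.

def hosts_needed(old_loads, new_host_capacity, target_utilization=0.6):
    """Number of newer hosts needed to carry the old servers' combined load.

    Loads and capacity are in the same arbitrary capacity units;
    target_utilization caps how full each new host is allowed to run.
    """
    total_load = sum(old_loads)
    usable_per_host = new_host_capacity * target_utilization
    return math.ceil(total_load / usable_per_host)

# Example: ten old servers, each about 10% busy on 1-unit boxes,
# consolidated onto newer 4-unit hosts kept below 60% utilization.
old = [0.10] * 10
print(hosts_needed(old, 4.0))  # a single newer host carries all ten
```

Under these assumed numbers, ten lightly loaded legacy machines collapse onto one modern host, and the other nine can be powered off along with their share of the cooling load.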
“We’re starting to see companies getting assessments of their data centers,” Brown noted. “They have to sell the CIO that there has to be some investment made in their data center.”
Brown noted that, in many cases, an accurate assessment, plus some simple moves such as improving airflow and cooling efficiency, will not only pay for themselves but will bring enough of a return that companies can take next steps, such as server consolidation and perhaps server upgrades.
Contributing Analyst Wayne Rash can be reached at wrash@eweek.com.