Dec 06 2010
Data Center

Keep Your Cool

Adopt these cooling technologies and techniques to boost energy efficiency in the data center.

For many IT managers, cooling the data center's IT equipment poses a formidable challenge. Common problems include delivering too little or too much air, air that is excessively dry or moist, air that is too cool or too warm, and conditions that vary significantly from one corner of the data center to another. To compensate, IT leaders often overprovision cooling, which wastes energy.

The Green Grid, a consortium dedicated to improving energy efficiency in data centers and computing environments, identified seven strategies that can optimize cooling:

1. Manage airflow. Make a carefully considered air management strategy the starting point of a data center energy savings program, because it can significantly reduce operating costs. Without management, air follows the natural dynamics set by the facility's physical layout and by the positioning and characteristics of its IT and cooling equipment. The result is hot and cold air mixing, and uncertainty about whether equipment deployments actually match the cooling capacity available at each rack.
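To see why mixing matters, a simple energy balance can estimate how much cold supply air bypasses the IT equipment entirely and returns unused. The Python sketch below is a minimal illustration, assuming uniform air mass flows and hypothetical sensor readings:

```python
def bypass_fraction(supply_temp, exhaust_temp, return_temp):
    """Estimate the fraction of cold supply air that bypasses IT
    equipment and mixes straight back into the return stream.

    Simplified energy balance (assumes uniform mass flow):
        return_temp = f * supply_temp + (1 - f) * exhaust_temp
    Solving for the bypass fraction f:
    """
    return (exhaust_temp - return_temp) / (exhaust_temp - supply_temp)

# Hypothetical readings in degrees Fahrenheit.
supply, exhaust, crac_return = 60.0, 95.0, 78.0
f = bypass_fraction(supply, exhaust, crac_return)
print(f"Roughly {f:.0%} of supply air never reaches the IT intake")
# -> Roughly 49% of supply air never reaches the IT intake
```

In a well-contained layout, the bypass fraction approaches zero and every cubic foot of air the fans move does useful work.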

2. Move cooling closer to the load. Locating cooling closer to IT equipment can reduce data center cooling costs by more than 30 percent compared with historical approaches. Fans in air conditioning units consume a significant portion of the energy used by most data center cooling systems. Mounting the cooling modules as close as possible to the source of heat – directly above, alongside or within high-density racks – reduces the distance the fans must move the air. This can provide energy savings of up to 70 percent.

This approach can give existing data centers the flexibility to support new and greater IT loads, especially if there's no upgrade path available. New data centers may take a hybrid approach, in which certain racks use high-density supplemental cooling (rack-specific or localized cooling), while others are supported by traditional room cooling (perimeter air cooling units with IT equipment on a raised floor).

3. Operate at a higher delta-T. The term “delta-T” refers to the difference in temperature between two measured points. It can describe either the heating of cool air as it passes through IT hardware or the cooling of hot air as it passes through cooling equipment.

Until recently, IT equipment typically operated with constant-speed fans to accommodate worst-case inlet temperatures. These systems were designed in an era when cooling and energy efficiency were lower priorities. Constant-speed fans are very inefficient for IT equipment whose inlet air temperatures sit in a lower, more typical range. Significant power savings can result from operating at a higher delta-T, because airflow can be throttled down when it is not needed.
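The airflow needed to carry a given heat load is inversely proportional to delta-T, which is where the fan savings come from. Here is a minimal Python sketch using the common sea-level approximation for air (CFM ≈ 3.16 × watts / delta-T in degrees Fahrenheit), with a hypothetical 10-kilowatt rack:

```python
def required_cfm(heat_load_watts, delta_t_f):
    """Airflow needed to carry a heat load at a given delta-T.

    Uses the common sea-level approximation for air:
        CFM = 3.16 * watts / delta-T (deg F)
    """
    return 3.16 * heat_load_watts / delta_t_f

load = 10_000  # hypothetical 10 kW rack
for dt in (15, 20, 25, 30):
    print(f"delta-T {dt:2d} F -> {required_cfm(load, dt):,.0f} CFM")
# Doubling delta-T halves the airflow the fans must move.
```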

4. Install economizers. Economizers eject waste heat outside of the data center, which reduces the need for refrigeration. There are two main types of economization: airside and waterside. In airside economization, hot return air is vented directly outside and cooler outside air is drawn into the air-handling system for conditioning, humidification and filtration. It is then delivered to the data center as cool supply air. Airside economization produces energy savings any time the outside air temperature is below the return air temperature.

In waterside economization, heat in the return air is transferred to a chilled water loop with traditional air-handling equipment. It is then transported outside via multiple heat exchange mechanisms. Waterside economization is generally limited to use in chilled water systems and is often deployed at a relatively large scale.
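The core airside decision is a straightforward comparison of outside air against return air. Below is a minimal control sketch; the temperatures and the two-degree dead band (to prevent rapid mode switching) are hypothetical, and a production control would also weigh humidity and filtration, not just dry-bulb temperature:

```python
def economizer_mode(outside_temp, return_temp, deadband=2.0):
    """Decide whether outside air can displace mechanical cooling.

    Returns 'economize' when outside air is cooler than return air
    by at least the dead band (hypothetical 2-degree margin);
    otherwise 'mechanical'.
    """
    if outside_temp <= return_temp - deadband:
        return "economize"
    return "mechanical"

return_air = 85.0  # deg F, hypothetical
for outside in (55.0, 82.0, 90.0):
    print(outside, "->", economizer_mode(outside, return_air))
# 55.0 -> economize, 82.0 -> economize, 90.0 -> mechanical
```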

5. Use higher-performing equipment. IT teams will encounter many choices in cooling technologies as they seek to retrofit their facilities. These choices affect not only capital and operational costs, but also the efficiency of the cooling system and the data center as a whole. Examples include variable-speed fan drives, high-efficiency motors, instrumentation and controls, humidifiers and more.

As density and the cost of energy continue to increase, total cost of ownership (TCO) and return on investment (ROI) calculations must be factored into the decision-making process. For example, a high-efficiency pump or fan motor may cost 25 percent more than a less efficient alternative, but its power savings will often offset the higher initial expense within a short time.
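A simple payback calculation makes that trade-off concrete. Every figure in the sketch below is hypothetical, chosen only to illustrate the 25 percent premium described above:

```python
def simple_payback_years(premium_dollars, kw_saved, hours_per_year=8760,
                         dollars_per_kwh=0.10):
    """Years for energy savings to recover the purchase premium.

    All inputs are hypothetical; data center fans and pumps
    typically run continuously (8,760 hours per year).
    """
    annual_savings = kw_saved * hours_per_year * dollars_per_kwh
    return premium_dollars / annual_savings

standard_cost = 2_000.0          # hypothetical standard motor price
premium = standard_cost * 0.25   # 25 percent more for high efficiency
kw_saved = 0.75                  # hypothetical continuous savings
print(f"Payback in {simple_payback_years(premium, kw_saved):.2f} years")
# -> Payback in 0.76 years
```

At these assumed numbers, the premium pays for itself in under a year; the same arithmetic scales up to chillers, pumps and air handlers.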

6. Use dynamic controls. Take advantage of dynamic controls that allow the power demands of cooling equipment to scale with the thermal load of the IT equipment. One way to achieve this is through airflow management, which varies fan speed on data center cooling equipment and can lead to significant energy savings. Even a modest reduction in airflow produces a disproportionate power reduction, because fan power varies roughly with the cube of fan speed. Another method is compressor mass flow management, in which incremental efficiency is gained by modulating compressor capacity to balance cooling output against the IT equipment's thermal load.
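The fan affinity laws explain that leverage: airflow scales linearly with fan speed, while fan power scales roughly with its cube. A minimal sketch of the relationship:

```python
def fan_power_fraction(speed_fraction):
    """Fan affinity laws: airflow ~ speed, power ~ speed cubed."""
    return speed_fraction ** 3

for speed in (1.0, 0.9, 0.8, 0.7):
    power = fan_power_fraction(speed)
    print(f"{speed:.0%} airflow -> {power:.0%} fan power "
          f"({1 - power:.0%} savings)")
# 90% airflow -> 73% fan power (27% savings)
# 80% airflow -> 51% fan power (49% savings)
```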

7. Maintain a higher operating temperature. Most data centers run at temperatures much lower than the IT equipment requires. Consider a higher operating temperature within ASHRAE's recommended guidelines for data center environmental conditions (the 2008 guidelines recommend an inlet temperature range of 18 to 27 degrees Celsius, or 64.4 to 80.6 degrees Fahrenheit) to boost efficiency.

Coupled with the energy-saving options of air cooling units (such as the economizers mentioned above), an increase in operating temperature can save a significant amount of energy; savings of 5 percent or more of cooling system energy are not uncommon.
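One way to see the coupling: a warmer room means warmer return air, which widens the window during which an airside economizer can run. The sketch below uses synthetic hourly temperatures (a hypothetical normal distribution standing in for real weather data) to count free-cooling hours at different return air temperatures:

```python
import random

random.seed(1)
# Hypothetical year of hourly outdoor dry-bulb temperatures (deg F),
# standing in for a real weather file.
outdoor = [random.gauss(60, 15) for _ in range(8760)]

def economizer_hours(temps, return_temp, deadband=2.0):
    """Hours per year when outside air is cool enough to use."""
    return sum(t <= return_temp - deadband for t in temps)

for return_temp in (75.0, 80.0, 85.0):  # warmer room -> warmer return
    hours = economizer_hours(outdoor, return_temp)
    print(f"return air {return_temp:.0f} F -> {hours:,} economizer hours")
# Each few degrees of operating temperature buys more free-cooling hours.
```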

It's vital that design choices for a new, expanded or retrofitted data center be made by the entire data center team, including facility owners, IT owners and those individuals responsible for capital and operational budgets.
