Keeping School Data Centers Cool
With a data center the size of a small classroom shoehorned into the Washingtonville Central School District’s only high school, it’s no wonder that cooling has been a problem for the district for years.
The data center, home to equipment that houses all data for the New York district’s five schools and 4,700 students, had its first makeover in decades a few years ago when it annexed an adjacent closet, giving it an extra 6 feet of space. This additional space made it possible to upgrade the data center’s struggling cooling system.
“Our data center is very small, and it only had two air conditioners to service it. One of them was an old-fashioned window air conditioning unit that failed or froze up frequently, and our systems were overheating a lot,” explains Justin Schaef, the school district’s director of data management and technology.
After briefly considering a raised-floor system, Schaef decided on a more innovative approach that would essentially create enclosed cooled spaces throughout the data center where needed. The new system, implemented in summer 2010, uses APC NetShelter SX 42U rack enclosures with APC’s InRow cooling and temperature sensors to maintain the correct temperature within the enclosures. The racks contain Cisco Systems UCS servers, virtualized with VMware. Because each rack is cooled independently, a problem in one rack won’t affect the cooling of other components in the data center.
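To make the per-rack approach concrete, here is a minimal Python sketch of what independent rack-level monitoring might look like. The rack names, temperature values and thresholds are illustrative assumptions, not readings from the district’s equipment or APC’s actual sensor API; the point is that an out-of-range reading flags a single enclosure rather than the whole room.

```python
# Hypothetical per-rack inlet temperatures in degrees C.
# These values are made up for illustration only.
rack_inlet_temps = {
    "rack-01": 22.5,
    "rack-02": 23.1,
    "rack-03": 29.4,   # simulated hot spot
    "rack-04": 21.8,
}

# Roughly the ASHRAE-recommended inlet range for IT equipment (about 18-27 C).
LOW_C, HIGH_C = 18.0, 27.0

for rack, temp in sorted(rack_inlet_temps.items()):
    status = "OK" if LOW_C <= temp <= HIGH_C else "ALERT"
    print(f"{rack}: {temp:.1f} C  [{status}]")
```

Because each enclosure is cooled on its own, an alert like the one on the hypothetical rack-03 calls for attention to that rack alone, which is exactly the isolation benefit Schaef describes.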
The difference is night and day, Schaef says.
“Before, we had constant overheating problems that were so bad that we replaced our metal door with a metal grate door just to let air in,” he says. “Now we can go back to a solid metal door that’s more secure, and we don’t have to worry about cooling issues.”
Many organizations are moving to hot aisle/cold aisle containment — available from manufacturers such as APC, HP, Tripp Lite and Black Box — to improve cooling efficiency and reduce costs.
“It’s pretty much a given for new build-outs because higher-density equipment is so common today,” says Jason Schafer, a research manager at Tier1 Research of Bethesda, Md. “Even five years ago, 1 to 2 kilowatts per rack was average, but now it can be 10kW per rack or more. That makes hot- and cold-aisle containment pretty important.”
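A bit of back-of-the-envelope arithmetic shows why those density figures matter. The sketch below uses the standard conversions of roughly 3,412 BTU per hour per kilowatt of IT load and 12,000 BTU per hour per ton of cooling; the rack counts are illustrative, not a sizing calculation for any real room.

```python
# Rough cooling-load arithmetic behind Schafer's density figures.
BTU_PER_KW = 3412    # heat produced per kW of IT load, BTU/hr
BTU_PER_TON = 12000  # one ton of cooling capacity, BTU/hr

def cooling_tons(kw_per_rack: float, racks: int = 1) -> float:
    """Approximate tons of cooling needed to remove a given IT load."""
    return kw_per_rack * racks * BTU_PER_KW / BTU_PER_TON

print(f"2 kW rack : {cooling_tons(2):.1f} tons each")
print(f"10 kW rack: {cooling_tons(10):.1f} tons each")
print(f"Row of ten 10 kW racks: {cooling_tons(10, racks=10):.0f} tons")
```

A 2kW rack needs only about half a ton of cooling, while a 10kW rack needs nearly three tons, and a single row of ten such racks approaches 30 tons, which is why uncontained airflow quickly stops keeping up.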
Retrofitting an existing data center with hot- or cold-aisle containment is more difficult, but in many cases it’s worth the effort, Schafer says.
That’s one of the routes Steven Clagg is considering. Clagg, CIO of Aurora Public Schools in Colorado, has made retrofitting the district’s data center one of his priorities since taking the job in 2010.
“When you walk through the data center, it’s pretty obvious that it isn’t efficient or cool enough. There are a lot of hot spots, and in some places, it’s actually warmer than it is outside,” he says.
Part of the problem is the way the data center developed over time. The rectangular room has two air conditioning units at one end and no cooling at the other. The server racks are also arranged inefficiently, with hot exhaust air blowing into cold-air areas.
Clagg put a few enterprising student interns to work studying the problem, and they have come up with some viable solutions. The least expensive option is to rearrange the aisles and create a heat plenum.
Creating hot-aisle containment, cold-aisle containment or both is another option. Although it would cost more up front, it may make economic sense because of the cooling-efficiency gains it provides.
“We’re considering all options at this point, and we hope to get it worked out by next year,” Clagg says.
Hot- Versus Cold-Aisle Containment
Without some type of containment system, heat will inevitably leak into the environment, reducing the efficiency of data center cooling efforts. The idea of containment, whether hot or cold, is to keep hot exhaust air from recirculating into equipment intakes before it has been cooled, explains Jason Schafer, research manager at Tier1 Research.
With hot-aisle containment, racks are arranged in rows with the backs of the servers facing each other. Server exhaust empties into the enclosed hot aisle and is routed back to the computer room air conditioner (CRAC) unit, so the hot air is captured and prevented from entering the rest of the data center.
In a cold-aisle containment configuration, racks are arranged front to front, and the area between the rows is contained. That way, supply and return air are fully separated. Cold air reaches the cold aisles through a raised floor, and the hot air exhausted from the servers is routed from the racks back into the CRAC unit.
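The payoff of keeping supply and return air separated can be seen with a rough sensible-heat sketch. The 10kW rack load and the supply/return temperature differences below are illustrative assumptions; the constants are the standard density and specific heat of air.

```python
# How much airflow it takes to carry away a rack's heat at a given delta-T.
# Sensible-heat relation for air: Q [kW] ~= density * specific_heat * flow * delta_T
AIR_DENSITY = 1.2          # kg/m^3, near sea level
AIR_SPECIFIC_HEAT = 1.005  # kJ/(kg*K)

def airflow_m3s(load_kw: float, delta_t_c: float) -> float:
    """Airflow (m^3/s) needed to remove load_kw of heat at a given delta-T."""
    return load_kw / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)

# A 10 kW rack: mixed, uncontained air leaves only a small usable delta-T,
# while contained aisles preserve a larger one because supply and return never mix.
for dt in (8, 12, 20):
    print(f"delta-T {dt:2d} C -> {airflow_m3s(10, dt):.2f} m^3/s per 10 kW rack")
```

The larger the temperature difference the cooling units see, the less air they have to move per kilowatt of load, which is the efficiency gain containment is designed to protect.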
Each approach has its pros and cons, but combining them provides the best of both worlds, Schafer says.