Jul 10 2013

Hot Topic: Cooling Off the Data Center

Major universities are deploying innovative technologies to use space more efficiently and reduce energy consumption in data centers.

Several colleges and universities are leading the way when it comes to building energy-efficient data centers that are cutting energy bills while meeting ever-increasing user and data demands.

Three institutions in particular, Syracuse University, the University of Rochester and the University of Colorado Boulder, are leveraging innovations in cooling — typically a huge consumer of energy for data centers — while optimizing the use of space.

Orange You Glad?

Syracuse University (SU) can tout a $12 million project that includes on-site power generation, a novel approach to cooling and a significant reduction in computer hardware.

The university installed 12 natural-gas microturbines in its 12,000-square-foot data center to generate power on site. Exhaust gas from those microturbines runs through an absorption chiller that supplies chilled water used to cool the equipment.

At the same time, the university cools its data center with heat exchangers that effectively replace the back door of each server rack. The heat exchangers draw air in at the front of the rack, absorb the heat the servers generate and send cooled air back into the room. The approach obviates the need for room-cooling units and makes it easier to configure racks at high density.

“To our knowledge, we were the first of our kind to integrate microturbines and heat exchangers all in one place,” says Chris Sedore, SU’s chief information officer.

Taking energy efficiency further, SU took an aggressive stance on virtualizing its server infrastructure. The university started with more than 350 physical servers and pushed that number well below 100. Those remaining machines now support 1,200 virtual servers.

“The goodness doesn’t end with just electricity consumption,” Sedore says. “It also has a cascading effect through the chain, including the manufacture of servers,” as well as the floor space required for server infrastructure.
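Those figures imply a steep consolidation ratio. As a rough, back-of-the-envelope sketch (assuming about 100 remaining hosts, since the article says only "well below 100"):

    # Back-of-the-envelope consolidation math based on the SU figures above.
    # The post-consolidation host count is an assumption ("well below 100").
    physical_before = 350      # physical servers before virtualization
    physical_after = 100       # assumed upper bound on remaining hosts
    virtual_machines = 1200    # virtual servers running on those hosts

    reduction_pct = (physical_before - physical_after) / physical_before * 100
    vms_per_host = virtual_machines / physical_after

    print(f"Physical servers reduced by at least {reduction_pct:.0f}%")
    print(f"Roughly {vms_per_host:.0f} or more virtual servers per physical host")

At roughly that ratio, the cascading savings Sedore describes in power, hardware purchases and floor space follow from running about a tenth as many physical machines.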

Consolidating servers to preserve floor space is finding broad appeal, says Mark Bowker, senior analyst at Enterprise Strategy Group (ESG) in Milford, Mass.

“Inside just about every data center I walk into now, virtualization has had a significant impact on floor space,” he says.

Cooling the ROC

Just west of Syracuse, on the New York State Thruway, the University of Rochester is also cooling at the server rack, using technology that enables roughly a 30 percent increase in server density within the data center’s server cabinets, says CIO David Lewis.

“You can only put so many servers in a rack if you’re doing traditional cooling,” Lewis says. “A lot of issues with data centers arise because they’re out of floor space.”

Lewis estimates the university is achieving 98 percent cooling efficiency through the technology, called OptiCool, and that it will save more than $750,000 in the first five years, assuming flat energy costs.

Dry As a . . . Buffalo?

90%

Percentage of organizations managing 2,000 or more servers that also measure power usage effectiveness (PUE); nearly half of survey respondents (45%) measure PUE at the most detailed level.

SOURCE: “Data Center Industry Survey,” Uptime Institute (2012)
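PUE, the metric cited above, is simply the ratio of total facility energy to the energy delivered to IT equipment, with 1.0 as the ideal. A minimal sketch of the calculation, using purely hypothetical meter readings:

    # Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
    # The meter readings below are hypothetical, for illustration only.
    total_facility_kwh = 1500.0   # everything: IT load, cooling, lighting, losses
    it_equipment_kwh = 1000.0     # servers, storage and network gear only

    pue = total_facility_kwh / it_equipment_kwh
    print(f"PUE = {pue:.2f}")     # 1.50: half a watt of overhead per watt of IT load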

At the University of Colorado Boulder, the Green Data Center Team at the National Snow and Ice Data Center (NSIDC) deploys technology known as indirect evaporative cooling, which is as much as 90 percent more efficient than traditional compressor-driven air-conditioning systems, says Ron Weaver, a manager at NSIDC.

Indirect evaporative cooling chills the hot air returning from the data center and exhausts the more humid air produced by evaporation as a waste product, so the air flowing back into the computer room is both cool and dry, as the hardware requires. The more commonly used direct evaporative cooling, by contrast, raises the relative humidity of the supply air.

With indirect cooling, “the big upside is you get a cooled airstream below ambient temperature, where the relative humidity isn’t high and is in the serviceable range for the equipment,” Weaver says.

Because of Colorado’s generally cool climate and cold winters, the data center is able to use outside air to help in the cooling process and reduce consumption — an increasingly common practice in commercial data centers. ESG’s Bowker recently toured a Microsoft Azure data center that used the same approach. The reduction in energy consumption at the data center is dramatic: Summer operations previously averaged 32 kWh per day and now average 8 kWh. In the winter, consumption has been reduced from 35 kWh to 1 kWh.
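Using the daily figures Weaver reports, the percentage reductions work out as follows (a quick arithmetic sketch, nothing more):

    # Percentage reduction in daily cooling-related energy use at NSIDC,
    # computed from the kWh-per-day figures quoted above.
    def pct_reduction(before_kwh, after_kwh):
        return (before_kwh - after_kwh) / before_kwh * 100

    print(f"Summer: {pct_reduction(32, 8):.0f}% reduction")   # 75%
    print(f"Winter: {pct_reduction(35, 1):.0f}% reduction")   # about 97%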

“You can see why we’re pretty excited,” Weaver says.

The system, funded by grants, will pay for itself within five to six years through reduced costs, Weaver estimates: “That’s pretty darn good by our standards.”

