Like most organizations, colleges run their IT departments on limited budgets and in limited data center space, so it simply makes sense for them to rethink the way they organize IT equipment and manage power and cooling.
Such was the case at the University of Maryland’s A. James Clark School of Engineering in College Park, Md., whose 25-year-old data center had taken on a number of resource-intensive applications over the years, including media streaming, advanced lifecycle engineering and mobile web applications.
Jim Zahniser, the school’s executive director of engineering IT, says the challenge for his staff was that, as new needs developed, there was no master plan for laying out the 400-square-foot data center.
“It just evolved over time,” he says. “When we needed to put up a new rack, we found a space and put up a new rack.”
The result was that by 2009, the engineering school's needs had outgrown the compute power of its servers, and the data center had run out of space to expand, making it difficult to accommodate any new services.
Recognizing that there was no money to build a new data center, the engineering school brought in Emerson Network Power to run a full data center assessment. Emerson installed sensors to study the data center and then delivered detailed models that showed precisely where the hot and cold air flowed in the room.
Following the study, Zahniser and his team started by reinstalling the data center’s servers in a single row of high-density server racks in a hot aisle/cold aisle configuration (running a cold aisle in front of the racks and a hot aisle behind the servers).
This new configuration is supported by Emerson’s Liebert CRV row-based precision cooling system, backed by Liebert iCOM controls. The Liebert CRV uses energy-efficient variable-speed fans and compressors. By delivering row-based cooling closer to the heat load, overall efficiency improves. The iCOM controls let IT managers monitor the data center environment, including individual rack temperatures, via a web interface.
“The CRV delivers much more efficient cooling,” Zahniser says. “It also lets us build out cooling capacity as we need to add servers and applications.”
The benefits have been dramatic. Zahniser says the combination of the Liebert CRV with the reconfigured server racks has improved the data center power usage effectiveness (PUE) by 5.5 percent. This translates to roughly a $5,000 annual savings on the data center’s electric bill, or potentially $100,000 over the 20-year lifecycle of the Liebert gear.
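As a quick sanity check of the arithmetic above, here is a minimal sketch. Only the dollar figures and the 20-year lifecycle come from the article; the PUE function simply states the standard definition (total facility power divided by IT equipment power), and the example power values fed to it are hypothetical.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A lower PUE means less overhead (cooling, power distribution)
    per watt delivered to the IT load; 1.0 is the theoretical ideal.
    """
    return total_facility_kw / it_equipment_kw


# Hypothetical facility drawing 150 kW total for a 100 kW IT load:
print(pue(150.0, 100.0))  # -> 1.5

# Savings figures from the article:
annual_savings = 5_000    # dollars per year on the electric bill
lifecycle_years = 20      # stated lifecycle of the Liebert gear
print(annual_savings * lifecycle_years)  # -> 100000
```

The 5.5 percent PUE improvement is what drives the annual savings; the lifetime figure is just that annual number projected over the equipment's 20-year lifecycle.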
Technology Officer Omar Siddique points out that the Liebert CRV is running at about 40 percent capacity, which means the college could easily double the number of machines it runs without doing anything else from a cooling perspective.
“We’ve gone from squeezing things into corners and spots in the existing racks to having the ability to build out the entire additional equivalent of our infrastructure again,” Siddique concludes.
“It is imperative that IT managers properly provision base power and cooling infrastructure for future growth — failing to do so leads to expensive fixes down the road.”
— Bill Courtright, Executive Director, Parallel Data Laboratory, Carnegie Mellon University, Pittsburgh
“Power and cooling has become a very important consideration for researchers. At the University of Florida, we are using this as an extra argument to convince researchers to join in our collocation effort to make support for research computing and teaching and training of high-performance computing a shared and centralized activity.”
— Erik Deumens, Director of Research Computing, University of Florida, Gainesville, Fla.
“Implementing practices which enhance air flow within the data center, keeping our hot and cold aisles intact, and improving the cooling efficiency of our computer room air conditioning units is a major focus for us in reducing our consumption of electric power.”
— John H. Kettlewell Jr., Director of Technology Support Services, George Mason University, Fairfax, Va.