College Saves Big by Upgrading to Modern Data Center
Talk about a makeover. A few years ago, the School of Visual Arts’ old-fashioned data center had no virtualized servers or centralized storage, and its air conditioners ran around the clock.
“We were constantly addressing issues with power and cooling,” says CIO Cosmin Tomescu of the New York City school. “There were battery failures, fluctuations in power and servers that would overheat — we never knew what was going to happen next.”
Tomescu convinced the administration that the best way to deliver modern classrooms and media spaces where students and faculty could collaborate was by completely overhauling the network and data center. The IT department, which supports 17 buildings on the east and west sides of Manhattan, moved its data center to a new 20-foot-by-20-foot facility on the eighth floor of a building in Chelsea.
It was an enormous undertaking, Tomescu says. For starters, the 5,500-pound Emerson Network Power Liebert uninterruptible power supply battery cabinets exceeded the floor’s 100-pound-per-square-foot load rating, so contractors had to build a foot-deep, steel-reinforced concrete slab to support the new data center gear.
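The structural problem is easy to sanity-check with back-of-the-envelope arithmetic. In the rough sketch below, the cabinet weight and floor rating come from the article, while the cabinet footprint is an assumed figure for illustration only:

```python
# Rough floor-loading check: why a 5,500-lb battery cabinet can overwhelm a
# floor rated for 100 lb per square foot. The footprint is an assumption,
# not a figure reported by the school.

CABINET_WEIGHT_LB = 5_500      # Liebert UPS battery cabinet weight (from the article)
FOOTPRINT_SQFT = 3.0 * 2.5     # assumed footprint: roughly 3 ft x 2.5 ft
FLOOR_RATING_PSF = 100         # floor's rated uniform live load, lb per sq ft

load_psf = CABINET_WEIGHT_LB / FOOTPRINT_SQFT
print(f"Cabinet loading: about {load_psf:.0f} lb/sq ft "
      f"vs. a {FLOOR_RATING_PSF} lb/sq ft floor rating")
print("Structural reinforcement needed:", load_psf > FLOOR_RATING_PSF)
```

Under these assumptions the cabinet concentrates roughly seven times the floor’s rated load on its footprint, which is why the reinforced slab was unavoidable.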
The school installed a modular chiller tower with redundant pumps on the 12th floor and piped it down to the eighth floor, where it integrates with three Liebert CRV in-row cooling systems. The CRVs cool the server hardware, while Liebert NX uninterruptible power supplies deliver backup power. All the Emerson racks and CRVs are powered by a custom-fitted Liebert modular 400-amp busway and monitored by SiteScan, Emerson’s advanced monitoring and control system.
Today, 175 virtual machines reside on 12 blades, and storage runs over a 400-terabyte storage area network. “The entire setup with virtualization and improved power and cooling has reduced our man-hours spent in the data center on power and cooling issues by 96 percent and slashed our electric bill $60,000 annually,” Tomescu says. “Our department has become proactive as opposed to reactive. Now, we can focus on new technology projects and educational applications.”
David Cappuccio, a managing vice president for Gartner and chief of research for the infrastructure teams, says that while the School of Visual Arts’ deployment was a bit more elaborate because of the special requirements in an older building, more colleges and universities are moving to in-row cooling. “The idea is to bring the solution to the problem,” he says. “Why cool the entire room when you can focus on cooling the equipment where the heat is coming from? We’ve seen organizations reduce their power and cooling costs by as much as 30 percent.”
Out With the Old
Before Coppin State University installed a new power and cooling system, a power outage in the main administration building left the IT staff with no cooling and only a five-minute window before the servers lost power; an outage in the Grace Jacobs building, where many of the Baltimore college’s classes are held, allowed just 15 minutes.
Thomas Smith III, director of campus network services, says Coppin State replaced the power and cooling equipment in the main administration building about three years ago. The team deployed an emergency generator, two Liebert CRV in-row cooling systems, two supplemental computer room air conditioning (CRAC) units and two Liebert NX uninterruptible power supplies. “Now, if the power goes down in the main administration building, we have a window of closer to 45 minutes on top of the power we get from the generators,” he says.
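Ride-through windows like these come down to simple arithmetic: the usable energy in the UPS batteries divided by the load they carry. The minimal sketch below illustrates the before-and-after math; the battery capacities, IT load and efficiency figure are assumptions for illustration, not the university’s actual specifications:

```python
# Back-of-the-envelope UPS ride-through estimate. All numbers are assumed
# for illustration; they are not Coppin State's real battery or load specs.

def runtime_minutes(usable_battery_kwh: float, it_load_kw: float,
                    inverter_efficiency: float = 0.92) -> float:
    """Minutes of runtime = usable battery energy / IT load, derated for the inverter."""
    return usable_battery_kwh * inverter_efficiency / it_load_kw * 60

print(f"Old UPS string:  {runtime_minutes(1.8, it_load_kw=20):.0f} minutes")   # ~5 min
print(f"New Liebert NX:  {runtime_minutes(16.3, it_load_kw=20):.0f} minutes")  # ~45 min
```

The same formula explains why adding a generator matters more than any battery upgrade: batteries only need to bridge the gap until the generator picks up the load.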
The Grace Jacobs building received two CRVs and two CRACs, and Smith eventually plans to replace the existing Powerware UPSs with two Liebert NX UPSs in that building as well.
Smith says that with the in-row cooling, “now we’re only cooling what’s needed as opposed to the entire room, so we’re saving a considerable amount on our electric bill.”
Three Tips for Upgrading a Data Center
Most data centers must be upgraded while they remain in production. Gartner’s David Cappuccio offers three tips for retrofitting an existing data center.
- Break floor space into discrete sections. Clear out a small section of floor space, roughly four racks of space, for an in-row cooling unit. It could be as small as 60 to 120 square feet and reside on an existing section of raised floor or on a slab. Depending on the vendor selected, the self-contained rack unit will require power from an existing power distribution unit or, in some cases, a refrigerant or cooling distribution unit. Assume an increase in per-rack space of about 20 percent to take into account additional supporting equipment (a rough sizing sketch follows this list).
- Reconfigure and defragment the floor. It’s unlikely that the workloads moved to the new enclosure will all come from the same racks, which means that the older section of the room will now be heavily fragmented. Move workloads out of underutilized racks to free up additional floor space for the next self-contained installation. This reconfiguration will take time and affect servers, storage and networking components and connections. Much of the activity will need to happen in off-hours or on weekends, so it’s critical to integrate this work into the organization’s change control processes.
- Reconfigure again. By implementing a phased retrofit, data center managers can attain significant growth within the facility while reducing power and cooling requirements. Implementing more efficient cooling can boost equipment density and PDU utilization at the rack level. A more efficient cooling delivery system also requires less overall power to support a given IT load, freeing up additional power for future growth.
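Sizing that first cleared section is straightforward to script. In the minimal sketch below, the four-rack section size and the roughly 20 percent overhead for supporting equipment come from the tips above; the per-rack footprint is an assumed figure for illustration:

```python
# Rough sizing of the first retrofit section described in the tips above.
# The per-rack footprint is assumed; the four-rack section and ~20 percent
# support-equipment overhead come from the article.

RACKS_PER_SECTION = 4
RACK_FOOTPRINT_SQFT = 15        # assumed floor space per rack, incl. clearances
SUPPORT_OVERHEAD = 0.20         # extra space for cooling/PDU gear (~20 percent)

section_sqft = RACKS_PER_SECTION * RACK_FOOTPRINT_SQFT * (1 + SUPPORT_OVERHEAD)
print(f"Floor space to clear: about {section_sqft:.0f} sq ft")
# With these assumptions: about 72 sq ft, within the 60-120 sq ft range cited.
```

The result lands inside the 60-to-120-square-foot range the article mentions, which is the point of the exercise: a single cleared section is small enough to carve out of most production floors without disrupting the rest of the room.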