The infrastructure at Southern California University of Health Sciences is more stable now, thanks to a combination of a SAN and server virtualization, say John Gillett and Ernesto Lopez.

What a Relief

Universities identify best practices for combining virtual servers and SANs to ease stress on their networks.

John Gillett, network administrator at Southern California University of Health Sciences, cringes when he thinks back on how common it was for certain sections of campus to fall off the network.

Before the Whittier, Calif., college of chiropractic and acupuncture/oriental medicine integrated virtual servers with a storage area network, Gillett says the network used to go down nearly every week.

“Now, we ask each other when we last got a network call,” he says, with a relieved laugh.

At the beginning of 2010, SCUHS deployed an HP LeftHand SAN, plus HP servers running Microsoft Windows Server 2008 R2 Datacenter. Microsoft's Hyper-V virtualization technology, included with Datacenter, clusters as many as 38 virtual servers per cluster.

Before this deployment, the network was highly unstable, Gillett says. The new setup, he estimates, has doubled its reliability.

Money Saver

Gillett suggested the idea of server virtualization a few years ago, when he was still a consultant working for the university.

“We decided we were going to need a test network upon which to check out a server virtualization solution before propagating it out campuswide,” Gillett says.

The test network was one way he could keep a close eye on the budget, something he did then and does now. From the start, “we were trying to utilize certain equipment without spending a lot of money on new stuff.”

To start off, Gillett and his team, including network engineer Ernesto Lopez, tackled a series of network issues head-on. The IT team ran fiber to each building directly, assigned subnets to each location and replaced all the switches. They also isolated each building on its own network to contain viruses.

Gillett then chose to run free versions of VMware on four of the college's 13 physical servers, hosting roughly a dozen virtual servers across those four machines.

“VMware didn't advise doing 12 virtual servers on a freeware product,” he warns. But it was during this experiment that he realized attaching virtual servers to a SAN delivered better resource utilization and increased reliability.

To support the virtualization project, Gillett and his team invested in four HP ProLiant DL385 G6 servers, each with 64 gigabytes of RAM and four quad-core processors. The old servers were not only approaching maximum capacity, but were about to come out of warranty, so the new equipment investment made sense.

“It used to be that I lost sleep over losing even one of the old servers,” says Gillett. “Now, even if one of the cluster servers goes down, it will fail over to the next one with no downtime at all. I go home and enjoy the weekend without worrying.”
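The no-downtime behavior Gillett describes comes from failover clustering. As a rough illustration only (the host and VM names below are invented, and real clusters rely on heartbeats and shared SAN storage rather than this toy logic), a failed host's virtual servers are restarted on the next healthy node:

```python
# Toy model of cluster failover: when a host fails, its virtual
# servers are moved to the first remaining healthy host.
cluster = {
    "host1": {"healthy": True, "vms": ["web", "mail"]},
    "host2": {"healthy": True, "vms": ["db"]},
    "host3": {"healthy": True, "vms": []},
}

def fail_over(cluster, failed):
    """Mark `failed` unhealthy and move its VMs to the first healthy host."""
    cluster[failed]["healthy"] = False
    orphans = cluster[failed]["vms"]
    cluster[failed]["vms"] = []
    target = next(name for name, state in cluster.items() if state["healthy"])
    cluster[target]["vms"].extend(orphans)
    return target

target = fail_over(cluster, "host1")
print(target, cluster[target]["vms"])  # the surviving host picks up the orphaned VMs
```

In a production Hyper-V cluster, this handoff is automatic because every node can reach the virtual machines' disks on the shared SAN; the model above only captures the reassignment step.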

Based on their experiences, Gillett and other IT managers offer some best practices for integrating virtual servers with a SAN.

Consider all your storage needs, both current and future. Think about the different types of data to be stored and the parties both on campus and off who need access. And, as best you can, consider projected needs and solutions that offer room to expand, such as blade servers.

Jim Davis, the Iowa State University CIO, says he and his team wanted a comprehensive storage architecture where all the pieces fit together for a wide variety of applications. Among the university's needs: high-availability storage and bulk storage for large research data sets, perhaps 50 to 100 terabytes in size.

Start small. Funding authorities at SCUHS balked when Gillett prepared his first estimate of what a server virtualization strategy might cost, so he started by testing freeware on an existing box. Only after he proved it could work did SCUHS decide to broaden its server virtualization strategy.

Iowa State's Davis advises including as many voices as possible in the early discussions of what a SAN/server virtualization strategy would look like. “I think it makes [such a solution] better accepted, because it is designed to offer the services people are looking for right out of the box,” he says.

80%: the amount of power capacity typically used in a data center, with the remainder serving as a buffer to absorb usage spikes.

Source: Intel

Look for ways that software can further increase storage efficiency. Andrew Reichman, senior analyst at Forrester Research, says you don't always need to buy expensive new hardware because today the software layer offers some significant features. There are many products that can lighten the data-management workload. After all, Reichman says, an enterprise application such as Microsoft Exchange knows a lot about the data: which data sets are heavily used and which are not, or what has not been accessed in a long time.
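Reichman's point, that application-level knowledge of access patterns can drive storage decisions, can be sketched in a few lines. The snippet below is a hypothetical illustration (the data-set names and the 90-day threshold are invented), classifying data sets as candidates for fast or bulk storage by how long they have sat idle:

```python
import time

DAY = 86400  # seconds per day
NOW = time.time()

# Hypothetical data sets: (name, last-access Unix timestamp).
# In practice this metadata would come from the application or file system.
data_sets = [
    ("mailbox_db", NOW - 2 * DAY),          # touched two days ago
    ("course_media", NOW - 30 * DAY),       # touched last month
    ("research_archive", NOW - 400 * DAY),  # idle for over a year
]

def tier(last_access, now=NOW, cold_after_days=90):
    """Return 'hot' (keep on fast SAN storage) or 'cold'
    (a candidate for cheaper bulk storage) based on idle time."""
    idle_days = (now - last_access) / DAY
    return "cold" if idle_days > cold_after_days else "hot"

for name, last_access in data_sets:
    print(f"{name}: {tier(last_access)}")
```

In a real deployment, features such as mailbox archiving or SAN thin provisioning do this work; the point is that the software layer, not new hardware, supplies the intelligence.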

Encourage IT staffs from different colleges within the university to collaborate. At Oregon State University, IT staffers within the colleges of agriculture and engineering share their experiences with Microsoft's System Center Configuration Manager, a product that deploys and updates physical and virtual desktops and servers.

Davis' group at Iowa State kept the IT staffs at the various colleges informed about the storage overhaul project. As the other staffs learned more, “we got a great deal of buy-in from the colleges as we talked about the design and sustainability of it,” says Davis.

Hold off on making snap judgments on measurable benefits. Catherine Williams, OSU's director of enterprise computing services, says power consumption dropped since her group eliminated half a dozen physical servers, but there are several reasons for such a decrease. “We are using less power, but we also just moved to more efficient CPUs for a standalone system we have,” she says.

“It feels premature to measure [potential improvements],” says Iowa State's Davis. While he expects to halve the maintenance costs of the thin clients that make use of the improved networked storage capacity, he is reluctant to conjecture. “We'll learn more about this over the coming years,” he concludes.

Thomas Michael Alleman
May 11 2010
