Internet SCSI is a less expensive and less complicated route to implementing a SAN than Fibre Channel.
The information technology department at North Carolina Central University had a problem: Tech support staffers were wearing out the soles of their shoes.
The 60 servers scattered about the 103-acre campus in Durham, N.C., were serving the 8,000 students and 1,600 faculty and staff members just fine. But the techies charged with supporting data storage had unwittingly become cross-country runners.
“We had to send people out all across campus to add storage, move data, and back up and clean up old data,” says CIO Greg Marrow. “And each operating system had its own quirks, which support people had to be aware of.”
That's history now that the school has installed three storage area networks (SANs), which consolidate storage in the data center and in the disaster recovery facility.
Marrow and others at NCCU had been yearning for a SAN for a number of years, but Fibre Channel, the primary transport medium, was too expensive. Like most higher education IT departments, NCCU relied on direct attached storage (DAS), in which hard disks and arrays are attached directly to each individual server.
Then along came Internet Small Computer System Interface (iSCSI), a network transport protocol standard ratified in 2003 and now gaining traction in the market. iSCSI is enabling universities to implement SANs using their existing Ethernet network equipment and skills. As a result, schools that couldn't afford Fibre Channel are now in the SAN market.
Many IT staff members find SANs much more efficient than DAS. And with good reason.
In the DAS days, many departments at NCCU bought new servers when all they really needed was more storage. And the school couldn't recapture spare capacity on little-used servers in order to beef up those that needed more space.
“We had some servers using only 10 or 20 percent of their capacity, while other departments were shopping for a $30,000 workstation because they ran out of storage,” says Cecil White, who is now project manager. Before the SAN implementation, he was in charge of storage management.
Possibly the most important tipping point in favor of a SAN was the school's imminent purchase of an enterprise resource planning system. The ERP would require more storage, not only for the application's primary database, but also for remote replication that NCCU planned to implement at a disaster recovery site 15 miles from campus.
Early in 2005, NCCU embarked on a three-month SAN evaluation project. SANs based on Fibre Channel held the leading edge in the market and immediately made the short list. But just as quickly, the school's evaluation team identified serious problems. “We had absolutely no experience with Fibre Channel, and it was a complex technology,” says White.
The lack of in-house expertise, combined with the cost of Fibre Channel host bus adapters for each server, plus switches and pricey cabling, would probably have doomed the project, White believes.
Mark Bowker, an analyst with the Enterprise Strategy Group of Milford, Mass., says many enterprises find themselves in a similar situation: “Schools and businesses have invested a lot of money in IP training and Ethernet equipment. Except for large institutions, such as banks, few organizations are willing to start over with a new protocol and network devices.”
Fibre Channel did prove to have a few advantages over iSCSI, but the school determined that they were inconsequential and temporary. The first was OS support. At the time of NCCU's evaluation, iSCSI initiator software was not available for the NetWare version NCCU was running. But the university found third-party patches and other workarounds. Currently, Apple's Mac OS X is the only major operating system that doesn't support iSCSI.
Fibre Channel also is faster, with throughput of up to 4 gigabits per second; iSCSI over Gigabit Ethernet runs at 1Gbps. But speed is only an important issue if you need it, and NCCU hasn't noticed any degradation of performance while using its SANs.
Finally, near the end of 2005, NCCU installed two EqualLogic iSCSI arrays in the data center and a third in its disaster recovery center. The process, which involved installing software on each server and moving data, took only a few days, Marrow says. The bottom-line cost for the initial purchase of 50 terabytes of storage was $200,000, and that included onsite training and maintenance. About $50,000 of that was attributed to the added storage needed for the new ERP system.
The technology was as effective as expected, but Marrow discovered some new management challenges. One big one: under DAS, storage allocation was naturally capped by the size of the disks connected to each server, says Marrow. “But when departments find we've got terabytes of capacity, they naturally assume they can use as much as they want. There is a cost for storage, and we have to make determinations based on requirements.”
Marrow adds that he's now working more closely with the academic community to find new applications made possible by the increased storage capacity. Some examples:
- The new Biomanufacturing Research Institute & Technology Enterprise Center for Excellence will make use of the SAN as it conducts critical research.
- Some departments are posting podcasts online – and the school's radio station is considering putting all or most of its music files online.
- Students who want to back up or off-load some of the data on their notebooks may be able to use the extra storage capacity in the future.
Says Marrow, “This gives us a lot of flexibility.”
What's so great about iSCSI?
Cost: iSCSI uses standard, off-the-shelf Ethernet (or any other TCP/IP-capable network) switches and adapters, which are less expensive and more ubiquitous than the comparable Fibre Channel equipment. This lowers the initial cost of adoption.
Implementation: All university IT shops have skilled TCP/IP professionals. They may need some training to use the SAN, but they won't have to learn a new network protocol, as they would with Fibre Channel.
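To give a sense of how little new tooling is involved, here is a minimal command-line sketch of attaching a server to an iSCSI SAN using open-iscsi, the standard Linux initiator. The target address (10.0.0.50) and IQN shown are hypothetical placeholders, not NCCU's configuration; the same steps apply to most iSCSI arrays.

```shell
# Ask the array (the iSCSI "target") what volumes it advertises.
# 3260 is the standard iSCSI TCP port.
iscsiadm -m discovery -t sendtargets -p 10.0.0.50:3260

# Log in to one discovered target; its IQN comes from the output above.
iscsiadm -m node -T iqn.2005-10.com.example:storage.vol1 \
         -p 10.0.0.50:3260 --login

# Confirm the session is up; the volume now appears to the OS
# as an ordinary local disk (e.g., /dev/sdb) ready to partition.
iscsiadm -m session
```

Because everything rides over ordinary TCP/IP, the network side needs nothing more exotic than a Gigabit Ethernet switch, which is exactly why shops with IP skills but no Fibre Channel experience can deploy a SAN this way.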