Oct 31 2006

What's Taxing Your Servers?

Ensure the continued health of all operations on your campus by keeping ahead of user needs and adopting server policies that recognize the interconnected nature of your systems.

Watch almost any rollout of a new operating system, and you’ll see that file-search functions have become one of the main features required by any such product. A key to new computer search functions is that the files, and often the data themselves, are treated as being independent of the program in which they were created or by which they’ll be accessed.

Talk to many IT professionals, and you’ll begin to wonder whether the problem isn’t a little simpler than it appears. They’ll often tell you that the e-mail program has become the storage place of choice for many users. People are storing countless text e-mails, Hypertext Markup Language (HTML) e-mails, attached files with images and even sound, and much more in e-mail file folders. That’s having an impact on the server end.

Many programs by default off-load the storage of e-mail onto the end user’s computer. In other cases—including that of the faculty and staff at Hartwick College in Oneonta, N.Y.—e-mail files are stored on central servers that use Microsoft Exchange, a messaging and collaboration program that runs on servers and enables users to send and receive e-mail and other forms of interactive communication through computer networks. Consequently, they are under the control and responsibility of IT departments.

“When you think of the paperless office, in some ways, e-mail is becoming [like] the traditional file cabinet where people keep important communications, and they don’t want that to go away,” says Bill Beyer, Hartwick’s director of technology services and chief technology officer. E-mail used to be simple text messages, but now “you have large attachments, embedded graphics, personal folders, personal calendars, shared folders and shared calendars, so there’s more functionality built into the Exchange software that people rely on.”

The IT department’s duties increase as users access that greater functionality from a wider array of accessories—handheld devices, cell phones and personal digital assistants—that need support.

The impact on the server is a need for more storage and power. Hartwick College leases its Hewlett-Packard servers, and every three years, the college gets new boxes to ensure that its users don’t outgrow the servers’ capabilities. In 2005, Hartwick was in one of its upgrade years, and the new HP boxes had more disk storage, more memory and faster central processing units (CPUs). The college’s needs drive the hardware purchases.

“We use Microsoft Exchange here, so as Microsoft has upped the software requirements and users have demanded more functionality, we’ve had to plan for ongoing server upgrades,” says Beyer. And because backups are getting bigger, “we’ve needed to come up with ways to continue to perform backups quickly within allotted time constraints, which requires new backup hardware and media.”

The University of Wisconsin-Green Bay is another Microsoft Exchange shop. To deal with its workload, UWGB brings in new mail servers about every two years, using Gateway 980 and 9510 models running dual Xeon MP chips with at least 4 gigabytes of memory, multiple Ultra320 Small Computer System Interface dual-channel controllers, and striped and mirrored 73GB 15,000 revolutions-per-minute Ultra320 SCSI drives in external fault-tolerant arrays. On the policy side, the university tries to reduce the e-mail load by implementing quotas and using automated routines to clean out “logically—and not truly—deleted items for students,” says David Kieper, UWGB’s manager of network and infrastructure services. “We also clean out quarantined spam folders for messages that are over 30 days old.”
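Cleanup routines like the ones Kieper describes boil down to a simple retention rule. The sketch below is hypothetical, not UWGB’s actual scripts; it assumes each message record carries a folder name and a deletion or quarantine timestamp, and it flags logically deleted items plus quarantined spam older than 30 days for purging.

```python
from datetime import timedelta

# Hypothetical retention rule modeled on the policies described above:
# purge soft-deleted items, and quarantined spam older than 30 days.
SPAM_RETENTION = timedelta(days=30)

def messages_to_purge(messages, now):
    """Return the messages eligible for permanent removal.

    `messages` is an iterable of dicts with keys:
      'folder' - e.g. 'Deleted Items' or 'Quarantined Spam'
      'stamp'  - datetime the message was deleted or quarantined
    """
    purge = []
    for msg in messages:
        if msg["folder"] == "Deleted Items":
            purge.append(msg)                 # logically deleted: reclaim the space
        elif msg["folder"] == "Quarantined Spam":
            if now - msg["stamp"] > SPAM_RETENTION:
                purge.append(msg)             # spam past the 30-day window
    return purge
```

A nightly job over the message store could feed this rule and hand the results to the mail system’s delete API.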

Administrative Muscle

Beyond e-mail, servers are being taxed by the volume of work carried out by financial aid, human resources, registration and various other administrative departments. The University of Illinois (UI) runs its SCT Banner Enterprise Resource Planning (ERP) system, an administrative suite of software, on a Sun Fire Enterprise F15000 domain from Sun Microsystems, with 36 CPUs and 72GB of RAM.

“We have several other Sun domains supporting related services, such as hot standbys, OpenEAI messaging [an open source protocol for enterprise application integration] and reporting,” says Jason Heimbaugh, interim director of applications and infrastructure.

Heimbaugh adds that storage disks are managed on EMC hardware with high-availability and high-redundancy capabilities. “Recognizing the need to get data onto and off disks rapidly,” he says, UI devotes “many resources—cache, striping, fiber channeling, etc.—to disk, resulting in extremely fast read and write times, even during heavy demand periods.”

It’s no surprise that the heaviest user activity on the system is during the daytime hours between 8 a.m. and 6 p.m. Batch activity is heaviest between 6 p.m. and 6 a.m.

To deal with what Heimbaugh characterizes as rare occasions when performance has been hurt, UI follows a strategy of evaluating each incident and making recommendations to avoid or monitor similar events in the future.

“Maintaining active and high-level service contracts with all vendors is especially critical when troubleshooting problems that occur in a multilayered and distributed environment,” says Mike Cornell, enterprise infrastructure specialist at UI.

Pacific Lutheran University (PLU) in Tacoma, Wash., migrated its ERP system from mostly client-server setups to a Web-based SunGard SCT Banner/Oracle system. That migration “has been good news to us, but has also introduced a few challenges,” says Chris Sanders, PLU’s director of computing.

“The good news is that we don’t have all of the client [installations] to manage. The challenge, however, is to create responsive, fault-tolerant servers that are not cost-prohibitive.”

Hosting the Secure Sockets Layer (SSL) encryption on the application servers themselves would have added a significant load, so the university’s technology staff decided to off-load that overhead. PLU eventually selected Foundry’s ServerIron products to handle the load balancing, Web acceleration and SSL encryption.

This “greatly increased the amount of capacity on our servers,” Sanders says. “Plus, with the addition of the Foundry Networks Application Switches, we are able to cluster several small servers together in order to offer fault tolerance and performance improvements.”

Major Server Challenges

What are the top-of-mind issues that IT experts face? Here are additional comments and insights from UWGB’s Kieper, Hartwick’s Beyer, and UI’s Heimbaugh and Cornell.

E-mail: “E-mail is becoming the de facto ‘file cabinet’ for storage of long-term information and communication between faculty and staff, so long-term retention of e-mail will cause overall storage needs to continue to increase,” says Kieper. He estimates a growth rate in UWGB’s e-mail storage needs of 50 percent to 60 percent a year. That’s in addition to handling regular e-mail traffic. About 80 percent of incoming mail is spam, according to Kieper.
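Kieper’s 50 percent to 60 percent annual estimate compounds quickly. As a rough illustration (the starting capacity below is invented, not a UWGB figure), storage needs can be projected with simple compounding:

```python
def project_storage(current_gb: float, annual_growth: float, years: int) -> float:
    """Compound a yearly growth rate: 50% growth means x1.5 per year."""
    return current_gb * (1 + annual_growth) ** years

# At 50% annual growth, storage needs more than triple in three years:
# a hypothetical 100GB mail store grows to 337.5GB.
```

The point of the exercise: at these rates, capacity planning has to look several budget cycles ahead, not one.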

File storage: Kieper says his main storage challenges are “growth in the size and type of files we are asked to store and back up centrally.” The use of graphics, sound and video is causing huge growth in his storage needs.

Kieper uses Symantec Backup Exec products, along with Gateway 980 servers configured with Gateway 840 3-terabyte Serial Advanced Technology Attachment (SATA) arrays. “We do over-the-network, disk-to-disk backups, then we do disk-to-tape backups to multiple Quantum Superloader Super DLT [digital linear tape] 320 drives on the Gateway systems that have the SATA arrays attached to them,” he says.
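The two-stage flow Kieper describes (network backup to disk, then disk to tape) can be sketched as a staging copy followed by a tape queue. This is an illustrative sketch, not UWGB’s Backup Exec configuration; the paths and the `tape_queue` list are invented for the example.

```python
import shutil
from pathlib import Path

def stage_backup(source: Path, staging_dir: Path, tape_queue: list) -> Path:
    """Disk-to-disk: copy a backup set onto the staging array, then
    enqueue the staged copy for the later disk-to-tape pass."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    staged = staging_dir / source.name
    shutil.copy2(source, staged)   # fast disk-to-disk copy of the backup set
    tape_queue.append(staged)      # picked up later by the tape-loader job
    return staged
```

Splitting the job this way keeps the slow tape writes off the critical path, which is how sites fit growing backups into a fixed nightly window.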

Expanding usage times: One of the biggest challenges is that some vendors don’t recognize the 24 x 7 nature of campus technology these days, says Beyer. “With faculty and staff on [their computers] during the day running the business of the college and students on just about all night, the only quiet time is probably between 2 a.m. and 6 a.m.,” he says.

In general, campuses report that peak usage times continue to be the obvious ones: heavy daytime faculty, staff and student lab traffic, and heavy evening/nighttime student housing traffic. But with students and faculty using the campus systems for course work (and entertainment) at all hours, the heavy daytime traffic windows have begun to stretch into and encroach on evening schedules.

“Usage timeframes continue to expand, but we still see a relative lull between 3 a.m. and 7 a.m.,” Kieper says.

Heimbaugh and Cornell stress the need to “test, test, test” to understand the performance implications of a campus’s peak loads. UI “invested much in the analysis and development of stress and performance tests to adequately predict loads and sized the hardware accordingly,” Heimbaugh says. “So far, predictions have held true, and the initial ‘live’ performance in production has been acceptable.”

Stress times: After one year of its fully implemented ERP setup, which was created in stages beginning in 2002, UI identified two stress periods for its server performance. The first was during online registration, “especially at the beginning of terms when all three university campuses—Chicago, Urbana-Champaign and Springfield—are in session,” Cornell explains. “This occurs three times per year, once for each term.”

The second stress period was when activities that normally do not overlap come into conflict. For example, the Oracle database is tuned to handle both batch operations during nighttime hours and online transaction processing during daytime hours.

“When batch operations spilled into daytime hours due to heavy volume, we had periods when the system did not respond as well,” Heimbaugh says. He adds that “actively monitoring for these occasions, delaying batch runs and tuning batch operations for rapid [executions] and co-executions have improved this [situation] considerably.”

How do you ensure healthy server environments?

“Whether it is storage for e-mail or files, track this data at least monthly,” advises David Kieper, manager of network and infrastructure services at the University of Wisconsin-Green Bay (UWGB). Also, regularly track server performance (CPU utilization, memory utilization, disk input and output statistics) on many key servers at peak times, such as at the start and end of semesters and during registration periods, he adds.
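Kieper’s advice amounts to sampling CPU, memory and disk I/O on key servers at peak times and comparing the samples against thresholds. A minimal sketch follows; the threshold values and server names are illustrative, not UWGB’s.

```python
# Illustrative thresholds; real values depend on each server's role.
THRESHOLDS = {"cpu_pct": 85.0, "mem_pct": 90.0, "disk_iops": 5000.0}

def flag_hot_servers(samples):
    """Given {server: {metric: value}}, return the metrics over threshold.

    `samples` would come from a collector run at peak times, such as the
    start of a semester or a registration period.
    """
    alerts = {}
    for server, metrics in samples.items():
        over = {m: v for m, v in metrics.items()
                if v > THRESHOLDS.get(m, float("inf"))}
        if over:
            alerts[server] = over
    return alerts
```

Running the same check monthly, as Kieper suggests, also yields the trend data needed to time the next hardware replacement.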

UWGB uses what Kieper calls a “trickle-down” method, in which key servers that must perform best and be most reliable are replaced every two years. The replaced hardware is recycled to other, lower-demand uses. “This keeps the fastest and most reliable servers running the most critical applications,” he says.

The University of Illinois spent three years doing analysis and planning before embarking on the transition from a mainframe-based system to a distributed enterprise resource planning system. It then followed a strategy of “monitor, tune and keep current,” explains Jason Heimbaugh, interim director of applications and infrastructure at UI, who says, “The end benefits of these strategies far outweigh the investments.”

Hartwick College leases its servers, which helps it keep up with the needs of the system with budgeted, fixed monthly costs.


When campuses sported a handful of servers, each located in—and usually maintained by—individual departments, those boxes sat largely unconnected to the outside world. Problems on those servers caused local headaches, but not campuswide shutdowns.

Some departments or offices continue to need their own specialized servers, or to run locally those applications that require departmental knowledge. However, interconnectivity on every level has led to widespread centralization of servers and their maintenance and protection. It also has brought the trauma of spamming, worldwide viruses and Trojan horses, and hacking attacks.

That makes every server—and everyone connected to it—a security concern. “You want to secure the data, the devices [and] the access to the network,” suggests Cathy Martin, Hewlett-Packard’s director of education, government and health.

The University of Illinois follows strict deployment policies and segregates its networks with switches and point-to-point firewalls. “Deny everything by default,” counsels Jason Heimbaugh, UI’s interim director of applications and infrastructure. “We open only the minimum required ports.”
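Heimbaugh’s “deny everything by default” rule is an allowlist: traffic is admitted only if its port appears on an explicit list, and everything else is refused. A minimal sketch of the policy logic follows; the port set is hypothetical, not UI’s actual firewall configuration.

```python
# Hypothetical allowlist: only the minimum required ports are opened.
ALLOWED_PORTS = {25, 80, 443}   # e.g. SMTP, HTTP, HTTPS

def permit(port: int) -> bool:
    """Default-deny: anything not explicitly allowed is refused."""
    return port in ALLOWED_PORTS
```

The same shape applies whether the rule lives in a firewall, a router access list or a host configuration: adding a service means adding a port, and nothing opens by accident.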

In the event of a serious problem with the server system, UI can “restore full services at an offsite location with a limited amount of downtime,” says Mike Cornell, enterprise infrastructure specialist. “Every component in the environment has redundant systems associated with it, with hot failover available where possible and practical,” he adds. (Under failover, a system automatically and seamlessly switches to a backup database, server or network in the event of failure.)

David Kieper, manager of network and infrastructure services at the University of Wisconsin-Green Bay, stresses the need to stay current with software. To handle the regular deluge of e-mail on its Microsoft Exchange system, UWGB keeps up to date with the latest Exchange versions and uses McAfee GroupShield antivirus and Sunbelt iHateSpam antispam programs to keep the university’s e-mail clean and to reduce problems.

John Burton is a freelance technology writer based in San Francisco.