Chad Rumbarger says Sinclair Community College depends on an app-centric load balancer to manage its network traffic.

Aug 05 2011
Networking

Directing Traffic: How Colleges Keep Servers Running Smoothly With Load Balancers

Colleges find that new app-centric functions in load balancers enhance security and ease management.

The data traffic load at Sinclair Community College in Dayton, Ohio, is enormous. The network supports approximately 30,000 users and 10,000 nodes spread across 24 buildings on five campuses.

Fortunately for Chad Rumbarger, a network engineer at Sinclair, the management features packed into today's server load-balancing devices are so sophisticated and intuitive that with a little bit of training, his IT colleagues can help him manage the load.

First-generation load balancers almost always required an administrator versed in command-line script procedures to make even minor adjustments to traffic allocation for specific applications. Today, a load balancer that requires a network engineer to handle the smallest task would be impractical and inefficient.

"We needed something more intuitive, so that a web person could go in and analyze the traffic, or a systems person could do a reboot," says Rumbarger, who along with fellow network engineer Darnell Brown works at an institution that boasts flourishing programs in nursing and hospitality management, as well as Ohio's lowest tuition for in-state residents.

Rumbarger is running F5's load balancer and also an older appliance from Foundry (now part of Brocade). He especially likes the F5 product for its ease of use. What's more, the F5 device contains security features that can block all traffic to a specific directory if it's determined that the directory has been compromised.
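That kind of rule lives on the appliance itself; as a rough, vendor-neutral sketch of what "block all traffic to a compromised directory" means in practice, the short Python middleware below refuses any request under a flagged path. The path, function names and response text are hypothetical, not F5's actual configuration.

```python
# Minimal, vendor-neutral sketch of path-based blocking (not F5's actual
# configuration syntax): a WSGI middleware that refuses any request under a
# directory an administrator has flagged as compromised.
QUARANTINED_PREFIXES = {"/uploads/legacy/"}   # hypothetical compromised directory

def block_quarantined(app):
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if any(path.startswith(prefix) for prefix in QUARANTINED_PREFIXES):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"This directory is temporarily blocked.\n"]
        return app(environ, start_response)
    return middleware

def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok\n"]

# Wrap whatever application sits behind the balancer for the duration of the incident.
application = block_quarantined(demo_app)
```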

Even better, relatively easy tasks can be shared once role-based access has been set up in F5's management console. "I can give a web staffer permission to connect and reconnect servers," he says. In the past, when a server problem arose, Rumbarger or Brown would have been tapped to solve the issue via command-line scripts – something a web staffer would probably not know how to do.
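The permission split Rumbarger describes boils down to a mapping from roles to allowed actions. A minimal sketch, with made-up role and action names rather than F5's actual permission model:

```python
# Illustrative sketch of the role-based division of labor described above.
# The role names and actions are invented for the example; they are not
# F5's actual permission model.
ROLE_PERMISSIONS = {
    "network_engineer": {"view_stats", "edit_traffic_rules", "enable_member",
                         "disable_member", "reboot_appliance"},
    "web_staff":        {"view_stats", "enable_member", "disable_member"},
    "systems_staff":    {"view_stats", "reboot_server"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role may perform the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# A web staffer can take a misbehaving server out of the pool and put it back...
assert is_allowed("web_staff", "disable_member")
# ...but only a network engineer can change the traffic-management rules.
assert not is_allowed("web_staff", "edit_traffic_rules")
```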

Adjusting to ADCs

Sinclair and other colleges are making the most of the advanced features that manufacturers are now including as standard items in their load-balancing products, which today are more accurately described as application delivery controllers (ADCs), says Joe Skorupa, research vice president in Gartner's data center transformation and security group.

But Skorupa points out that the transition to ADCs has been quite an adjustment for many IT departments. Even as load balancers have grown richer in functionality, the advanced features are used by only a fraction of the organizations whose networks might benefit from them. Many haven't turned on anything beyond basic load balancing and simple protocol offload, Skorupa says, even though more informed use of these high-end management features could reclaim between 20 percent and 50 percent of otherwise wasted processing power.

In effect, the features in these products have developed ahead of the ability of many organizations to absorb and apply them. Skorupa says one way to respond to this situation is for IT staffs to start thinking about these products as ADCs rather than as mere server load balancers.

"Do they do load balancing? Sure. But they can also run synthetic transactions and let you program new rules and fire up virtual machines," he says. "They are remarkably sophisticated and powerful products."

Hofstra University in Hempstead, N.Y., is another school that appreciates the potential of ADCs.

Toon-Chien Wong, associate technical director of systems and operations, says that load-balancing tasks and others related to efficient application delivery can be handed off to people around campus based on their roles within the overall IT structure at Hofstra – partly because the graphical interfaces that walk users through the tasks are so simple to use. Right now, about 10 people have access to the Coyote Point Equalizer E650GX installed two years ago. Hofstra also runs the E350GX in two smaller data centers, one of which serves as a failover site should a natural or man-made event take the primary data center offline.

"A lot of universities have their networking folks manage their load balancer," says Wong. "But we have our applications people manage it. At the end of the day, the load balancer is there to provide [reliable] application delivery."

Because of that, Wong says it makes sense to train staff who work in different silos to perform job-specific functions to keep the applications running smoothly.

Wong says he appreciates the "fine-grained" access Coyote Point makes possible, as well as its graphical user interface, which means that users don't have to know the appropriate commands, and they need special permission before they can perform server-related tasks that don't fit within their job descriptions.

Indeed, a mature load-balancing product shields users from all kinds of unnecessary information, both technical and organizational: Staffer X does not have to intimately understand Staffer Y's day-to-day responsibilities in order for each to optimize the load balancer's performance.

"The apps team doesn't always understand how the router works, but it doesn't matter,'' says Wong. "They don't need to know that to manage an application."

Hofstra's choice to divvy up responsibility for the load-balancing products is reflected in its IT organizational chart. While Wong says he has responsibility for data center operations, from databases to servers to security, application availability and performance fall partly within his purview and partly within the applications group's.

Virtual Features

At Seton Hall University, in South Orange, N.J., Matthew Stevenson, director of networking and architecture, says SHU's initial investment in load-balancing products from Cisco Systems five years ago was made to provide redundancy. But his view of load balancers is changing.

"There appear to be some new features that integrate them closely to a virtual environment, and this could certainly help us with rapid deployments that include the load balancer as part of the automated virtual machine provisioning process," he says.

Neglecting to activate advanced management features, such as support for virtual environments or role-based access, is likely to compound problems for network managers, Skorupa says. As networks grow more complex – with multimedia or streaming applications, the ever-growing number of mobile and fixed nodes and the mix of products in use – network managers need all the help they can get.

But the ability to redirect traffic on the fly doesn't necessarily require a load-balancing product. For example, the Enterasys switches that replaced some obsolescent HP networking gear in the network of Lyon College in Batesville, Ark., include load-balancing features.

"The hardware load-balancing equipment currently available from several manufacturers consolidates and integrates network traffic management and acceleration as well as network security into one powerful system," says Charles Neal, Lyon's director of information services.

If one server runs at 90% utilization, offloading the Secure Sockets Layer processing to an ADC or to a load balancer will reduce that utilization to 60%.

SOURCE: F5

"A single piece of equipment now handles and does a much better job of enhancing network access and response times as well as increasing the security of critical data," he says.

Sinclair Community College has already ported Microsoft's Exchange and App-V as well as a point-of-sale app supporting the student card-swipe payment system to the F5. Another big application, Sinclair's learning management system, is slated for transfer over the summer.

The new LMS alone – which students use to take tests, upload videos, check grades on their Androids and perform a multitude of other tasks online – demands 36 servers, up from two a couple of years ago, Rumbarger says. Smart load balancing, and all that goes with it, helps Sinclair and other colleges meet the continuing demand.
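Behind a pool that size, the balancing decision itself is simple to state. Here is a least-connections sketch with made-up server names, standing in for the policies a real ADC applies:

```python
# Conceptual sketch of the balancing decision (server names are invented):
# send each new request to the pool member with the fewest active connections,
# one common policy an ADC can apply across a large LMS pool.
class Pool:
    def __init__(self, members):
        self.active = {member: 0 for member in members}

    def pick(self) -> str:
        """Choose the least-loaded member for the next incoming request."""
        return min(self.active, key=self.active.get)

    def begin(self, member: str) -> None:
        self.active[member] += 1

    def finish(self, member: str) -> None:
        self.active[member] -= 1

lms_pool = Pool([f"lms-{n:02d}.example.edu" for n in range(1, 37)])  # 36 servers
server = lms_pool.pick()
lms_pool.begin(server)   # route the request here, then call finish() when it completes
```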

Training Day

Most IT managers wouldn't hand off control of their server load balancer to just anyone, but a few key staffers will almost certainly merit access to portions of it.

Sinclair Community College's Chad Rumbarger has a quick training scenario worked out for his designated users. "I give them read-only access first, then we have an hour-long meeting where I have screen shots on the projector and I ask them what they want to know," he explains. People managing applications or assuring web uptime can be granted views appropriate to their roles within the organization.

Toon-Chien Wong of Hofstra University likes how granular Coyote Point's user access can be. "People only have access to one or two apps, usually," he says. "Only two of us have access to everything."

Sharing these duties with others who have been trained on the product "frees me up to do other things," he adds.

Gartner's Joe Skorupa says F5 led the way in providing the types of features that would make this sort of shared-management scenario plausible – and practical.

"That's what marked the transition away from load balancers to application delivery controllers," he says.

