Colleges reap rewards from stepping up to cloud computing.
California Southern University and North Carolina State University would at first glance seem to have little in common.
One is on the West Coast in Irvine, Calif.; the other in the East, in Raleigh, N.C. One has its students take all their courses online; the other's students mainly attend class in lecture halls and classrooms that span 10 colleges. One is 30 years old; the other is 120 years old.
But both have one essential thing in common: They have adopted and adapted cloud-computing environments – the delivery of scalable IT applications, services and infrastructure over the Internet – to serve the day-to-day processing and system demands of their students and faculty. In both instances, technologists at the two universities also embrace a new vision of the IT-user relationship that cloud computing engenders: Ultimate control over the processing environment lies in the hands of the end users.
From CalSouthern's perspective, “it's not just about whether you can create a school on the web,” says Kevin Mack, senior network engineer for the online university. “You have to be able to capture the sense of that student on the web and determine what you can give them to make them successful.”
That same notion of success is at play at NC State, which for the past five years has hosted a cloud serving a growing number of users, both on campus and off. The cloud, known as the Virtual Computing Laboratory (now an open-source Apache project), gives users the flexibility to pick the system components they want, rather than the components a traditional IT infrastructure would have dictated to them, points out Mladen Vouk, head of the university's Computer Science Department and its associate vice provost for IT. The VCL can support all types of user setups, from desktops and groups of classroom or lab computers to server collections and high-performance computing clusters.
Although California Southern and NC State may be ahead of their time in higher education's adoption of cloud computing, they will likely have plenty of company in the not-too-distant future, based on Gartner's research. Gartner foresees broad use of cloud computing taking hold in higher education over the next two to five years, even though fewer than 1 percent of colleges use computing-as-a-service (CaaS) today and only about 20 percent use cloud e-mail. Why? In part because higher education's use of consortium and shared-services models will lower the acceptance hurdle.
That's definitely a view held by Indiana University CIO Brad Wheeler, who prefers the phrase “above-campus services” to describe cloud computing. “I am also confident that the long-standing trust and shared values among higher education institutions give us an opportunity for consortium sourcing that may provide superior, long-term economics over generic commercial offerings,” says Wheeler, whose university leads a group of 10 schools participating in the FutureGrid effort to develop the next generation of grid and cloud computing for researchers.
Jan-Martin Lowendahl, a Gartner research director, identifies cloud computing in general as an important technology for colleges and universities, and cloud high-performance computing specifically as a transformational one. One contributing factor is cost: in the NC State program, for instance, the cost of a CPU hour dropped from 27 cents to 10 to 15 cents, based on the university's 2008 numbers. Given the tight budgets at most institutions and the continuing demands on IT, cost incentives will be a major driver, he says. “This has the potential to be a great equalizer for many small and medium-size universities when it comes to serving their students and staff with computing power.”
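A quick back-of-the-envelope calculation shows what those per-CPU-hour figures imply. The annual usage number below is purely hypothetical, chosen only to illustrate the scale of the savings:

```python
# Back-of-the-envelope savings estimate using the 2008 NC State figures
# cited above. The annual CPU-hours figure is a hypothetical illustration.
traditional_cost = 0.27                        # dollars per CPU hour, pre-cloud
cloud_cost_low, cloud_cost_high = 0.10, 0.15   # dollars per CPU hour, in the cloud

hours = 100_000  # hypothetical annual CPU hours consumed

savings_low = (traditional_cost - cloud_cost_high) * hours
savings_high = (traditional_cost - cloud_cost_low) * hours
print(f"Estimated annual savings: ${savings_low:,.0f} to ${savings_high:,.0f}")
```

At that rate, every 100,000 CPU hours shifted to the cloud frees up a five-figure sum, which is the kind of arithmetic Lowendahl suggests will sway smaller institutions.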
But economic factors alone won't dictate a move to the cloud. Other motivators include the ease of serving distance learning; the perfect-storm possibilities when blending cloud, virtualization and blade technologies; and the need-it-now computing demands of users.
California Southern began as a correspondence school and made the transition to an online distance-learning environment.
Built around an HP ProLiant cluster running Microsoft Hyper-V and Citrix XenServer virtual machines, the CalSouthern data center is home to about 35 virtual machines housed on 15 servers that host SQL Server, web and infrastructure services. The platform, while stable and served by a decent network connection, lacked failover in case of disaster and had limited on-the-fly expansion capability.
“We had to be able to plan for growth,” Mack says. “If we double the number of students, could we support that growth and could we support it fairly quickly – both from a data storage and a bandwidth perspective? We wanted the answer to be ‘yes.'”
The ultimate goal was high availability to support round-the-clock learners and the faculty who teach and mentor them. The IT team looked at colocation options for establishing a secondary site. Then, last year, as CalSouthern prepared to move from Santa Ana to Irvine, it began considering cloud computing, based on the recommendations of its chief IT supplier, CDW•G. “They were really hands-on in helping us” evaluate possible cloud services that fit the existing infrastructure and connected CalSouthern with the Terremark platform, says Mack.
The university signed service level agreements for guaranteed bandwidth and resources (storage, memory and processing power) over a 30-day period. As to storage capacity, “we're at about 1 terabyte locally and about half a terabyte in the cloud. It's relatively inexpensive to move to the next level,” Mack says.
Being among a small vanguard of education cloud users is OK with Mack: “It's fun to be Daniel Boone sometimes.”
More than 90 percent of reservations for NC State cloud resources are for immediate use. The Virtual Computing Laboratory provisions most of these services (more than 80 percent) within 30 seconds to 2 minutes; less common requests may take 5 to 30 minutes.
Source: NC State VCL, vcl.ncsu.edu
The return on investment is fairly straightforward, Mack says. “If you buy a new server, I don't care who you are – you need a week to set it up and rack it and stack it.” Plus, there are other infrastructure expenses, migration costs and monthly recurring expenses.
Take that thinking and ramp it up several notches. “Nowadays, with one chassis of blades with 12 to 14 blades, you can support 200 users. That is very cost effective,” says NC State's Vouk. That's the dynamic at play in North Carolina, where the NC State cloud serves education users statewide. Several institutions beyond the state's borders – George Mason University; the University of Maryland, Baltimore County; Old Dominion University; Southern University, Baton Rouge; and California State University, East Bay – have implemented or are implementing VCL-based clouds.
NC State runs a massive IBM Blade Center environment of 2,000 blades to support its dynamic cloud services, which rely on a reservation front end for users to acquire cloud resources, referred to as images. Vouk notes that although the VCL mainly uses IBM blades, any type of blade can plug into the environment.
When the team began this effort, a chief hurdle was showing management that it could provide services efficiently. “We maintain these 2,000 boxes on a staff of less than half a dozen – very conservative in terms of resources,” he says.
Although virtualization was not fully mature when the VCL was launched in 2004, it now takes full advantage of virtual machines, and through its dynamic load-balancing capabilities, it reconstructs and repurposes blade resources to get maximum utilization from the installed base and to optimize provisioning.
Whether in an institution- or consortium-run environment or a commercial cloud, users are in charge of their processing destiny.
True cloud computing “has to put the users in control. For the users, that's a real breakthrough. They can now customize the hardware structure, operating systems and application stacks as needed to actually get their work done, and then save these configurations as images capable of being shared with other users in the cloud,” says Patrick Dreher, an adjunct professor in NC State's Department of Computer Science and chief domain scientist for cloud computing for the Renaissance Computing Institute. RENCI is a University of North Carolina research institute with ties to Duke University, NC State and other UNC campuses. It applies technologies to problems identified by the state of North Carolina and to university research initiatives.
Cloud computing is “as much a business paradigm shift as a technology shift,” says Dreher. “It's at the level of a transformational change in IT that only happens once or twice every decade.”
What became abundantly clear early on for NC State was that users want most services on the fly and from wherever they are working. Physical computer labs have become more about collaboration and academic socializing and less about delivering processing capabilities, says Vouk.
Meeting cloud users' needs requires more back-end bootstrapping by IT than in traditional computing infrastructures, Vouk says, but it's automated. The VCL uses simple and composite image instances assigned to each user who places a reservation. An image instance (a copy of the master image), locked to a specific user's IP address, is a stack that includes an OS, application software and possibly a hypervisor; only one authenticated user can reserve and modify it.
“This is effective in conserving central IT resources and staff,” he says. “Plus, it empowers the end users. It's a one-click system.”
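The reservation model Vouk describes can be sketched roughly as follows. This is an illustrative Python sketch, not the VCL's actual implementation; all class, field and value names here are invented:

```python
from dataclasses import dataclass
import copy

@dataclass
class Image:
    """A master image: an OS plus an application stack (and maybe a hypervisor)."""
    name: str
    os: str
    applications: list

@dataclass
class ImageInstance:
    """A copy of a master image, locked to one authenticated user's IP address."""
    master: Image
    user: str
    ip_address: str

def reserve(master: Image, user: str, ip_address: str) -> ImageInstance:
    """Provision an instance by copying the master, so the user can modify
    it freely without touching the shared master image."""
    return ImageInstance(master=copy.deepcopy(master), user=user, ip_address=ip_address)

# Example: a "one-click" reservation of a hypothetical lab image
lab = Image("matlab-lab", os="Linux", applications=["MATLAB", "gcc"])
instance = reserve(lab, user="jdoe", ip_address="152.1.0.42")
instance.master.applications.append("R")  # user customization on the copy
assert "R" not in lab.applications        # the master image stays untouched
```

The deep copy is the point: because each reservation works on a private copy, one user's customizations never leak into the master image other users will reserve.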
In a commercial scenario, a similar approach prevails, explains Mack. CalSouthern's IT staff can control every aspect of its cloud service: the OS, the firewall and the applications.
Both Vouk and Mack point out that this shift to user-driven computing does not absolve IT from its systems administration and management duties.
“You have to maintain service and provide adequate bandwidth,” Mack says. “It helps you work better with the resources that you have.”
As universities and colleges continue to roll day-to-day computational services into the cloud, they will also push ahead on a major cloud issue crucial to academic users: managing voluminous data sets necessary for research.
Both the Renaissance Computing Institute in North Carolina and the Pervasive Technologies Institute at Indiana University have this processing conundrum in their sights.
The IU-led FutureGrid is not aimed at daily computational work, IU CIO Brad Wheeler says. “Rather, it is an experiment factory where cloud and grid computing environments can be simulated and tested on a massive scale.”
The driving questions address where data needs to reside and how to provide computational access for geographically distributed researchers. Today, the data sets for many projects are so large that it is impossible to transport them to different sites for computational processing and analysis.
It is more practical to build and test cloud computing images locally and then install the image near the location of the data set for analysis and modeling calculations, says Patrick Dreher, RENCI's chief domain scientist for cloud computing. Researchers then no longer have to re-create data sets, or settle for operating systems and application stacks at the remote site that are not customized to their requirements.
Again, it's a question of scale: being able to flex infrastructure components dynamically and with immediacy, says Wheeler. “The software and systems will have to be developed that can immediately ‘manufacture' a computation cloud that consists of X number of Intel cores at Y number of sites using OS and software configurations of Z for time period Q.”
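Wheeler's parametrized request might be modeled along these lines. This is a hypothetical sketch of the X/Y/Z/Q idea only; no such API exists, and every name and value is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class CloudRequest:
    """A hypothetical 'manufacture a computation cloud' request,
    following Wheeler's X cores / Y sites / Z config / Q time framing."""
    cores: int    # X: total Intel cores requested
    sites: list   # Y: participating sites
    config: str   # Z: OS and software stack image to deploy
    hours: int    # Q: reservation time period

    def cores_per_site(self) -> float:
        """Even split of the requested cores across the sites."""
        return self.cores / len(self.sites)

# Example request: 512 cores spread over three (invented) sites for 48 hours
req = CloudRequest(cores=512, sites=["IU", "NC State", "RENCI"],
                   config="centos5-mpi", hours=48)
print(f"{req.cores_per_site():.1f} cores at each of {len(req.sites)} sites "
      f"for {req.hours} hours")
```

The point of the sketch is that the entire cloud becomes a function of four parameters: change any one of them and the provisioning system, not a human, rebuilds the environment.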
RENCI's Patrick Dreher says seven main ingredients must be on hand for a successful environment: