In Higher Education, Cloud Projects Are Delivering Real ROI
For researchers looking to discover new insights into plant biology, the answers are in the clouds.
A group of colleges led by the University of Arizona has built a private cloud, providing scientists, teachers and students worldwide with computing, storage and software resources to support their research into various aspects of plant life.
The iPlant Collaborative's cloud services have proved so popular among the group's 1,200 users that IT administrators plan to purchase more blade servers to increase the number of virtual machines (VMs) they can support. The organization also is planning to leverage the public cloud to provide users with even more computing resources.
"Because the demand has been so great, we've needed to scale out our hardware capacity," says Edwin Skidmore, assistant director of infrastructure for the iPlant Collaborative at the University of Arizona.
"The biggest misconception about the cloud is it's all about cost savings."
Greg Schulz, Server and StorageIO Group
Colleges and universities that embrace cloud computing report that it lets them optimize IT services while also cutting costs. According to CDW•G's 2013 State of the Cloud Report, higher education IT administrators say they expect to reduce IT costs by 14 percent the first year they move to the cloud, and by 20 percent within four years. Aside from saving money, top benefits cited by the 1,242 IT decision-makers surveyed include increased efficiency, improved employee mobility and the ability to innovate and offer new services.
"The biggest misconception about the cloud is it's all about cost savings. But it also enables people in higher education to be more effective and productive," says Greg Schulz, founder and senior analyst of the Server and StorageIO Group consultancy. "IT staff can offload the management of IT resources. It frees them up, so they can be more creative and innovative."
But barriers to cloud adoption still exist. According to the study, the three biggest objections concern the security of proprietary data and applications (52 percent), the performance of cloud services (31 percent) and the ambiguity of service-level agreements over who's responsible for what (27 percent).
Planting a Stake in the Cloud
Launched in 2008, the iPlant Collaborative aims to provide the plant science community with the technology infrastructure and collaborative environment required for research, and the cloud is a big piece of that puzzle.
Through a five-year, $50 million grant from the National Science Foundation, the organization has built an infrastructure that includes not only private-cloud services, but also web-based tools for data analysis, computing grids, high-performance computing (HPC) and a social network to communicate and share ideas. "Our goal is for users to do science when, how and with whom they want," Skidmore says.
Some expert users want a command-line environment and need HPC to perform large computations, while others are more comfortable with web-based applications, where they can point and click to do what they need to do. The collaborative built the iPlant private cloud, called Atmosphere, in 2010 to provide users with a middle ground, Skidmore says.
Photo: Edwin Skidmore, assistant director of infrastructure at the University of Arizona, says the cloud built to support the iPlant Collaborative enables efficient, global research. Credit: Steve Craft
"Atmosphere addresses a need that HPC can't easily provide," he says. "The cloud allows users to create on-demand, reproducible environments. They can customize a VM and install software. They can also develop a complex software stack and share it, so other people can use the software with their own data."
Atmosphere, which delivers infrastructure, platform and software as a service, is built with 32 HP ProLiant SL blade servers and a 50TB Nexsan storage area network (SAN). The cloud currently supports up to 200 VM instances. Complementing Atmosphere is the iPlant Data Store, a cloud-based central repository for users' research data.
The cloud infrastructure is a constant work in progress, Skidmore says, because cloud technology is evolving, along with users' needs.
Skidmore says he plans to triple the number of blade servers by summer to provide users with additional computing resources. He also plans to triple the amount of cloud storage available, from 180TB to 540TB.
The IT staff has been working to migrate the private cloud from the open-source Eucalyptus cloud software to OpenStack, which offers greater flexibility and a faster pace of innovation, including application programming interfaces (APIs) that let the IT staff connect iPlant's private cloud to public-cloud offerings, Skidmore says.
The iPlant staff planned to test the OpenStack version of the cloud in January and add support for public-cloud services this spring. The private cloud is free, but organizers must limit users' server resources. If users have the funds and the need, they can tap additional computing capacity through public-cloud providers. "We really want a hybrid-cloud approach, so users will not feel constrained in their research," Skidmore says. "Connecting Atmosphere to the public cloud is a natural evolution."
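To make the API point concrete, the sketch below shows roughly what launching a VM looks like through OpenStack's Python SDK (openstacksdk). It is an illustration only, not iPlant's production tooling, and the cloud profile, image, flavor and network names are hypothetical.

```python
import openstack

# Connect using a named profile from clouds.yaml; the profile name
# "atmosphere-dev" is hypothetical, as are the image/flavor/network names.
conn = openstack.connect(cloud="atmosphere-dev")

image = conn.compute.find_image("ubuntu-analysis-stack")
flavor = conn.compute.find_flavor("m1.medium")
network = conn.network.find_network("research-net")

# Launch the instance and wait until it is active.
server = conn.compute.create_server(
    name="rna-seq-analysis-vm",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status, server.name)
```

Because OpenStack-based private clouds and some public providers can expose the same interface, code along these lines is one way a hybrid setup can send overflow work to a public cloud without changing the user-facing workflow.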
Forging Ahead
In Washington, D.C., George Washington University's Columbian College of Arts and Sciences built its own private cloud to provide users with technology that's more effective and cost-efficient.
In late 2009, the university's traditional IT architecture — built with Novell software for authentication and file sharing — had become too expensive and was no longer meeting the college's needs. The college was paying the university's central IT department $300 per user account and $100 annually per gigabyte of storage, says Sean P. Connolly, director of information technology at the Columbian College of Arts and Sciences.
By the end of that arrangement, the college could afford to provide the services only to its administrative users, forcing faculty to purchase their own data storage technology through their research grants, he says.
To fix the problem, Connolly drew up a plan to move to the cloud, where centralizing and tiering storage would make data cheaper to keep: faster, more expensive storage for cloud application and desktop services, and slower, less expensive storage for user data. He also proposed virtualizing applications and desktops to streamline software licensing and simplify IT management.
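As a rough illustration of that tiering policy (not Connolly's actual configuration; the tier names, media types and workload labels below are hypothetical), the placement logic amounts to something like this:

```python
# Hypothetical sketch of a two-tier placement policy: latency-sensitive
# workloads (virtual desktops and applications) go to fast, expensive
# storage, while bulk user and research data goes to slow, cheap storage.
STORAGE_TIERS = {
    "fast": {"media": "SAS/SSD", "cost_per_gb_year": 1.50},  # made-up cost
    "bulk": {"media": "SATA", "cost_per_gb_year": 0.27},
}

LATENCY_SENSITIVE = {"virtual_desktop", "virtual_app", "database"}


def assign_tier(workload: str) -> str:
    """Return the storage tier a workload should land on."""
    return "fast" if workload in LATENCY_SENSITIVE else "bulk"


for workload in ("virtual_desktop", "user_home_directory", "research_dataset"):
    tier = assign_tier(workload)
    print(f"{workload:22s} -> {tier} ({STORAGE_TIERS[tier]['media']})")
```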
By putting everything in the cloud, users would be able to securely access the data, software and computing resources they needed on any computer, tablet or smartphone — regardless of operating system, Connolly says.
"We wanted to extend the benefits of virtualization to the end user, enabling new platform-agnostic IT services," he says.
When the central IT department showed no interest in the cloud idea, Connolly and his IT team pursued the project on their own. In early 2010, they built a cloud using an HP c7000 Series BladeSystem, Citrix Systems XenServer and VMware vSphere, and a 30TB HP LeftHand P4000 SAN. The college has since outgrown the SAN's original size and increased capacity to 140TB.
"From an ROI perspective, it now costs $50 to deploy a user account versus $300 for a Novell account, and 27 cents annually per 1GB of storage versus $100," he says. "That made the difference in being able to deliver these services to our faculty."
The IT staff virtualized applications using Citrix XenApp and created virtual desktops using Citrix XenDesktop. With desktop virtualization, servers are partitioned into different VMs, allowing users to access their own virtual computer running Windows 7.
"The key thing is that this gives users flexibility," Connolly says. "If faculty are using a Mac and they want to run Visio on Windows, they just launch Visio. It looks like it's running locally, but it's actually running on the cloud. And a virtual desktop is immersive; it looks like your computer."
Technology is critical, but for a cloud implementation to succeed, the IT department has to understand what its users need, and that means meeting with them regularly. Connolly not only sits down with each academic department; his staff also meets individually with each faculty member to track ongoing technology needs.
"We are creating an inventory of the needs of each department," he says. "That's how we do our storage projections and determine how many licenses we need for different software applications."
Today, the primary use of the private cloud is for faculty and graduate student research. The need to comply with privacy and security standards such as HIPAA is one reason why Connolly built a private cloud rather than use the public cloud.
Public cloud providers replicate data across multiple data centers, so it's unclear how many copies of the data exist. By keeping data in-house, the college has more control and can ensure researchers are in compliance with regulations, Connolly says.
"If our researchers need to delete sensitive data, we can make sure all the copies are deleted."
The private cloud also bolsters security. In the past, faculty and graduate students bought their own external hard drives to store research data in their labs. Now that the data is housed centrally in the data center and backed up every six hours, Connolly points out, it's better protected.
Connolly's team continues to make improvements to the cloud. The IT department has invested in a second set of servers and networked storage, so it can create a secondary site for disaster recovery.
Moving forward, the university's central IT division has expressed interest in investing in desktop virtualization. Connolly plans to collaborate with central IT and unify their efforts, a move that will allow his college and the university at large to provide cloud services to undergraduate students.
"One of my big goals in the next year or two is to develop a plan for convergence," he says.