From cloud computing to nanotechnology, there is a long list of innovations that have emerged from world-class academic research. Providing faculty and researchers with access to IT resources and advanced research computing capabilities has been — and will continue to be — key to developing next-generation innovations.
But as universities aspire to expand their research, they also face shrinking budgets, rising expenses and heightened demands on IT. Those challenges, combined with game-changing paradigms such as Big Data, mobility, software-defined networking and information security, demand a fresh look at, and a new approach to, strengthening the link between IT and academic research. What is — or should be — IT’s role in planning or enhancing a world-class research environment in higher education?
We know the role of central IT as a service provider is shifting to a partnership role across all aspects of the institution, especially for computing-related research. Traditionally, research computing and its support have been among the most distributed services on campus, but this is changing. We have a tremendous opportunity to engage in a world of innovation and discovery that can have a truly transformational impact. And it starts with collaboration.
Many institutions that engage extensively in research provide central high-performance computing (HPC) environments and a support team to assist researchers with those tools. Researchers now rely more heavily on IT for their work, and that reliance is changing how they interact with central IT. IT teams are viewed as a value-add and are invited to participate more closely with researchers as they acquire grants, manage studies, publish findings, store data, archive publications and navigate compliance programs. I have witnessed this transition at my own institution.
My office recently partnered with the University of Arizona’s Office for Research and Discovery to establish the first centrally funded and managed Research Data Center for university researchers, faculty and students. The collaboration was a first step in offering a central resource providing HPC and high-throughput computing (HTC) systems, high-capacity data storage, a 3D immersive visualization facility, and consulting services in visualization and statistics. We established a governance committee of faculty researchers and campus IT professionals to oversee operations, establish policies and guidelines, and make recommendations for the future direction of the center.
In a novel approach, we created a “windfall” model that allows researchers and sponsored students to access the HPC/HTC systems during otherwise idle computing time. The model encourages full utilization of all processors in the systems. All centrally funded systems are available to all users with equal priority at no cost.
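The windfall idea can be illustrated with a minimal scheduling sketch: windfall jobs may start only when processors would otherwise sit idle, and they yield capacity as soon as a standard job needs it. This is an illustrative model only; the class and method names are hypothetical, not the actual scheduler the center runs.

```python
# Minimal sketch of an idle-time "windfall" allocation policy.
# Assumption: in practice a preempted windfall job would be requeued
# and restarted, not simply discarded as it is here.

class WindfallScheduler:
    def __init__(self, total_cpus):
        self.total_cpus = total_cpus
        self.standard = []   # running standard jobs: (name, cpus)
        self.windfall = []   # running windfall jobs: (name, cpus)

    def idle_cpus(self):
        used = sum(c for _, c in self.standard + self.windfall)
        return self.total_cpus - used

    def submit_windfall(self, name, cpus):
        # Windfall jobs consume only otherwise-idle capacity.
        if cpus <= self.idle_cpus():
            self.windfall.append((name, cpus))
            return True
        return False

    def submit_standard(self, name, cpus):
        # Standard jobs reclaim capacity by preempting windfall jobs.
        while cpus > self.idle_cpus() and self.windfall:
            self.windfall.pop(0)  # preempted; would be requeued in practice
        if cpus <= self.idle_cpus():
            self.standard.append((name, cpus))
            return True
        return False
```

For example, a windfall job can fill 80 of 100 idle processors, and a later standard job needing 50 preempts it and runs immediately, so no funded work ever waits on opportunistic use. Production HPC schedulers implement this pattern with preemptible low-priority partitions or queues.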
Increased processing power is key. In April 2014, the National Science Foundation awarded a $1.3 million grant to fund the implementation of a new supercomputer on campus, the Extremely LarGe Advanced TechnOlogy system, known as “El Gato,” which ranks among the world’s top 500 fastest supercomputers. El Gato’s processing power is more than 13 times that of the previous generation of our HPC systems, allowing researchers to run more complex computations faster and obtain more detailed results.
While every university is unique, with its own budget, resources and staff, I have a few recommendations for achieving a more cohesive relationship between IT and the research environment: