2. Supercomputing
A laptop has one central processing unit (CPU), but the Sol supercomputer at Arizona State University runs on thousands of them.
“Think about your iPhone, which may have 6 gigabytes of memory to run programs. Sol contains five large memory nodes with 2 terabytes each — that’s 2,000 gigabytes of system RAM,” says Douglas Jennewein, senior director of the Research Technology Office at ASU’s Knowledge Enterprise.
Sol also employs hundreds of graphics processing units (GPUs), which are used for machine learning as well as 3D modeling and simulation.
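The appeal of GPUs is parallelism: a matrix multiply that a CPU works through largely serially can be split across thousands of GPU cores at once. A minimal sketch of that idea, assuming the open-source PyTorch library (the article does not specify any particular software):

```python
# Minimal sketch: the same matrix multiply, dispatched to a GPU if one
# is present. Assumes PyTorch is installed; sizes are arbitrary.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # massively parallel on a GPU, much slower on a lone CPU
print(f"multiplied two 4096x4096 matrices on {device}")
```

On a cluster node, the device string would select one of several attached GPUs.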
The University of Michigan also runs advanced computing infrastructure: its Great Lakes high-performance computing cluster contains 13,000 CPUs and assists researchers with projects involving simulations, modeling, machine learning, data science and genomics.
WATCH NOW: Accelerating university research with network upgrades.
3. Data Storage
As humans, we generate vast amounts of information in our lifetimes, and this ever-growing collection of data needs to be available for research. At the University of Michigan, data stores increase by 1 petabyte every two months. That’s 1 million GB every other month, all of which requires both primary and backup storage.
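A quick back-of-the-envelope calculation puts that growth rate in perspective. This sketch uses only the figures above; the decimal convention (1 PB = 1,000,000 GB) and the flat two-copy replication factor are simplifying assumptions:

```python
# Back-of-the-envelope sketch of storage growth at the rate cited above.
# Assumes decimal units and a constant growth rate; real growth and
# replication policies will vary.

GROWTH_PB_PER_MONTH = 0.5   # 1 PB every two months
REPLICATION_FACTOR = 2      # primary copy plus one backup

for years in (1, 3, 5):
    new_data_pb = GROWTH_PB_PER_MONTH * 12 * years
    total_footprint_pb = new_data_pb * REPLICATION_FACTOR
    print(f"{years} yr: {new_data_pb:.0f} PB of new data, "
          f"~{total_footprint_pb:.0f} PB with backups")

# Output:
# 1 yr: 6 PB of new data, ~12 PB with backups
# 3 yr: 18 PB of new data, ~36 PB with backups
# 5 yr: 30 PB of new data, ~60 PB with backups
```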
Institutions rely on a mix of on-premises storage and cloud-based services, neither of which is cheap, to solve the storage problem. For example, although the cloud is popular, it’s not always the answer for research. “Providing the right kind of storage ecosystem has almost become an art,” says Pendse.
The cloud is not the best option for constant, daily downloading and uploading of data, because providers typically charge to move data out of their systems. That usage pattern drives the cost up significantly.
“Where you store the data is really dependent on how you’re using the data,” says Shafaq Chaudhry, director of research technology at the University of Central Florida.
Factors to consider include how often data needs to be accessed and what other services, such as workflow automation, are needed.
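The tradeoff Chaudhry describes can be made concrete with a toy cost model. The per-gigabyte rates below are hypothetical placeholders, not real vendor pricing, but they show why access frequency tends to dominate the decision:

```python
# Illustrative monthly cost comparison for storing and accessing research
# data. All rates are hypothetical placeholders, not real vendor prices.

def monthly_cost_cloud(stored_gb: float, egress_gb: float,
                       storage_rate: float = 0.02,   # $/GB-month, made up
                       egress_rate: float = 0.09) -> float:  # $/GB out, made up
    """Cloud cost grows with how much data leaves the provider each month."""
    return stored_gb * storage_rate + egress_gb * egress_rate

def monthly_cost_onprem(stored_gb: float,
                        amortized_rate: float = 0.03) -> float:  # $/GB-month, made up
    """On-prem cost is roughly flat: amortized hardware, power and staff."""
    return stored_gb * amortized_rate

stored = 100_000  # 100 TB of project data

# A cold archive touched rarely vs. an active dataset re-read constantly.
for label, egress in (("archival (1% read/month)", stored * 0.01),
                      ("active (reprocessed 10x/month)", stored * 10)):
    print(f"{label:32s} cloud ${monthly_cost_cloud(stored, egress):>10,.0f}"
          f"  on-prem ${monthly_cost_onprem(stored):>8,.0f}")
```

With these made-up rates, the cold archive is cheaper in the cloud, while the heavily reprocessed dataset costs an order of magnitude more there than on-premises, which is exactly the point that where you store the data depends on how you use it.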
ASU uses high-speed, parallel file storage to support simultaneous data analysis on supercomputers. With 18,000 CPUs, Sol can access information and run multiple simulations or analyses at the same time.
“If you’re doing massive, distributed simulations, you have to have all these different CPUs writing data, presenting simulation data or reading data analytics from the same location at the same time. It’s very high-speed, high-capacity data storage, which lends itself well to advanced computing,” says Jennewein.
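The access pattern Jennewein describes, many processes writing to one shared location at once, can be mimicked on a single machine with Python’s multiprocessing module. On a cluster, the same pattern runs across thousands of CPUs against a parallel file system; the paths and placeholder workload below are invented for illustration:

```python
# Minimal sketch: many workers writing simulation output to one shared
# location at the same time, the access pattern parallel file systems
# are built for. multiprocessing stands in for a cluster's many CPUs;
# the directory name and the "simulation" are made up.

import multiprocessing as mp
from pathlib import Path

OUTPUT_DIR = Path("shared_scratch")  # stand-in for a parallel file system mount

def run_simulation(task_id: int) -> str:
    """Fake simulation step: each worker computes and writes its own shard."""
    result = sum(i * task_id for i in range(1_000_000))  # placeholder workload
    shard = OUTPUT_DIR / f"sim_{task_id:04d}.out"
    shard.write_text(f"task={task_id} result={result}\n")
    return shard.name

if __name__ == "__main__":
    OUTPUT_DIR.mkdir(exist_ok=True)
    with mp.Pool(processes=8) as pool:  # thousands of ranks on a real cluster
        written = pool.map(run_simulation, range(32))
    print(f"{len(written)} shards written to {OUTPUT_DIR}/")
```

On a real cluster, each of these workers would run on a different node, and the shared directory would sit on high-speed parallel storage rather than a local disk.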
LEARN MORE: Integrated storage solutions reduce silos for researchers.
4. Secure Infrastructure
Every project requires a different level of security. Researchers often must work within the constraints of grant requirements, federal government contracts, a sponsoring company’s proprietary data rules or data privacy regulations such as HIPAA. In addition, even when research is based on public data, it is still considered intellectual property that needs to be protected from tampering or theft.
Maintaining a heightened level of cybersecurity is paramount. “The floor of cybersecurity is starting to be closer to the roof,” says Chad Macuszonok, assistant vice president of research IT at the University of Central Florida.
5. Common Licenses
It might come as a surprise to see Zoom, Microsoft OneDrive and Google Drive listed as research tools, but these commonly licensed applications are just as important in facilitating research. “Having these types of tools that people don’t always think about becomes critical,” says Pendse.
Adobe Creative Cloud, for example, is useful for documenting research and illustrating results.
6. People
Arguably the most important tool at research universities, the people who make up research technology departments are experts in coding, data analysis and storage, computing, and a host of other niche tech domains. Their job is to help faculty use the high-end tools that power academic research.
“Whereas a typical IT department would set up general-purpose tools and areas that anyone can use, our team supports research,” says Chaudhry. For example, she says, every project must be considered on a granular level, looking at what capabilities the researcher needs — such as the ability to collaborate with other institutions — and how the data needs to be formatted.
While faculty devote themselves to their research, the support team handles the logistics of using the high-end technology, such as grant directives and security requirements, coding and software development, data analysis and system engineering.
DISCOVER: Scaling the future of research computing in the cloud.
Also, given their bird’s-eye view of research across an institution, research technology staff can often help connect researchers from different disciplines who may be working on similar problems, find local organizations that researchers can partner with, or identify resources at other institutions that faculty can use.
“As research facilitators, we have to be aware of not only what the university has, but also what’s available regionally and nationally, while understanding what the research requires,” says Chaudhry.
Every faculty member has unique needs and a unique project, which is where this “dream team” comes into play, explains Pendse.
“Our goal is to support world-class research and teaching using frictionless interfaces,” he says. “We want to ensure that faculty members are focused on their core competencies and don’t have to worry about figuring out the technology. How can we meet you where you are? We want to provide user-centric, researcher-centric technology.”