Nov 16, 2021

What Is Edge Computing, and How Can It Be Leveraged for Higher Ed?

A distributed computing framework offers a faster and cheaper approach to processing Big Data.

Edge computing adoption is on the rise. As noted by InfoWorld, 50 percent of organizations plan to deploy edge computing solutions within the next 18 months to help manage Internet of Things (IoT) devices, improve data processing and capture actionable insights.

The shift might make sense from a business perspective, but in what scenarios could higher education benefit from edge networks? Here’s a look at what edge computing can deliver for colleges and universities.

What Is an Edge Computing Network?

To understand potential use cases for higher education, it is important to first understand what edge computing is.

“Edge computing is an architecture,” says ISACA ambassador Ramsés Gallego. “In the world of distributed computing, it brings infrastructure such as servers and network devices closer to data.”

In practice, edge networks move hardware such as computing and storage servers closer to the people and processes that produce data. This shortens the distance that data must travel to get processed and stored. Instead of sending data to a centralized cloud or onsite server for processing, the processing occurs when the data is collected.

Edge server systems eliminate the need for a computing middleman and reduce total latency.
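To make the idea concrete, here is a minimal sketch of what "processing where the data is collected" can look like. It assumes a hypothetical temperature sensor feed; the edge node aggregates a batch of raw readings locally and forwards only a compact summary, rather than shipping every reading to a central cloud.

```python
from statistics import mean

def summarize_at_edge(readings, threshold=30.0):
    """Aggregate raw sensor readings locally, returning only the
    compact summary that would be sent upstream to the cloud."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > threshold),
    }

# A minute of hypothetical temperature readings from one sensor
raw = [22.1, 22.4, 22.3, 31.2, 22.5, 22.2]
summary = summarize_at_edge(raw)
print(summary)  # one small dict crosses the network instead of six readings
```

The payoff is in the last line: six raw readings never leave the edge node, so the upstream link carries one small summary instead, which is the latency and bandwidth saving the architecture is built around.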

Edge Computing vs. Cloud Computing: What’s the Difference?

There’s a common misconception that cloud and edge computing are synonymous because many cloud providers — such as Dell, Amazon Web Services and Google — also offer edge-based services. For example, an edge cloud architecture can decentralize processing power to a network’s edge.

But there are key differences between cloud and edge computing. “You can use cloud for some of the edge computing journey,” Gallego says. “But can you put edge computing in the cloud? Not really. If you put it back in the cloud, it’s not closer to the data.”

Gallego notes that while cloud services have been around for more than a decade, edge computing is still considered an emerging technology. As a result, colleges and universities often lack the in-house skills and capabilities to make use of this technology. If that’s the case, an institution may want to work with a partner to help it get started.


What Is Edge Computing Used for in Higher Ed?

The most common use case for edge computing is supporting IoT capabilities. By bringing servers closer to connected sensors and devices, institutions can leverage Big Data to gain actionable insights more quickly.

By placing cloud resources in edge environments, institutions can also cut costs by reducing the distance that data must travel. On an increasingly connected campus, edge computing can likewise help reduce bandwidth requirements.

As campuses prepare to support the next generation of students (the children of millennials), edge computing will play a key role in bolstering campus networks. Sometimes called “Generation AI,” this cohort will be using AI technologies in almost every aspect of their lives. To support the exponential growth of AI-enabled IoT devices connecting to campus networks, universities and colleges will need 5G networks and mobile edge computing.


What Are Some Examples of Edge Computing?

Edge solutions make it possible for post-secondary campuses to adopt what Gallego describes as a three-tiered computing model: on-premises, at the edge and in the cloud, with each fulfilling a specific purpose.

Onsite servers might be used to securely store confidential financial or research data, while the cloud underpins hybrid and remote learning frameworks. Edge computing, meanwhile, offers benefits for data-driven research, especially time-sensitive research projects that require immediate data processing.


“Edge computing is beneficial for anything related to research or high computational needs,” says Gallego. “That includes machine learning, DNA research, quantum computing and healthcare research.” In the case of machine learning, unsupervised learning frameworks that train algorithms to recognize anomalous behaviors are better deployed close to the data they rely on.
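As a rough illustration of the kind of unsupervised anomaly detection an edge node could run on local data, here is a minimal sketch using hypothetical per-minute request counts: a baseline (mean and spread) is learned from normal traffic, and new values far outside it are flagged. The traffic numbers and the z-score cutoff are assumptions for illustration, not a production detector.

```python
from statistics import mean, stdev

def fit_baseline(samples):
    """Learn a simple baseline (mean and spread) from normal traffic,
    the kind of unsupervised step an edge node can run on local data."""
    return mean(samples), stdev(samples)

def is_anomalous(value, baseline, z_cutoff=3.0):
    """Flag values that fall far outside the learned baseline."""
    mu, sigma = baseline
    return abs(value - mu) > z_cutoff * sigma

# Hypothetical per-minute request counts observed at an edge node
normal_traffic = [101, 98, 104, 99, 102, 100, 97, 103]
baseline = fit_baseline(normal_traffic)
print(is_anomalous(100, baseline))  # typical value
print(is_anomalous(450, baseline))  # spike worth flagging
```

Running this check where the traffic originates means an anomaly can be flagged immediately, without first shipping the raw event stream to a distant data center.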

He also points to the benefits of edge computing in creating a larger computing network that can be shared by two or three campus sites rather than duplicating infrastructure for each one. “Schools have oceans of data and waves of information,” says Gallego. “Edge solutions play especially well in a collaborative framework.”

According to Gallego, edge computing in higher ed is all about finding the ideal place for specific operations. “The cloud or on-premises solutions may be effective,” he says, “but they’re not always efficient.”

Potential Pitfalls in Edge Adoption

Security and complexity are the two most common pitfalls in edge computing adoption — and they often go hand in hand. While edge deployments naturally increase the scalability, reliability and on-demand availability of higher ed networks, security is not inherent. When new devices, services and connections are added, infrastructure complexity and attack surfaces increase.

“We live in the age of IoT,” says Gallego. “It’s the Internet of Things, but it can also become the Internet of Threats. When you bring in a new platform, you need the right processes, protocols and procedures. You need to harden the platform. From a security perspective, this means ensuring things like encryption and tokenization. You need to be sure that it can be trusted.”

Still, Gallego says, he believes edge computing is the foundation of next-generation education initiatives. “We can be more intelligent. We can be agents of change. And we can be more generous in the way we share and interact with data,” he says.
