Jun 02 2023
Data Analytics

How Learning Analytics Impacts Higher Education

Institutions that can successfully harness and share learning data put their students on the path to success.

Analytics have taken hold in modern American society, with so-called Big Data helping to disrupt everything from politics and baseball to the ads you’re being fed in this very browser.

Collecting, interpreting and disseminating data is not a revolutionary concept. Yet, as technology has allowed for information to be gathered and digested more quickly and easily, the field of data analytics has grown tremendously, a trend that is expected to continue. The U.S. Bureau of Labor Statistics anticipates 23 percent job growth in the field by 2031.

In higher education, colleges and universities are preparing their students for careers in the field while simultaneously collecting data on those students to help them earn their degrees and succeed. That study of student behavior and outcomes is broadly known as learning analytics; if used correctly, it can impact students, faculty, staff and administrators across campus.


What Is Learning Analytics?

The Society for Learning Analytics Research defines the field as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” The short version: learning analytics, as the name suggests, is any data-driven analysis of student learning.

Educators have been collecting data on their students for as long as they’ve been teaching. The decision to award a student a letter grade of A, B, C, D or F, based on their performance, creates a data point for someone to analyze. Contextualize those grades by considering the method with which a person was taught, their home life, their sleep habits or any number of other variables and — congratulations! — you’ve generated some very primitive learning analytics to assess.
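To make that concrete, here is a minimal sketch, in Python, of what such primitive learning analytics might look like. The records, grade scale and grouping variable are hypothetical, invented purely for illustration.

```python
# A toy illustration (not from any institution's data): contextualizing
# letter grades with one extra variable -- instruction method.
from statistics import mean

# Hypothetical records: (student, letter grade, how the course was taught)
records = [
    ("s1", "A", "online"), ("s2", "B", "online"), ("s3", "C", "in_person"),
    ("s4", "A", "in_person"), ("s5", "D", "online"), ("s6", "B", "in_person"),
]

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

# Group grade points by instruction method and compare the averages.
by_method = {}
for _, grade, method in records:
    by_method.setdefault(method, []).append(GRADE_POINTS[grade])

for method, points in by_method.items():
    print(f"{method}: mean grade points = {mean(points):.2f}")
```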

Bart Collins was working in IT at Purdue University in the early 2000s when he and his colleagues first started looking for ways to harness the information they were gathering through early learning management systems.

“We were converging on common environments and trying to develop ways of using the data that might be in those environments to solve institutional problems,” he says.


Collins, who now works as the director of graduate studies for Purdue’s Brian Lamb School of Communication, was involved in developing learning analytics–driven solutions during the early years of his tenure at the West Lafayette, Ind., university. He collected and analyzed data as part of an effort that eventually led to the creation of Course Signals, a solution launched in 2009 that tracked student progress in an effort to keep instructors and students well informed and allow for early intervention when warranted.

When it was released, Course Signals was a first-of-its-kind program, and it did more than just track students’ grades. It factored in demographic information, data collected from student activity within the course management system, the students’ prior academic histories and much more.
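The details of the Course Signals model aren’t spelled out here, so the sketch below is not its actual algorithm. It is a hypothetical, simplified example of the general pattern: several normalized inputs (LMS activity, current grade, prior GPA) combined into a single traffic-light status, with weights and thresholds invented for illustration.

```python
# A simplified, hypothetical early-warning score -- not the actual
# Course Signals algorithm -- combining a few normalized inputs into
# a single red/yellow/green status for an instructor.
from dataclasses import dataclass

@dataclass
class StudentSnapshot:
    lms_activity: float   # 0-1, share of expected LMS interactions completed
    current_grade: float  # 0-1, normalized course grade so far
    prior_gpa: float      # 0-4, prior academic history

def risk_status(s: StudentSnapshot, weights=(0.4, 0.4, 0.2)) -> str:
    """Combine inputs into one score; weights and cutoffs are illustrative only."""
    score = (weights[0] * s.lms_activity
             + weights[1] * s.current_grade
             + weights[2] * (s.prior_gpa / 4.0))
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "yellow"
    return "red"

print(risk_status(StudentSnapshot(lms_activity=0.3, current_grade=0.5, prior_gpa=2.8)))
```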

“What can we learn, uniquely, from the learning management environments that might give us better, additional, earlier insight into how students are doing?” says Collins in describing the creation of Course Signals.

While the efficacy of the solution was heavily scrutinized in the years after its release, particularly when it came to boosting student retention, the technology itself was a major step forward for learning analytics. It packaged the mountains of information institutions were collecting in an easy-to-understand way, with a targeted goal of helping both students to learn better and instructors to teach better.

How Can Learning Analytics Help Improve Student Outcomes?

As with any data set, the effectiveness of learning analytics depends a great deal on the quality and quantity of data coming in and on the careful use of that data by the people reviewing it.

To develop useful learning analytics, Collins believes the aim must be clear from the outset and that programs should be designed with the eventual outputs in mind. For example, when analyzing data taken from an LMS, the more detailed and deliberate the course design, the more granular the data you’ll be able to harvest. More granular data can help draw more specific conclusions.
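As a rough illustration of that point, consider the hypothetical LMS event log below. A thinly used course yields only a coarse activity count per student, while a course that actually uses quizzes, forums and video produces event types that can be broken out and analyzed separately. The event names and structure are invented, not drawn from any particular LMS.

```python
# Hypothetical LMS event log -- invented to illustrate granularity.
from collections import Counter

events = [
    ("s1", "login"), ("s1", "quiz_attempt"), ("s1", "forum_post"),
    ("s2", "login"), ("s2", "login"),
    ("s3", "login"), ("s3", "video_view"), ("s3", "quiz_attempt"),
]

# Coarse view: one total-activity number per student, which is roughly
# all a thinly used LMS can provide.
coarse = Counter(student for student, _ in events)
print("coarse:", dict(coarse))

# Granular view: activity broken out by type, which only exists when the
# course design actually uses quizzes, forums, video and so on.
granular = Counter(events)
print("granular:", dict(granular))
```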

“If you use the learning management system in trivial ways, you’ll get trivial information out of it,” says Collins. “Richer courses that plug in more features, have more content, enable more services and have more variation in what students are doing — that provides more diagnostically useful information.”

There are also challenges in asking the right questions of a particular set of data. Misinterpreting data, whether deliberately or not, can lead to wrong conclusions. Even asking the right questions might not be enough, because students are people, and analyzing human behavior without taking myriad known and unknown personal factors into consideration can have consequences.

“There are a lot of variables, a lot of assumptions, a lot of mitigating factors. It’s hard to even pose a clean question, in my mind,” says Collins.


One way to drown out the statistical noise is with a large enough sample size. To do that at an institutional level takes buy-in from the top down. Because learning analytics can involve combining data and resources from several arms of an institution — including academic departments, the IT department, admissions and even the registrar — it requires a major, universitywide commitment.

“It’s expensive. It takes a lot of resources,” Collins says. “It’s got to be an institutional priority, above and beyond all the other things people do.”
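To see why volume helps drown out noise, here is a generic statistical illustration using made-up numbers rather than institutional data: as the number of students in a sample grows, the run-to-run spread of an observed average shrinks, making real patterns easier to distinguish from chance.

```python
# Generic illustration of why sample size matters: the spread of an
# observed average shrinks as the number of students grows.
import random
import statistics

random.seed(0)

def observed_average(n):
    # Each "student outcome" is a noisy value around a true mean of 0.7.
    return statistics.mean(random.gauss(0.7, 0.15) for _ in range(n))

for n in (25, 250, 2500):
    runs = [observed_average(n) for _ in range(200)]
    print(f"n={n:>5}: spread of the average = {statistics.stdev(runs):.4f}")
```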

Strong institutional buy-in can lead to greater volumes of data for analysis, which is great for a single campus. However, for learning analytics to be useful to the field of higher education at large (for example, by providing answers to big questions, such as how to retain students), Collins believes the data from one campus is not sufficient to draw any sweeping conclusions.

“We don’t have enough multi-institutional, collaborative efforts around which to build a coherent approach to measuring these things,” he says.


Even if good data is fed into the analytics system, collected in large volumes and used to answer specific questions, Collins says, it’s still incumbent on deans, faculty members and students to take that information and put it into action.

If the information is properly implemented, it can do more than just boost student success, Collins says. It can allow faculty members to better tailor their lessons, leaving them more time for research and grant writing. It can also help university administrators choose the technology and teaching tools most likely to be effective for their students.

Because of all that, Collins continues to believe in the potential of learning analytics, especially in the age of online learning.

“I’m really excited, and we’re certainly in a better place today to get data, organize it and do things with it than we were 20 years ago,” he says. “Our ability to visualize, mine and connect data is much better. We should be in a better-than-ever position to answer questions about the role of analytics.”

How Have Modern Learning Analytics Evolved?

Twenty years ago, when Collins and his colleagues first began to have interest in learning analytics, higher education looked a lot different than it does today.

“It was done in the service of the traditional, brick-and-mortar classrooms that were starting to use a learning management system,” he says.

Today, nearly everyone is using an LMS of some kind, and almost all universities offer an online or hybrid learning option. Those modalities rely heavily on LMSs, some of which now have their own self-contained learning analytics tools, such as AI tutors or personalized learning pathways, that adjust to a student’s progress.

The ubiquity of online teaching and learning has also accelerated the adoption of technology by faculty across disciplines. Just about every instructor now incorporates some kind of tech into their classroom. That understanding of how tools like an LMS work can lead to better data.

“I see a whole lot more prescription around faculty use of those tools and student use of those tools because of their central role in the success of the program,” Collins says. “The data’s going to be more valuable, there’s more variation, there’s much more of it, and a significantly larger percentage of activities that affect learning outcomes are happening in those environments.”
