Jun 20 2019

Breaking Down Data Governance: Data Quality

The success of any data-driven initiative depends on that data being relevant and trustworthy.

As more colleges and universities look to data as the linchpin of responsible strategizing and planning, many are coming to the same realization: some data is better than other data, and bad data can be even worse than no data at all.

According to one recent whitepaper, more than 6 out of 10 higher education institutions are either considering launching a campus data analytics program or already have such a program in place. 

Most see the insights they can gain from data as critical to improving operational efficiency, but a growing number also use data and analytics to help drive student success. Across all of these initiatives, the people who manage and use data depend on it being relevant and reliable.

But this component of data governance — the assurance of data quality — can be a difficult proposition for any institution. Given the sheer volume and variety of data available, how can one know what is good and what is not?

The answer, experts say, involves a multipronged approach: deploying specialized tools to streamline data collection and committing to best practices through professional support and training.

In a 2018 study on data governance in higher education, Cary Jim and Hsia-Ching Chang write that data quality is the “foundation of the data-driven decision-making process.” 

Data must be “genuine and trustworthy,” the authors note, or “the output will be misleading and ineffective.” Here’s an overview of the data quality issues many universities face as they turn to analytics, and a closer look at how they might be resolved.

MORE FROM EDTECH: Check out how universities are using data analytics to support academic advising.

Data Quality Problems in Higher Education

Perhaps the biggest data quality challenge institutions face is the vast array of potential data sources.

Whether they’re running a data analytics initiative or not, universities collect data on anything and everything, from student demographics to on- and off-campus housing to a wide range of business variables that impact overall performance. 

As Chris Frederick, business intelligence manager for the University of Notre Dame, explains in a 2018 Q&A with EdTech, “if you were to ask how many students we have, different groups might give you five different answers.” 

Those groups all have to be “on the same page,” Frederick says, to ensure they are “working from a source that we all trust and agree on.”

Illustrating a similar point, the education consulting company EAB describes what it believes are the primary data quality challenges on the academic front alone. 

Among them, the firm reports, are inaccurate tabulations of individual instructors’ responsibilities, outdated or disorganized professor and department codes, and inconsistencies in the way institutions document maximum capacity for course sections. 

This last issue, for example, can result in the miscalculation of class section fill rates. That, in turn, can lead to surplus sections and result in “wasted resources,” states EAB.
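To see how that plays out, consider a toy calculation (the course sections and numbers below are invented for illustration): a single stale capacity value makes a nearly full section look half empty.

sections = [
    {"section": "BIOL 101-01", "enrolled": 48, "capacity": 50},   # capacity is current
    {"section": "BIOL 101-02", "enrolled": 48, "capacity": 120},  # capacity never updated after a room change
]

for s in sections:
    # Fill rate is simply enrollment divided by documented capacity.
    print(f"{s['section']}: {s['enrolled'] / s['capacity']:.0%} full")

# Prints 96% for the first section but only 40% for the second. The second looks
# underenrolled, so planners might open exactly the kind of surplus section EAB warns about.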

MORE FROM EDTECH: See how universities are addressing the complexities of data analytics programs.

Data Governance in Higher Education Can Solve the Data Quality Dilemma

According to IBM, high-quality data includes four “key attributes”: 

  • Completeness: Related data must be linked from all possible sources.
  • Accuracy: Data must be correct and consistent, with no misspellings, for example.
  • Availability: Data must be available upon demand.
  • Timeliness: Current data must be available. 
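Tools aside, these attributes can be checked directly. The snippet below is a minimal sketch, not IBM’s definition rendered into code: it scores a hypothetical pandas table of student records (the column names student_id, email and last_updated are invented for illustration) against the four attributes.

import pandas as pd

def quality_report(df: pd.DataFrame, expected_rows: int) -> dict:
    """Score a table of records, 0 to 1, on each of the four attributes."""
    today = pd.Timestamp.today()
    return {
        # Completeness: share of required fields that are actually populated.
        "completeness": 1 - df[["student_id", "email", "last_updated"]].isna().mean().mean(),
        # Accuracy: share of values matching an expected pattern (well-formed emails).
        "accuracy": df["email"].str.match(r"[^@\s]+@[^@\s]+\.[^@\s]+").fillna(False).mean(),
        # Availability: share of the rows the source was expected to deliver on demand.
        "availability": min(len(df) / expected_rows, 1.0) if expected_rows else 0.0,
        # Timeliness: share of records refreshed within the last 30 days.
        "timeliness": (today - pd.to_datetime(df["last_updated"]) < pd.Timedelta(days=30)).mean(),
    }

A report like this would typically run on a schedule, with low scores routed to whoever owns the affected source.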

Because colleges and universities deal with so much data, most choose to implement data quality tools — see Gartner’s 2019 Magic Quadrant for top vendors — that automate much of the quality assurance process through data cleansing, matching, monitoring and other means.
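What that automation looks like varies by vendor, but the core cleansing and matching steps are conceptually simple. The snippet below is a rough, product-agnostic sketch in pandas, run against an invented extract of student records.

import pandas as pd

students = pd.DataFrame({
    "student_id": ["1001", "1001", "1002"],
    "name": ["Ada Lovelace ", "ada lovelace", "Grace Hopper"],
    "email": ["ada@example.edu", "ADA@EXAMPLE.EDU", "grace@example.edu"],
})

# Cleansing: normalize casing and whitespace so equivalent values compare equal.
students["name"] = students["name"].str.strip().str.title()
students["email"] = students["email"].str.strip().str.lower()

# Matching: collapse records that now clearly refer to the same person.
deduped = students.drop_duplicates(subset=["student_id", "email"])
print(deduped)  # two rows remain; the duplicate Ada Lovelace record is gone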

Beyond the available IT solutions, the surest path to data quality assurance typically involves a combination of education and collaboration.

At Purdue University, for example, the Office of Institutional Research, Assessment and Effectiveness has established several committees dedicated to data standards and governance, and even maintains a data quality subcommittee focused on finding solutions to data quality problems. 

And at Vanderbilt University, which relies on “automated data quality processes,” including IT systems that identify data entry errors, all data issues are ultimately addressed by the institution’s data governance team.

The team’s mission, according to the university, entails “establishing data governance policies, procedures, standards, and guidelines” to maximize the value of Vanderbilt’s data.

EAB, for its part, suggests IT professionals and faculty work together to establish policies and processes that drive universities toward better data management. 

Data governance, the company says, should fall under the purview of two main groups: a “prioritization committee” of executives, and a “definition- and access-focused committee of technologists and data custodians” representing the institution’s various departments. 

EAB and others in the industry recommend asking all stakeholders to play a role in data governance; do that, they say, and those nagging “data quality issues” will soon become opportunities for institutional growth.
