
Jan 02 2026
Artificial Intelligence

Is Your AI Falling Short? It’s Probably Bad Data

A strong data governance foundation is essential for higher education institutions to deploy trustworthy, effective artificial intelligence.

Higher education institutions are excited about the potential of artificial intelligence, but they’re often unprepared for what it demands of their data. Effective AI in higher ed isn’t just about deploying new tools. It’s about understanding, governing and operationalizing your data so that AI can be trusted, adopted and scaled.


AI Adoption Starts With Data Governance

When institutions ask us about AI adoption, they often start with the tools. Which platform is the best? How can they integrate tools like Copilot or Gemini into their existing environments? Should they build their own models?

But implementing AI means taking a step back from tool selection and focusing on your foundation, and the foundation of all AI tools and models is data. Institutions must first understand the data they have; where it's located; who owns it; and how it is classified, secured and maintained.

Unclassified sensitive data, data with confusing or inconsistent definitions, and unsecured or poor-quality data can all produce unexpected, and sometimes dangerous, outcomes when fed into a large language model.

WATCH: Industry experts discuss AI’s 2026 trajectory. 

Embedded AI vs. Custom AI 

There are two broad flavors of AI in higher ed: embedded AI, in which capabilities are built into tools institutions already use, and custom AI, in which models and solutions are built on a college or university’s own institutional data. They require different levels of scrutiny, but both depend on the same underlying truth. If your data is wrong, out of date, poorly classified or badly secured, the AI will amplify those problems.

Even in embedded tools, we see trust issues. If AI drafts content in the wrong voice, uses a different language variant, or surfaces outdated or irrelevant information, users quickly lose confidence and disengage. That’s an operational problem as much as a technical one: if people don’t trust the outputs, they won’t adopt the tools, undermining your ROI.

In higher education, custom AI solutions could include student success models, retention analytics or personalized advising, and here the stakes are even higher. Low-quality data leads to bad predictions, which lead to bad decisions. In a student’s life, that might mean misaligned course recommendations, missed risk signals for retention or an inaccurate assessment of support needs.

RELATED: Discover affordable strategies to strengthen your cybersecurity maturity.

What Data and AI Governance Look Like in Higher Ed

When we talk about data governance in universities, we’re talking about much more than a committee or a policy document. Governance defines where the data sits, who owns it and its level of quality. That sounds simple, but in higher ed, this can be a challenge.

In practice, poor data governance could look like inconsistent representations of “United States” in a dataset (“USA,” “US” and “United States of America,” for example) all treated as separate values. It could look like a confusing form field that means different things to different people. It could involve individual departments pulling and sharing their own data without considering the institutionwide implications. 
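The country-name example above can be made concrete with a small sketch. This is a hypothetical illustration, not a CDW tool: the alias table and function names are invented, and a real governance program would maintain canonical value lists centrally rather than in ad hoc scripts.

```python
# Hypothetical sketch: collapsing inconsistent representations of the same
# country into a single canonical value. The alias table is illustrative.

COUNTRY_ALIASES = {
    "usa": "United States",
    "us": "United States",
    "u.s.": "United States",
    "united states of america": "United States",
    "united states": "United States",
}

def standardize_country(value: str) -> str:
    """Map known aliases to one canonical label; pass unknown values through."""
    key = value.strip().lower()
    return COUNTRY_ALIASES.get(key, value.strip())

records = ["USA", "US", "United States of America", "Canada"]
cleaned = [standardize_country(r) for r in records]
print(cleaned)  # four rows, but only two distinct countries remain
```

Without this kind of standardization, a report grouping by country would silently count one nation as three.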

Governance puts structure around this to eliminate confusion and create a single source of truth when it comes to definitions and expectations. It also addresses access controls, indicating who is allowed to see what, how sensitive data is protected and what is off-limits for AI use.
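The access-control side of governance can be sketched in a few lines. The roles and classification labels below are assumptions for illustration; institutions would implement this in their identity and data platforms, not in application code.

```python
# Hypothetical sketch of governance-style access rules: each role maps to
# the data classifications it may read. Role and label names are invented.
ROLE_ACCESS = {
    "registrar":   {"public", "internal", "restricted"},
    "advisor":     {"public", "internal"},
    "ai_pipeline": {"public"},  # sensitive data stays off-limits for AI use
}

def can_read(role: str, classification: str) -> bool:
    """Return True if the role may access data at this classification level."""
    return classification in ROLE_ACCESS.get(role, set())

print(can_read("advisor", "restricted"))   # False
print(can_read("ai_pipeline", "public"))   # True
```

Note that the AI pipeline is treated as just another principal with an explicit, limited grant, which is how "what is off-limits for AI use" becomes enforceable rather than aspirational.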

AI governance builds on that by focusing on how models are trained, evaluated and used. This ensures institutions are training AI models on data that they are allowed to use, that this data is fresh, and that there is a way to track and correct harmful or biased outcomes. AI governance also ensures that an institution can explain and defend how important decisions are being made if necessary.

SUBSCRIBE: Sign up to get the latest EdTech content delivered to your inbox weekly.

 

Siloed Data, Centralized Control and Departmental Freedom

Higher ed is notoriously siloed, with departments, colleges and administrative units acting as separate entities. For AI and data analytics, that poses a huge challenge.

Sometimes institutions try to solve this by giving users broad access to institutional data and telling departments to figure it out themselves. This approach carries risks: inconsistencies around data blending, definitions, metrics and security can all lead to poor outcomes.

Governance doesn’t have to mean locking everything down; it means centralizing control and standards while enabling broader, safer access. This could look like a centralized data platform where institutional data is curated, documented and secured; clear data products or predefined datasets for common questions; and self-service access for departments, within defined guardrails.

This is how you support a scenario such as an academic department asking, “Can we justify a new 400-level course?” The department should be able to get a high-quality, governed dataset, rather than stitching together mismatched exports and hoping for the best.

DISCOVER: AI agents are transforming student services and support.

At CDW, when we engage with a university on data and AI, we don’t start with a massive, yearslong transformation plan. We start by helping you understand where you are today and what the future could look like.

In our data governance workshop, we sit down with key institution stakeholders to identify their information lifecycle; current governance structures; and means for determining stewardship, metadata, access and quality. Then we define goals and outline a realistic roadmap to get there.

The next step is a data quality assessment, in which we address accuracy, completeness, consistency, timeliness, refresh cycles and alignment with institutional definitions. The goal is to identify the most impactful gaps and prioritize remediation so future AI and analytics efforts have a solid foundation.
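The quality dimensions named above can be expressed as simple, measurable checks. The sketch below is a toy illustration with invented records and function names; a real assessment would run such checks against the institution's actual systems of record.

```python
# Hypothetical sketch of basic data quality checks along three of the
# dimensions above: completeness, consistency and timeliness.
from datetime import date

students = [
    {"id": 1, "email": "a@example.edu", "major": "CS", "updated": date(2025, 12, 1)},
    {"id": 2, "email": None,            "major": "cs", "updated": date(2024, 1, 15)},
    {"id": 3, "email": "c@example.edu", "major": None, "updated": date(2025, 11, 20)},
]

def completeness(rows, field):
    """Share of rows with a non-empty value for `field`."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def consistency(rows, field):
    """Distinct values after case-folding vs. raw; a gap signals drift."""
    raw = {r[field] for r in rows if r.get(field)}
    return len({v.lower() for v in raw}), len(raw)

def stale(rows, field, cutoff):
    """IDs of rows not refreshed since `cutoff` (timeliness)."""
    return [r["id"] for r in rows if r[field] < cutoff]

print(completeness(students, "email"))               # 2 of 3 rows populated
print(consistency(students, "major"))                # "CS" and "cs" collapse to one value
print(stale(students, "updated", date(2025, 1, 1)))  # record 2 has not been refreshed
```

Scoring each dimension this way is what makes it possible to "identify the most impactful gaps and prioritize remediation" rather than guessing.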

From there, we help institutions design and build an ecosystem that supports sustainable governance and trustworthy AI.

AI implementation is a journey, but without taking those first, intentional steps toward governance, AI in higher education will remain more promise than practice.

This article is part of EdTech: Focus on Higher Education’s UniversITy blog series featuring analysis and recommendations from CDW experts.

Getty Images: Aree Sarak