What, and Who, Is Involved in AI Governance?
AI governance should include ethical guardrails, and in higher education, that could mean ensuring academic integrity or data privacy, according to Fadi Fadhil, director of public sector field strategy at Palo Alto Networks. He would like to see universities give AI compliance “teeth” to prevent unauthorized use of AI tools.
“The reason we do this is to ensure ethical, legal and secure, mission-aligned use across the three legs of teaching, research and administration,” Fadhil says. “This helps us establish standards and oversights and, most important, accountability for AI implementation decisions.”
Fadhil recommends that universities treat AI governance as an institutional cultural norm rather than a silo or a passing fad. That involves creating “ongoing feedback loops,” which would look different for undergraduates, graduate students, faculty, researchers and staff, he says.
The president, provost or board sets the strategic direction, while a steering committee conducts cross-functional work with academic affairs. Faculty must interpret the educational use cases and provide guidance on AI use. Meanwhile, students must also have a voice in AI policies for the classroom, according to Fadhil.
WATCH: Industry experts share the biggest AI trends in 2026.
How to Create Clear AI Usage Policies for Faculty and Students
A growing number of universities consider acceptable use policies around AI to be important to the development of a responsible AI framework. The number of institutions with AI-related AUPs rose from 23% in 2024 to 39% in 2025, according to EDUCAUSE’s 2025 AI Landscape Study.
“Institutions should establish clear unit- or department-level AI policies,” says Jenay Robert, senior researcher at EDUCAUSE. “Specifically, establish AI policies that align with institutional guidelines, maintain compliance and balance structure with professional autonomy.”
Only 9% of respondents considered their institutions’ cybersecurity and privacy policies to be adequate to address AI-related risks, according to the EDUCAUSE study. Academic integrity policies can mitigate AI-related risk and strengthen AI governance, Robert says.
Establishing AI governance in education involves compliance with the Family Educational Rights and Privacy Act, the Americans with Disabilities Act and the Cybersecurity Maturity Model Certification, according to Fadhil. He recommends introducing a rigorous data governance framework that defines ownership, quality and access control for data.
“This aligns with adoption of a zero-trust mindset and actionable security versus collecting a lot of logs and not many people being able to act on it,” he says.
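To make that idea concrete, here is a minimal, hypothetical sketch of what a data governance record could look like in code. The dataset names, roles and owner address are invented for illustration; the point is that ownership, classification and access rules are defined explicitly, access is denied by default in a zero-trust spirit, and a denial is routed to someone who can act on it rather than buried in a log:

```python
from dataclasses import dataclass, field


@dataclass
class DatasetPolicy:
    """Hypothetical record tying a dataset to an owner, a classification and an allow-list."""
    name: str
    owner: str                      # accountable steward for the data
    classification: str             # e.g., "public", "internal", "restricted"
    allowed_roles: set[str] = field(default_factory=set)


def check_access(policy: DatasetPolicy, user: str, role: str) -> bool:
    """Deny by default: access is granted only when the role is explicitly allowed."""
    if role in policy.allowed_roles:
        return True
    # Actionable outcome instead of a passive log entry: route the denial to the data owner.
    print(f"ALERT to {policy.owner}: {user} ({role}) denied access to "
          f"'{policy.name}' [{policy.classification}]")
    return False


# Example usage with made-up datasets and roles
grades = DatasetPolicy(
    name="student_grades",
    owner="registrar@example.edu",
    classification="restricted",
    allowed_roles={"registrar", "advisor"},
)

check_access(grades, "ai-tutor-service", "third_party_ai")  # denied, owner alerted
check_access(grades, "jsmith", "advisor")                   # allowed
```

In practice, an institution would likely enforce these rules through its identity and access management platform or a policy engine rather than application code, but the design choice is the same: data ownership, quality expectations and access control are written down and enforceable, not implied.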
In addition, universities should conduct annual policy reviews on AI with stakeholders across campus, Fadhil advises.
DISCOVER: Data governance is a key element of AI implementation.
Robert recommends that AI governance involve professionals across an institution, including IT, data security and privacy, academic units, pedagogy experts and business units.
Molina also sees AI governance as an effort that must involve professors, academic leaders, administrators, technologists and in-house lawyers.
“In teaching, always consider the benefits and drawbacks for all students,” Molina says. “For research, be mindful of methodology, grant, publication and protocol requirements. Administrators must be particularly careful about compliance and regulations.”
However, Molina notes that students are rarely consulted in AI governance. He shares how clear usage policies are established at Drexel:
“At Drexel University, several of us — some administrators, but mostly faculty members — serve on the Standing Committee on AI under the leadership of Professor Steven Weber. We stand on the shoulders of giants, so we read what other organizations have written about this before us. Few fully understand what AI is, does or can do, so we invite experts, both in-house and external, to help us think about these issues.”
Molina adds that universities should look for AI problems such as biases, fabrications and hallucinations.
