
Dec 10 2025
Artificial Intelligence

AI in Higher Education: A Guide for Teachers

Discover how higher ed leaders are using AI in academia to boost engagement, efficiency and critical thinking.

For many faculty members in higher ed, conversations about artificial intelligence in academia often include the same concerns: There isn’t enough time in the day, AI will erode critical thinking, educators are already stretched thin, and data and privacy could be compromised.

The list of fears and frustrations from faculty goes on, but as universities explore the benefits of generative AI in higher education and look to the future of their classrooms and what’s best for students, it’s obvious that AI needs to find a place on the syllabus.

“At a time when everybody’s overwhelmed, having to do more new things is hard,” says Laura Morrow, senior director for the Center for Teaching and Learning at Lipscomb University. “Fear of what’s going to happen is a big barrier.”

For institutions beginning this journey, practical guidance matters just as much as philosophical alignment.

From teaching center leaders and college deans to faculty fellows in AI, universities are finding success in a grassroots movement: creating safe spaces for faculty to try new tools and even allowing them time to grieve the loss of the classroom they’ve long known and the favorite assignments they’ve always turned to.

The key to turning the tide on teaching AI in academia lies in returning to the basics: focusing on strong, sound pedagogy and learning to see AI as a new tool that can add to what’s in labs and lecture halls.

The Rise of AI in Higher Education Classrooms

In 2022, higher education began to see ripples — the fears, concerns and excitement about AI. Morrow recalls that the first message they shared with faculty focused on helping students realize and understand the responsibility that comes with using AI tools.

Across the country, students were already experimenting with AI in ways that shaped these early conversations. Many were using AI to summarize dense readings, generate study prompts, brainstorm ideas or refine early drafts of papers.

Other students used these AI tools to search for information or ask nuanced questions they might hesitate to raise in class. Naming these behaviors helped faculty frame responsible use not as a theoretical exercise but as a response to real student habits.

LEARN MORE: How some universities are integrating AI ethically.

Morrow and her team gave faculty the time to vent about the shifting landscape of higher education. In this safe space, faculty advocates started to come forward. These early adopters helped demystify the tools, debunking misconceptions and showing how they could be used in the classroom safely, ethically and without sacrificing the rigor of the learning experience.

At institutions like Arizona State University, students were already engaging with AI-powered tutoring systems that support math and writing. Meanwhile, at Georgia State University, AI chatbots answer routine advising and financial aid questions, which frees staff to focus on more complicated interactions.

As early adopters experimented, a clearer picture emerged of what ethical, student-centered AI adoption should look like in practice, supported by real use cases from institutions actively incorporating AI into their learning ecosystems.

Universities such as Lipscomb University and Dartmouth College have begun establishing ethical AI frameworks that guide everything from first-year writing support to discipline-specific applications in engineering, business and digital media.

“Really good pedagogy is the antidote for using AI.”

Laura Morrow, Senior Director of the Center for Teaching and Learning, Lipscomb University

The rising voices at Lipscomb gave way to more faculty feeling comfortable, says Adam Wilson, assistant director at the Center for Teaching and Learning: “This was really faculty-driven, very much from the bottom up.”

Professor Sarah Gibson, Lipscomb’s faculty fellow in AI, echoes the importance of these initiatives being faculty focused.

“It’s so important for leaders to remember that AI adoption is more about mindset than the tools. For this reason, it really needs to be a grassroots movement focused on faculty and their needs,” says Gibson, who is also director and professor in the School of Communication and director of the Center for Digital Media Ethics at Lipscomb.

Having administration and IT leadership as partners is also critical. Lipscomb offers two universitywide AI tools to eliminate equity concerns, particularly with regard to equal access.

As universities scale these efforts, trusted technology partners can play a critical role in helping institutions balance innovation with privacy, accessibility and security.

EXPLORE: How industry partnerships help universities deliver a secure generative AI experience.

How AI Is Transforming Pedagogy and Teaching Methods

Such efforts give faculty a clearer sense of how to use AI in higher education in ways that strengthen, rather than compete with, their existing teaching approach.

Morrow believes “really good pedagogy is the antidote for using AI.” She references the value of an iterative process, scaffolding assignments and getting to know students every semester. She encourages faculty to remember they’ve always been capable of these connections — and of catching cheating prior to the incorporation of technology, she jokes.

Instead, Morrow says, this “day of reckoning” is a “beautiful opportunity to look at things with fresh eyes and go back to some of those basic, good pedagogy principles.”

At the same time, professors across the country have found practical ways to use AI to support their own teaching. Faculty at Lipscomb, Dartmouth and other higher education institutions have begun using AI to generate writing prompts, provide structure for assignments, identify potential curriculum gaps and create prototypes for data science or design courses.

Supporting Faculty to Help Them Use AI Effectively

The Center for Teaching and Learning at Lipscomb holds monthly sessions for faculty to learn from each other, showcasing examples of how AI is being used in the classroom.

Carlos Guevara, director of the Office of Educational Technology and co-director of the Center for Teaching and Learning at Hostos Community College, encourages leaders to create safe spaces for faculty to test out ideas. Whether through “AI hackathons” or “sandbox” sessions, Guevara believes that “when faculty members have agency, support and trust, they are more likely to participate.”

Many institutions are building similar spaces for exploration in a way that gives faculty clearer insight into where and how AI can add value while remaining anchored in strong pedagogy. At the University of Florida, AI literacy modules are now embedded across general education courses, while Northeastern University uses AI-driven tools to help students map skills and prepare for co-op placements.

At Northcentral Technical College, creating a solid framework paved the way to successful adoption, says Brooke Schindler, dean of the School of Liberal Arts Transfer, Education and General Studies. NTC’s AI framework was created to answer the question: Does this integration of AI “count”?

“We needed a way for faculty to determine a meaningful and intentional curricular integration of AI,” says Schindler.

Using AI in the Classroom: Practical Apps and Tools for Educators

The key is also to start small — both for faculty and students. Rather than asking faculty to completely rewrite their curriculum or incorporate AI into every possible avenue, help them find moments.

“Adoption becomes cultural rather than technical when educational institutions present AI as a pedagogical partner rather than merely a new necessity,” says Guevara.

At Lipscomb, faculty use bots to help students through some of the hardest moments of learning, such as reading difficult poems or understanding academic research. By programming bots to give guiding questions, rather than answers, students are encouraged and supported in their critical thinking outside of office hours. Wilson has created a simulation that allows students to interview residents of historical Jamestown; other simulations have been created to assist with interview preparation.
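The article doesn’t detail how Lipscomb’s bots are built, but for readers curious how a “guiding questions only” bot might be wired up, the sketch below shows one common pattern: the pedagogy lives almost entirely in the system prompt. This is a hypothetical illustration using the OpenAI Python client; the model name, prompt wording and guide() helper are assumptions for the example, not Lipscomb’s actual implementation.

    # Hypothetical sketch of a "guiding questions" tutoring bot.
    # Assumes the OpenAI Python client (pip install openai) and an API key
    # in the OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()

    # The tutoring behavior is defined in the system prompt: coach with
    # questions instead of supplying answers or summaries.
    SOCRATIC_PROMPT = (
        "You are a reading tutor for a university literature course. "
        "Never give the answer or a summary. Respond only with one or two "
        "guiding questions that push the student to re-read the passage, "
        "notice specific lines and form their own interpretation."
    )

    def guide(student_message: str) -> str:
        """Return guiding questions (not answers) for a student's query."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": SOCRATIC_PROMPT},
                {"role": "user", "content": student_message},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(guide("What does the second stanza of this poem mean?"))

The design choice mirrors the point above: the model is constrained to prompt further thinking rather than hand over answers, so the tool supports critical thinking outside of office hours instead of replacing it.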

“We’ve watched AI transform the student experience. It doesn’t shortcut faculty work, as some suspected might happen early on. It actually deepens the conversation between faculty and students,” says Gibson.

DISCOVER: How higher ed institutions are deploying AI across campus.

At NTC, AI allows students in the video production program to practice interactions with clients. Auto-transcription saves them time in post-production.

“There are a lot of misconceptions about AI, that it is the end of critical thinking as we know it, and what we’re learning is that the rigor in some classes has actually increased with really well-designed AI uses,” says Morrow.
