Navigating Generative AI Use in the Classroom
Questions about generative AI use in the classroom remain, particularly among faculty. In “Driving Instructional Value with Practical Generative AI Implementation,” representatives from the University of Texas at Austin discussed their partnership with Grammarly to develop a faculty toolkit of lesson plans and activities for implementing AI in teaching and learning.
Julie Schell, assistant vice provost of academic technology at UT Austin, discussed some of the nuances of AI use in the classroom. When ChatGPT first hit the scene, UT Austin banned the use of AI detection software because the university believed it conflicted with its academic honor code. Students want to learn how to use AI and will need that skill in the future, but to use it successfully and ethically, they will have to navigate AI’s ambiguity.
LEARN MORE: This university developed an AI use policy.
“I think it's really important to know that you're smarter than the generative AI,” Schell said. “It doesn't have your expertise, it doesn't have your experience, it doesn't have your ethics, and it doesn't have your creativity, and we need to help students see that.”
As the UT Austin team went through its internal learning technology adoption process to vet Grammarly as a partner in this initiative, the university’s Office of Academic Technology was also developing its AI Forward – AI Responsible Framework.
“We worked with stakeholders to identify the six key limitations of engaging with generative AI: security and privacy, hallucinations, misalignment, bias, ethics and cognitive offloading,” Schell said. “We wanted to make sure that whenever anyone introduced anything related to generative AI in the classroom, our students were also exposed to those six key limitations.”
Taking this one step further, in the fall of 2023, Academic Technology Manager Evan Daniel was brought on at UT Austin to begin development of the generative AI faculty toolkit, known as “The Faculty Guide to Getting Started With AI.” The goal was to design a guide that would be useful for faculty members across disciplines with a range of AI experience.
AI can be either transactional or transformational, Schell said. Users can feed it a prompt, get a result and accept that result without question, or they can craft a refined prompt and use higher-order skills to critically review the output.
“AI is full of so many paradoxes. It's a paradox cocktail,” she said. “It can teach you almost anything, but it can also teach you bad information. It can speed you up and help you do work faster, but it can make you sloppy. It can help you expand your creative voice and help you have more creative ideas, but it can also take new ideas from other people and make you less original.”
Taking this into account, the resulting toolkit is designed to help faculty think critically about the information they’re receiving from generative AI.
EXPLORE: Set guardrails for using generative AI at your institution.
“For the folks who are really just at the beginning, having a step-by-step guide is by far the easiest place to get started, and something that they can do feeling confident and having some agency over the process,” Daniel said.
The feedback on the toolkit has been “overwhelmingly positive,” Daniel said. The goal now is for faculty to build on this work and share their findings with each other.
“We are in the process of taking this experience and developing these lesson plans, which are pedagogically forward,” Schell said. “We have the instructors thinking about what they want. What are some of the common challenges or misconceptions that students might have for the content that they're going to be learning? What are the learning outcomes? What do you want students to know and be able to do after engaging with this lesson plan? That's good pedagogy, and we want to scale that, particularly with our AI engagements.”
Keep up with EdTech: Focus on Higher Education’s coverage on our EDUCAUSE event page and via X.