Keeping up with new teaching strategies is an ongoing race for any faculty member. This is particularly true given the challenges all professionals face in learning new things about technology and data.
In the book Teaching Smart People How to Learn, Harvard professor Chris Argyris acknowledges that many successful professionals “have rarely failed,” and, as such, “have never learned how to learn from failure.”
So, upon approaching something new to learn, they begin to see weaknesses in their own knowledge, and they respond poorly: “They become defensive, screen out criticism, and put the ‘blame’ on anyone and everyone but themselves. In short, their ability to learn shuts down precisely at the moment they need it most.”
Recently, we asked faculty members at the University of Pikeville Kentucky College of Osteopathic Medicine (KYCOM) to adopt an embedded-assessment approach in order to better prepare students for the Comprehensive Osteopathic Medical Licensing Examination (COMLEX).
To say that faculty members were at first resistant to the idea is an understatement.
Within a medical school setting, as within any educational space, faculty often approach their teaching based on their feelings about how lessons have performed in the past, with little statistical data to support their observations.
Herein lies the issue: According to Argyris, faculty need to “reflect critically on their own behavior, identify the ways they often inadvertently contribute to the organization’s problems, and then change how they act.” Yet their complacency with traditional, non-data-driven teaching techniques may actually hinder their teaching effectiveness.
We learned in the process of onboarding faculty to the embedded-assessment approach that there are practical steps any institution can take to guide faculty to modify their pedagogical strategies in real time, based on students’ actual learning needs.
At KYCOM, we conducted an internal “piloting” process of new computer-based testing (CBT) and analytics software (ExamSoft) to encourage faculty to more actively use assessment data in modifying their teaching strategies.
The move was a delicate transition, and as faculty became more supportive of the new strategy, we learned a few lessons that may be helpful for other universities looking to improve teaching strategies through embedded assessment:
Our first step focused solely on the benefits of CBT — a reminder to faculty that students find paper-and-pencil testing outdated, rigid and arduous. For example, one student complained that he could not read the chest X-rays on a test because of the poor image quality of the paper copies.
The instructor’s solution was to allow the student an extra hour for the test, but this, of course, did not improve the quality of the image. We also reminded faculty of a visit from a representative of the National Board of Medical Examiners.
In that meeting, the representative showed us a sample board-exam question that included a video and another that included a sound clip — neither of which could be reproduced on paper. Anecdotes such as these helped demonstrate to faculty the value of CBT.
Next, we worked one-on-one with the faculty members most likely to become superusers of the system. Then, at the beginning of the 2012 academic year, we held a full demonstration for all faculty members that focused on how the new assessment system benefited them. We walked faculty step by step through the functionalities, features and reports, making sure we tied them to specific classroom objectives.
In our case, faculty members realized the value when they saw that the system’s reports could break down assessment performance data by item or artifact, by subject or objective, by class, and by student, then arrange that data visually so they could see at a glance where students were struggling most.
It is easier for anyone to process data — in this case, assessment data — when it is presented in a visual, digestible form. When showing faculty these reports, we also especially emphasized the time savings: instead of aggregating the data by hand, instructors could generate reports with just a couple of mouse clicks.
Many of us gravitate toward education to ensure that students succeed, and the opportunity to produce scholarship of teaching and learning (SOTL) is greatly enhanced when a teaching-and-learning analytics system is in place. Easily accessible and actionable data support the fine-tuning of teaching methodologies.
Documenting those adjustments allows faculty to use their data to contribute to SOTL. Several KYCOM faculty members, myself included, have submitted abstracts to and been accepted to speak at teaching-and-learning and assessment conferences, which gives us great confidence in pushing for this change.
KYCOM admitted its largest-ever first-year class in the fall of 2012. This class is nearly double the size of our second-year class, and it has so far been the only cohort tested with the new CBT and embedded-assessment analytics system.
Around the beginning of the spring semester, our second-year students began saying that they wished they could be tested with the CBT and analytics system as well, because they were envious of the feedback their first-year counterparts were receiving.
This unexpected occurrence has perhaps been the most convincing part of our faculty onboarding process: When students demand a change in assessment practice, faculty will follow.