Technology has always been a priority at the University of Southern California’s Annenberg School for Communication and Journalism. We believe our students benefit from a robust technology environment that enhances and supports their academic experience here and in their careers. Accordingly, we recently reimagined our annual student technology survey and, in the process, gained valuable insights into both our internal process and our students’ use of technology.
Analyze Survey Data to Make Better IT Purchases
To help us understand our technology environment, my team has conducted a student technology survey every year since 2010. Our data has helped us prioritize investment decisions and align them with the curriculum. In fall 2013, for example, our survey showed that 99 percent of Annenberg students owned notebooks or desktops, and close to 85 percent had smartphones.
That information gave us the opportunity to implement a platform-agnostic, mandatory notebook policy in fall 2014. This allowed us to accomplish four goals:

- eliminate seven computer labs, with roughly 250 computers;
- reallocate that portion of our annual operational and capital investment to the school’s network infrastructure;
- invest in technology tools, including an enterprise agreement with Adobe that delivered Creative Cloud to all of our students, faculty and staff; and
- create the Digital Lounge, an extensive online resource for students and faculty.
Surveys Push Tech Quality over Quantity
Annenberg has always had high participation in our surveys — in some years, up to 65 percent. From the start, students created our surveys with themselves and their peers in mind. For the first five years, we focused on quantitative data. But in 2016, we took a qualitative approach to go beyond measuring the types and number of devices our students were using. We reasoned that because we already had solid baseline data, what we needed to know was how and why students were using their devices.
For example, to what extent do students use various portals? How much do they use their devices to communicate with faculty and peers? Do they use devices to create content? Manage schedules? Do research or work on assignments? Search for jobs or internships? How much time do they spend on social media?
We collected results from more than 600 students, a response rate lower than in past years, but one that we expected because these surveys were more time-intensive. Our efforts to focus on quality over quantity paid off, as we gained insights into user experience that complement our previous surveys.
This new approach, combined with our past experiences, has yielded several best practices for a survey program that will deliver baseline information and all-important user experience data to help support curricular requirements and drive investment decisions:
Track the installed base, then opt for a user experience survey. It makes sense to get an initial benchmark, but once you have determined that the college has a strong installed base, move forward and find out more about the user experience. In 2016, we learned that students used their notebooks and desktops for all the activities we wanted to measure, such as communicating with faculty and working on assignments. The only functions that scored a bit lower were managing schedules and communicating with peers (students primarily used smartphones for that). We also learned that most students like to learn new technology by figuring it out on their own, asking peers or viewing videos on YouTube or Lynda.com.
Engage students. From the beginning, students have been key to our success. Starting in 2011, my student team managed all facets of the survey. With a little guidance and the recognition that they know how to communicate effectively with their peers, the team developed our overall strategy. They created a detailed plan that included a series of tabletop sessions in our lobbies (complete with snacks) to give students a chance to learn more about the survey.
Hire professional researchers. It’s fairly easy to send out surveys about total device usage, but we decided we needed experts to gather user experience insights. Given how different this survey was, we wanted a team experienced in developing user experience questionnaires who would give the survey their full attention. In 2016, we hired MO Studio, a human-centered design company, to develop and manage the surveys.
Challenge conventional thinking. Over the years, we have used similar techniques to review whether to continue a technology service for students, and we transferred those concepts to the surveys. When we deploy a service tool, we constantly ask ourselves how well we are deploying the tool and what we could do better. We do the same with the surveys. This helps us update them every year and led us to work with MO Studio.
Learn from the outcomes. This year, we found that most students still work on notebook and desktop computers. Although some students do schoolwork on smartphones, only a small percentage say they work on tablets.
That told us that while we need to work more closely with students to support a range of mobile devices, we should continue to develop our core infrastructure and improve our wireless network so students can more effectively use notebooks and other devices that become popular in the future. Students also said they wanted more training on Adobe applications, analytics and data science, so we have enhanced those support resources.
The 2016 survey gave us a user experience baseline against which to measure future survey results. For example, although tablet numbers are currently low, that could change as tablet functionality and our infrastructure continue to expand. I also expect that results will vary by institution and by how students use their devices. At Annenberg especially, many of our students are content creators, so they may always need notebooks for more demanding tasks.
Whichever devices our students use, we are confident that because we have been thoughtful in gathering insights into their technology habits and preferences, our investment decisions will always be based on good information.