May 08 2020

How Higher Ed Chief Privacy Officers and CISOs Can Boost Student Privacy

As a new era of Big Data emerges, it is imperative that CPOs and CISOs work together to ensure data collection is ethical and secure.

Facial recognition. Eye movement tracking. Video footage of students taking tests in their rooms. The abrupt shift to remote learning has led to the rise of automated proctoring services, forcing some universities to collect an unprecedented amount of data in order to prevent cheating during online exams. As higher ed institutions also contemplate using COVID contact tracing apps to help them reopen, colleges are on the precipice of collecting more data than they ever have before. 

With a new era of Big Data emerging, it is imperative that Chief Privacy Officers and CISOs work together to ensure that data collection and use in colleges and universities are ethical and secure. “Both security and privacy have their roles when it comes to protecting data about people,” says Kent Wada, CPO at University of California, Los Angeles. “The importance of our roles escalates as cyber risks grow daily.”

But at some institutions, this collaboration may not come as easily. For years, there has been a tendency for security and privacy departments to clash with one another. “It’s often stated that there’s a conflict of interest in allowing security to address privacy: Security wants to inspect everything, privacy wants to restrict that,” says Michael Corn, CISO for the University of California, San Diego. “I think that’s an old rivalry we need to move past.” 


To encourage collaboration between CPOs and CISOs, Corn brings up a common misunderstanding about how security departments do their jobs. He believes there is a misconception that security offices must invade privacy to monitor campus networks for intrusions and attacks. 

“That kind of activity often involves monitoring websites being visited and looking at general network traffic. But we don’t look at content, we don’t read emails. We’re only looking for patterns,” Corn says. “In some institutions, this is clamped down on and looked at as unnecessary. Security professionals are obsessive about protecting privacy and treat the data we need to do our jobs with the utmost sensitivity.”

How to Improve Collaboration Between Privacy and Security Teams

The key to lessening tensions between the two departments, according to Corn, is establishing a common framework. “Security has been around longer in higher ed. It has a more mature establishment in an institution. Some newer privacy staff are struggling to catch up,” he says. “If you don’t have a common understanding of scope and function, that can create friction.”

Meanwhile, Wada advises people to improve communication by avoiding jargon. “Privacy and security officers often come from different backgrounds,” he says. “We each need to learn to communicate with people who have a different vocabulary.” 

This is especially important when it comes to working on projects where partnership is critical. And at UCLA and UCSD, security and privacy teams frequently collaborate to tackle big issues. 


For instance, whenever UCLA considers purchasing new software that involves sharing data with a third party, it triggers a review by both departments. 

“My CISO and I work closely together during the purchasing process. He has a certain number of things he evaluates. I do the same thing for privacy. But there is overlap. We’re both often talking about data — how the data is going to be used, how it’s going to be stored,” Wada says. “If we can’t work well, or if we can’t work closely, that process simply won’t work.”

Emerging Ethical Concerns to Consider

Proctoring services and COVID contact tracing apps are just the tip of the iceberg when it comes to privacy boundaries that universities are in the midst of defining. There is a mountain of ethical considerations when it comes to civil liberties and the future of Big Data.

For instance, where is the line between intrusive and necessary when it comes to early warning systems?

“In early warning systems, we could watch cellphones on campus and see that so-and-so hasn’t left the dorm room for two weeks and is streaming Netflix all day. We’re not going to do that, but it’s possible. Is this something we should do? Where’s the line between using this sort of information for the betterment of students, say for suicide prevention, and when does it become too intrusive?” Corn says. “These are really nuanced issues that higher ed institutions across the board are not attacking as aggressively as we should.”

Another important privacy issue that comes up during discussions of distance learning is whether instructors will be able to gauge, remotely, if students are paying attention. “When you’re teaching in a physical classroom, you can see if students are not showing up or if they’re reading other books,” Corn says. “Should you be extending the same access to digital data? To be able to see if someone is there, or not paying attention? Maybe. But the technology filling these gaps feels intrusive. It needs to be explored carefully.”


“It’s not a math equation with a clear answer,” he says. “I recommend giving schools a little more time to think through these issues — if it’s a race to the end, you finish before understanding the nuances. We need to spend more time teasing apart what privacy means in higher ed.”
