

Aug 05 2025
Artificial Intelligence

AI in the SOC: How Universities Are Using AI to Fill Cybersecurity Gaps

Higher education IT teams are responding to staff shortages with innovative solutions to improve cyber defenses.

Almost 7 in 10 cybersecurity leaders are experiencing staffing shortages, with a global shortage of 4.7 million skilled workers, according to the 2024 ISC2 Cybersecurity Workforce Study. In many colleges and universities, that means turning to student workers to keep their security operations centers covered.

And in the age of artificial intelligence, some SOC leaders are finding that new tools can not only simplify the process of onboarding student workers but can also extend the capabilities of the entire security team.

Using AI tools “has definitely helped our incoming student employees to get up to speed much more quickly, as well as contributing to the improvement of our detection rules and automation sooner,” says Emily Longman, SOC manager at Oregon State University, where the SOC staff includes five professionals and 10 part-time student employees.

“We’re regularly bringing on new student employees and getting them trained,” she adds. “Using AI has helped them get to the more advanced job roles much faster, which gives the full-time folks more time to lead projects and work with the broader university community.”


Microsoft reports that organizations using Security Copilot have seen a 54% reduction in the time needed to resolve device policy conflicts (and a 22.8% drop in alerts per incident) within three months of adoption, freeing up teams to focus on more strategic work.

AI in cybersecurity is a tool to assist — not replace — IT and security teams. As SOC staff members tackle the onslaught of cyberthreats, AI tools can help by handling repetitive, time-consuming tasks and providing on-demand access to expert-level security knowledge.

“The reality is that cybersecurity threats are growing in both volume and sophistication, and colleges and universities are looking for ways to strengthen their defenses without overburdening already-stretched IT teams,” says Corey Lee, security CTO at Microsoft. “That’s where AI-powered security tools come in.”

AI Expands the Capabilities of Existing Cybersecurity Tools

At Auburn University, the SOC has been using Microsoft’s Security Copilot since 2024. Not only has the tool helped the team automatically analyze incidents and summarize what happened, but it has also provided training for student workers and performed tasks that team members weren’t ready to do on their own, says Jay James, Auburn’s cybersecurity operations lead.

In the past, when a security incident occurred, “analysts would have had to go through all the data and build a report,” James says. But with AI, the team just asks for an overview of the alert and accesses an automatic report.

“It has also helped us build executive dashboards and complete other tasks that would have taken a lot of time,” he says. “It’s a lot easier to debug and fix issues than to start from scratch.”


Student workers can also use the AI tools to get up to speed quickly. When the system returns an executive-level report of an incident, a student can ask for a simplified version that breaks it down into plainer terms. And as students use the tool to investigate incidents or potential phishing emails, they can ask it to critique their prompts and suggest what they should consider going forward, James says.

“It has absolutely freed up staff,” James says. “We don’t have to hand-hold students as much, and it provides skill sets that we didn’t have.”

RELATED: Learn how AI can help universities improve the student experience.

AI in the SOC Can Help Universities Manage Risk More Efficiently

Like any powerful tool, AI must be used thoughtfully and responsibly. That starts with developing policies, procedures and governance for the use of AI, James recommends.

“It changes so often, and you need rules in place so you can control it,” he says. “Also, require staff training on how to use AI and use it responsibly. The tools must be well vetted, and it’s important to manage how they’re collecting data.”

The riskiest way to apply AI to security “is to dump the entire job on AI and pat yourself on the back for saving a lot of staffing money,” Longman says. “AI will always be limited in its ability to reason and judge complex situations, so it will never be able to do the whole job.”

AI is best used for incident triage, data gathering, correlation and summarizing — not as a replacement for analysts, whose roles require constant inductive reasoning and judgment calls, Longman says.
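To make that division of labor concrete, here is a minimal sketch of the kind of triage assistance Longman describes, assuming an OpenAI-compatible chat API reached through the openai Python package. The alert fields, prompt wording and model name are placeholders, and this is not the interface of Security Copilot or any specific product named in this article; the model's output is a starting point for a human analyst, not a verdict.

```python
# Hypothetical sketch: asking a general-purpose LLM to triage and summarize a
# security alert for analyst review. Illustrative only; not a product's real API.
import json
from openai import OpenAI  # assumes the openai package and an API key are configured

client = OpenAI()

def summarize_alert(alert: dict) -> str:
    """Return a short triage summary an analyst can check before acting."""
    prompt = (
        "You are assisting a university SOC analyst. Summarize this alert, "
        "list likely causes, and suggest next investigative steps. "
        "Flag anything that requires human judgment.\n\n"
        f"Alert data:\n{json.dumps(alert, indent=2)}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: a simplified, fictional alert record (fields are hypothetical)
alert = {
    "rule": "Impossible travel sign-in",
    "user": "student@example.edu",
    "source_ips": ["203.0.113.10", "198.51.100.7"],
    "timestamps": ["2025-08-05T02:14Z", "2025-08-05T02:52Z"],
}
print(summarize_alert(alert))
```

In practice, a summary like this would feed a ticket or case record that an analyst reviews before any response action is taken.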

Getting Started With AI Security in Higher Education

College and university SOCs that are not yet using AI have a number of options.

“There are so many great tools out there, many of them free, that can help improve security operations,” Longman says. “Even just using them to generate some reporting templates, phishing email examples and community awareness documents can do a lot to stretch a very budget-restricted security program.”
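As a rough illustration of stretching a small budget the way Longman suggests, the sketch below uses the same kind of general-purpose LLM API (again via the openai package, with a placeholder model name) to draft a phishing-awareness example for a training handout. The prompt is hypothetical, and any generated material would need human review before it reaches the campus community.

```python
# Hypothetical sketch: drafting phishing-awareness training material with a
# general-purpose LLM. Illustrative only; not a specific product's workflow.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a realistic but clearly fictional phishing email that targets "
    "university students with a fake financial-aid notice. Then list the "
    "red flags a reader should notice, formatted for an awareness handout."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```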

When selecting AI tools, Longman recommends choosing products that integrate with the existing environment.

“At OSU, we’re already heavily invested in the Microsoft ecosystem for software licenses, endpoint management, communication and security tools,” she says. “This made Microsoft’s Security Copilot a great choice since it natively integrates with most of what we already use. But a university using a mix of tools would probably have to work a lot harder to get the same value out of it, so they might be better off using a product designed to fit their setup.”