Unlock Exclusive Cybersecurity Insights
Complete the form below to be redirected to CDW's exclusive proprietary research report on Cybersecurity. Once the form is submitted, you’ll be opted into our Security email stream.
Many K–12 districts are grappling with how to use artificial intelligence securely. There are major concerns regarding sensitive information, such as student data being fed inadvertently into AI environments, unauthorized access to AI environments enabling data breaches and tampering with AI environments to alter outputs, among other risks.
AI security is even more challenging for districts that are subject to additional cybersecurity and privacy laws or regulations. K–12 districts also face budget and staffing shortages, which could hinder their ability to explore the potential advantages of leveraging AI technology in support of their missions.
One option for schools is to leverage their existing cybersecurity tools to safeguard their AI use. For example, Microsoft offers a variety of tools that can be used in combination to improve AI security. Districts that already have access to these tools can evaluate them and determine how they might address AI-related cybersecurity and privacy risks.
Let’s take a closer look at several Microsoft tools that could help schools secure their AI environments and AI use.
Complete the form below to be redirected to CDW's exclusive proprietary research report on Cybersecurity. Once the form is submitted, you’ll be opted into our Security email stream.
School districts that use Microsoft endpoints, applications and cloud instances may already have access to multiple Microsoft AI solutions, such as Copilot.
Microsoft frames Copilot as an “AI-powered assistant” that can help individual employees perform their daily tasks. For example, schools can use Microsoft 365 Copilot by itself, and they can also add role-based agents or create their own agents designed to help in particular roles. As part of Microsoft 365, Copilot is already subject to all of Microsoft 365’s cybersecurity and privacy policies and requirements.
Microsoft offers a range of extensible AI solutions under its Azure AI brand. For districts that want to create AI-powered apps, Microsoft provides the Azure AI Foundry toolkit. As of this writing, there are over 11,000 AI models available for use with AI Foundry, most developed by third parties.
The same AI models that are used with Azure AI Foundry are also available within Azure Machine Learning workspaces. Here, districts can customize and deploy machine learning models, such as large language models (LLMs).
Ensuring the security of internally developed AI apps or models, especially with such a wide variety of starting models to choose from, is bound to be a much larger undertaking than securing the internal use of a Copilot agent. It will require the use of several other tools.
READ MORE: How to secure AI tools within your K–12 digital environment.
Microsoft’s Azure AI Content Safety serves several purposes, such as blocking content that violates district policies. One of the service’s features, Prompt Shields, is of particular interest for AI environment security. Prompt Shields can monitor all prompts and other inputs to Azure-based LLMs and carefully analyze them to identify attacks and any other attempts to circumvent the model’s protections.
For example, Prompt Shields could identify someone attempting to steal sensitive information contained in an LLM or to cause the LLM to produce output that violates the school’s policies. this could include using inappropriate language or directing the LLM to ignore its existing security and safety policies.
Groundedness Detection, another service offered as part of Azure AI Content Safety, essentially looks for AI-generated output that is not solidly based in reliable data. In other words, it can identify and stop some AI hallucinations.
Microsoft provides Defender for Cloud (formerly Azure Security Center) to assist schools with monitoring and maintaining the security of their Azure environments. This includes any Azure workloads being used to develop or host a school or district’s AI apps. Defender for Cloud can help safeguard AI apps by ensuring the platforms under them are patched and configured to eliminate known security vulnerabilities. Also, Defender for Cloud can identify the latest cyberthreats and detect and stop attacks against those platforms and the AI apps running on them. These are all important elements of safeguarding a district’s AI environments and usage.
Microsoft offers other forms of Defender, including Defender for Cloud App Security (formerly Microsoft Cloud App Security), which identifies cloud app use and reports how risky each app is. This information can be useful in finding unauthorized uses of third-party AI apps and services. Defender for Cloud App Security is also capable of monitoring your district’s Copilot use for suspicious activity.
Microsoft’s Defender for Endpoint and Defender for Servers provide additional security protection for other components of your district’s AI environments outside of Azure, such as developer and user workstations and servers.
DISCOVER: AI training options open the door to purposeful tech integration in K–12 schools.
Microsoft Purview is a suite of tools and services that work together to help schools with data governance, management and protection. Existing Purview components, such as Compliance Manager, have been enhanced to include assessments of compliance with certain AI regulations.
Components specific to AI have also been added to Purview. The Purview AI Hub can help schools monitor and identify sensitive data in AI prompts, particularly with Copilot use. The AI Hub also monitors which files are accessed through Copilot to look for attempts to access sensitive data in files. The intent of AI Hub is to ensure compliance with policies and requirements by identifying possible violations as they are occurring.
See how IT leaders are tackling AI opportunities and challenges.
Copyright © 2025 CDW LLC 200 N. Milwaukee Avenue, Vernon Hills, IL 60061
Do Not Sell My Personal Information