May 30 2025

How to Secure AI Tools Within Your K–12 Digital Environment

School IT leaders and educators need a clear understanding of what artificial intelligence tools do. Resources such as workshops can help them adopt a security posture that prepares their districts for AI-assisted grading and learning.

As schools increasingly adopt artificial intelligence tools for learning, they must put the right security measures in place. While there are exceptions, such as Georgia’s Gwinnett County Public Schools, most school districts don’t have a CISO or an AI expert on staff to make decisions on AI security procedures.

“AI is being pushed into schools as necessary without all of the fundamentals in place,” says Clara Lin Hawking, cofounder and executive director at Kompass Education.

AI security in schools will require a culture change and an understanding of whom the technology affects, not simply new IT infrastructure, according to Hawking.

“AI security isn’t just building a better firewall or using better passwords,” she says. “It’s about understanding the risks, opportunities and limitations that come along with the use of AI, and that affects every stakeholder in the school.”


Secure AI by Prioritizing Data Privacy Agreements and Protecting PII

Lisa Schwartz, regional educational technology coordinator for the Learning Technology Center in Illinois, stresses the importance of adhering to the Student Online Personal Information Protection Act and having vendors sign data privacy agreements before rolling out AI tools in schools.

K–12 IT leaders can find data privacy agreements through the Student Data Privacy Consortium, Schwartz notes. When schools in Illinois evaluate AI apps, they review what data each app collects, such as teachers’ IP addresses. They can also check agreements already signed by other schools in the state to determine whether they need a new agreement or can piggyback on an existing one.

Otherwise, educators, administrators and tech directors should use tools only if they do not input any personally identifiable information, Schwartz says. “Without reading the fine print, we’re not sure what these vendors are doing with PII.”


Schwartz speaks with teachers to ensure they understand what data apps collect, and she demonstrates how to remove PII from data before it reaches an AI tool, such as a spreadsheet being uploaded into a large language model through Microsoft Copilot.
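As a rough illustration of that kind of scrub, here is a minimal sketch in Python using the standard csv module. The column names are hypothetical stand-ins; a district would match them to its own export headers:

```python
import csv

# Example PII columns; match these to your district's actual export headers.
PII_COLUMNS = {"student_name", "student_id", "email", "address", "date_of_birth"}

def scrub_spreadsheet(in_path: str, out_path: str) -> None:
    """Copy a CSV, dropping any column whose header looks like PII."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        kept = [c for c in reader.fieldnames if c.lower() not in PII_COLUMNS]
        writer = csv.DictWriter(dst, fieldnames=kept)
        writer.writeheader()
        for row in reader:
            writer.writerow({c: row[c] for c in kept})

# The scrubbed copy, not the original, is what gets uploaded to the LLM.
scrub_spreadsheet("grades_export.csv", "grades_scrubbed.csv")
```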

“What I find with these AI tools is, it’s brought to the forefront not only with educators and in the school setting but also with people in general. They’re becoming more aware of what data is collected on them, whether it’s using an AI tool or some of the apps on their phone,” she says. 

DISCOVER: How are endpoint detection and response technologies protecting schools?

In an online presentation, Schwartz and her former colleague Matt Jacobson recommend lesson plans from Common Sense Education for guidance on how to secure AI. They also cite resources from the International Society for Technology in Education, MIT’s AI + Data Privacy Activities for K-9 Students, the AI4ALL Open Learning Curriculum and the Computer Science Teachers Association’s AI4K12 initiative.

Workshops Train K–12 IT Professionals on AI Security Best Practices

CDW offers AI security readiness workshops that teach school IT directors and staff how to secure the use of data involving children. It also offers workshops for faculty and staff on how to use AI systems.

These workshops take attendees through two phases: a broad overview of security posture across multiple industries, followed by a conversation about how each organization can customize AI tools for its specific situation, explains Pete Johnson, CDW’s artificial intelligence field CTO.

“After we go through that base education, we’ll have an open conversation about what they might want to look at specifically, given the tooling they have in place,” he says.

LEARN MORE: What is the role of AI in adaptive personalized learning?

CDW’s workshops provide guidance on implementing technology to block users from entering private information into public tools. They also feature training on how to use AI models internally and how to carry out data and security governance strategies.
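One way to approximate that kind of blocking is a lightweight filter that screens prompts for obvious PII patterns before they ever reach a public tool. The sketch below is illustrative only; the three patterns are assumptions, not a complete data loss prevention policy:

```python
import re

# Illustrative patterns only; a real deployment would rely on a DLP service
# with far broader coverage than these three.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the names of any PII patterns found in a prompt."""
    return [name for name, rx in PII_PATTERNS.items() if rx.search(prompt)]

prompt = "Summarize this plan for Jane Doe, SSN 123-45-6789."
hits = check_prompt(prompt)
if hits:
    print(f"Blocked: prompt appears to contain {', '.join(hits)}.")
else:
    print("Prompt forwarded to the public AI tool.")
```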

“Any AI solution you put in place is going to magnify your existing security and data holes,” Johnson says. “AI is not going to fix any of those. It’s going to amplify their impact across your organization.”


CDW’s workshops also address guardrails that urge users to analyze data before it is added to a model. This data governance practice can reveal whether data needs to be modified so the model can answer questions appropriately, Johnson says.

“Most of the conversation is about how to protect your own data and prevent it from being part of the training set,” Johnson says. “That’s the lion’s share of the conversation, but often some conversations about guardrails and prompting come up as well.”
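That pre-ingestion review can start as simply as a profiling pass that flags columns likely to need modification before the data reaches a model. A sketch under assumed heuristics (unique-valued columns look like identifiers; long free text may hide PII):

```python
import csv

def profile_columns(path: str, sample_size: int = 100) -> dict[str, str]:
    """Flag columns that may need modification before model ingestion."""
    with open(path, newline="") as f:
        rows = [row for _, row in zip(range(sample_size), csv.DictReader(f))]
    if not rows:
        return {}
    flags = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        if len(set(values)) == len(values):
            # Every sampled value is unique: likely an identifier.
            flags[col] = "all values unique; possible identifier, consider dropping"
        elif any(len(v) > 200 for v in values):
            # Long free text can hide names and other PII.
            flags[col] = "long free text; review for embedded PII"
    return flags

for col, reason in profile_columns("assessment_data.csv").items():
    print(f"{col}: {reason}")
```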

AI Security Guidelines for K–12 Schools

Here are some suggestions to help districts use AI securely in their schools.

Deploy a Private Instance

To secure AI in public schools, Johnson recommends using an in-house solution that lets students, faculty and staff experiment with an AI chat app without exposing data in the public sphere. Schools can also work with a public model that has the right privacy protections in place.

“All of the big three hyperscalers — Amazon, Microsoft and Google — have in their data privacy agreements that they will not use any of your prompt content to retrain models,” Johnson says. “In that way, you’re protected even if you don’t have that AI program on-premises. If you use it in the cloud, the data privacy agreement guarantees that they won’t use your data to retrain models.”
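For districts that do go on-premises, the plumbing can be modest. The sketch below assumes a locally hosted Ollama server on a district-managed machine; the URL and model name are placeholders for whatever the local deployment actually runs:

```python
import json
import urllib.request

# Assumes an Ollama server on a district-managed host; adjust the URL
# and model name to match the actual local deployment.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the on-premises model; nothing leaves the network."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_local_model("Draft a parent newsletter about our new reading program."))
```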

Establish an Action Plan

An action plan should detail what to do if a data breach occurs or if a mass phishing email circulates in a financial fraud attempt. 

“It’s incredibly important for IT professionals to understand exactly what those new attack surfaces are and what they look like, and then start building a framework for addressing that,” Hawking says. “That includes everything — the hardware, software and actual IT architecture — but also policies and regulations in place at each school to address these issues.”

Take Small Steps

As schools experiment with AI, they should start small. For example, they can use AI to analyze test scores or create progress reports for parents without exposing all of their data to an AI program at once, Johnson advises.

“Don’t take your entire data estate and make it available to some AI bot. Instead, be very prescriptive about what problems you are trying to solve,” he says. 
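In code, “be very prescriptive” can mean handing the model only the narrow, de-identified slice a task needs, such as per-class averages rather than a student-level export. A minimal sketch with hypothetical column names:

```python
import csv
from collections import defaultdict
from statistics import mean

def class_averages(path: str) -> dict[str, float]:
    """Reduce a student-level export to per-class averages before any AI use."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["class_id"]].append(float(row["score"]))
    return {class_id: round(mean(vals), 1) for class_id, vals in scores.items()}

# Only this small aggregate, never the full roster, goes into the prompt.
summary = class_averages("test_scores.csv")
prompt = f"Write a short progress summary for parents from these class averages: {summary}"
```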

RELATED: Evaluate AI use cases strategically to achieve the best outcomes.

Use Corporate Accounts With AI Tools

Hawking warns against using personal email accounts with ed tech providers, which can create entry points for data sharing that could be used to train models without consent.

Vet AI Tools

Hawking also recommends that schools create an oversight team to vet AI tools. The team could include stakeholders such as the IT department, parents and student government leaders.

“It doesn’t mean lock down all AI, but understand exactly what’s being used on campus and why it’s being used,” Hawking says. 
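That inventory does not need special software. A structured list of tools and their review status, like the hypothetical sketch below, keeps the “what’s being used and why” question answerable on demand:

```python
from dataclasses import dataclass

@dataclass
class AITool:
    name: str
    purpose: str         # why it is being used on campus
    data_collected: str
    dpa_signed: bool     # is a data privacy agreement on file?

inventory = [
    AITool("ChatBot X", "writing feedback", "prompts, teacher emails", True),
    AITool("GradeHelper", "rubric scoring", "student work samples", False),
]

# Tools without a signed agreement go back to the oversight team for review.
for tool in inventory:
    if not tool.dpa_signed:
        print(f"Needs review: {tool.name} ({tool.purpose})")
```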

Conduct a Complete Risk Assessment and Full Audit 

A thorough risk assessment allows schools to identify regulatory compliance risks and develop policies and procedures for the use of generative AI.

Hawking notes the difference between using a simple chatbot and deploying more powerful large language models in classrooms for tasks such as determining a student’s grade.

“It’s really important, as part of an AI audit, to get a proper overview of how all of those things take place on campus,” Hawking says. “That is the starting point of good governance.”

