

Jun 12 2024
Artificial Intelligence

Data Security Best Practices for AI Tools in Higher Education

Monitoring access and providing training can help protect valuable information.

Artificial intelligence is part of the future of higher education. As more advanced AI solutions hit the market and are incorporated into platforms that colleges and universities already use, institutions will need to consider some important security implications. While these tools can be great for collecting and sharing data, they also raise the risk of institutions oversharing valuable or personal information.

Tools such as Google Gemini and Microsoft Copilot are beneficial when it comes to productivity and efficiency, but higher ed IT teams must follow best practices to make sure that end users aren’t accidentally sharing sensitive information with these platforms.


Ensure Users Always Log In to the Correct Account

Vendors have made AI available in multiple forms, so anyone with a consumer account can access some version of their tools. But for maximum data protection, higher education institutions should require users to operate under an enterprise account. These licenses sit under a university’s core services, which means institutional data is protected and will not be used to train the AI models. The same cannot be said for consumer accounts.

To prevent the use of unauthorized tools, institutions can consider blocking third-party apps for those logging in from their university user accounts, but that will not address people who access them with their consumer accounts. The best way to prevent the use of unauthorized tools is to set everyone up for success with the solutions your institution has invested in. Giving students, faculty and staff access to approved AI technology — and training them to use it — means they will be less likely to explore unsanctioned tools that could pose security risks.
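As a rough sketch of the blocking approach described above, the logic an egress filter or browser policy might apply could look like the following. Everything here is illustrative: the domain names, the policy shape and the function itself are assumptions for the sake of the example, not tooling named in this article.

```python
# Hypothetical sketch: an allowlist check that a web filter might apply
# to sessions signed in with a managed (university) account. The domains
# below are placeholders for whatever tools an institution has licensed.

APPROVED_AI_DOMAINS = {
    "gemini.google.com",      # example: enterprise Gemini license
    "copilot.microsoft.com",  # example: enterprise Copilot license
}

def is_request_allowed(domain: str, is_managed_account: bool) -> bool:
    """Allow only sanctioned AI services for managed accounts.

    Sessions under consumer accounts fall outside this check, which is
    why the article stresses training and approved tools over blocking.
    """
    if not is_managed_account:
        return True  # policy cannot be enforced on consumer sessions
    return domain in APPROVED_AI_DOMAINS
```

Note the gap the article calls out: the check only constrains managed sessions, so a user on a personal account slips past it entirely.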

DISCOVER: Is a chief AI officer right for your university?

Offer Regular AI Training to End Users

Another way to ensure that everyone is using AI tools as safely and securely as possible is to offer regular training and professional development. As with general cybersecurity awareness training, make AI training part of your quarterly or annual requirements.

The beginning of the school year is a great time to start. Rather than sending informational emails, hold brown bag discussions that demonstrate the value of AI while driving home the importance of proper use. Look to the AI innovators at your institution; find out who is using it responsibly and hold them up as examples of the technology’s possibilities when harnessed correctly. Consider having these innovators present their work to students, staff and faculty to show the opportunities that AI holds, even within the bounds of your institution’s rules.

Make a Habit of Monitoring and Securing the AI Environment

Once you’ve established your security parameters and trained your end users, monitoring your AI environment doesn’t have to be time-consuming. Setting aside 10 minutes each day to check your dashboards, and addressing security concerns as notifications come in throughout the day, is enough to keep things running smoothly.

READ MORE: Make sure your infrastructure is prepared to handle the demands of AI.

As this emerging technology continues to evolve in complexity, there are bound to be questions. CDW’s experts can advise IT teams on the best course of action based on their needs and concerns.

Create a Universitywide AI Use Policy

AI will only become more prevalent in the tools we use in higher education, so institutions should develop AI use policies if they haven’t already. These policies should be flexible. You don’t want to be known as the institution that says no, but having guardrails in place to ensure you’re operating in the most secure AI environment is vital.

You should know exactly what users are doing with the technology so you can make the best decisions for your institution. This could mean setting a blanket policy for the university, or allowing faculty to determine their own policies as outlined in their syllabi. Whatever option you choose, be consistent and firm in its application for maximum safety and security.

This article is part of EdTech: Focus on Higher Education’s UniversITy blog series.

Laurence Dutton/Getty Images