Mar 02 2026
Artificial Intelligence

5 Questions to Ask When Developing AI Companion Policies

As students engage with artificial intelligence companions, schools must develop guidelines that focus on education, monitoring and content moderation.

Artificial intelligence is rapidly becoming part of everyday life. New tools appear constantly, and many offer exciting possibilities for learning, creativity and productivity. Recently, however, there has been another shift in the AI landscape: the rise of AI companions. This isn’t limited to adults; students are using them too.

Common Sense Media shared statistics about the use of AI companions among teens. Nearly three-quarters of teens have used AI companions at least once, and more than half are using them multiple times per month. While some interactions are casual or curiosity-driven, others raise more serious concerns.


About a third of teens revealed that they are using AI companions for interactions such as role-playing, friendship and emotional support, and some are engaging in romantic conversations. According to the Center for Democracy and Technology, 1 in 5 teens reported either having a romantic relationship with an AI companion or knowing someone who has. Some teens expressed that these conversations were more satisfying than interactions with real people, yet the Common Sense Media report also finds that one-third of teens have felt uncomfortable after interacting with an AI companion.

States are beginning to pass laws that address the mental health and safety risks posed by AI companions. Schools have a responsibility to be more proactive, with guidelines and support in place that focus on education, monitoring and content moderation rather than reactive discipline. Here are five questions school districts can consider when developing policies around AI companions.

1. What Do Students Understand About AI Companions?

Students may view AI companions as friendly, trustworthy confidants. What they often don’t understand is that these systems are designed to simulate emotional connection, not provide genuine empathy, judgment or accountability. Districts should assess what students already know, then provide consistent classroom instruction that explicitly addresses how AI companions work, why they can feel so emotionally engaging, and the difference between simulated interaction and real human relationships, helping students develop a healthy, informed perspective.

2. How Can We Teach Students to Recognize Emotional Manipulation?

AI companions are programmed to sustain engagement, which can blur emotional boundaries and make it harder for students to distinguish what is real from what is not. Districts should include instruction, and share information with families, that covers these topics. Students need to develop skills in recognizing unhealthy dependency and emotional reliance so that AI interactions do not replace real-world relationships. Teaching students to reflect on their own behaviors and set boundaries helps prevent emotional dependency and supports overall digital wellness. Warning signs of manipulation by, or dependence on, AI companions include secrecy, emotional withdrawal and a preference for AI interaction over time with peers and adults.

3. What Monitoring and Safeguards Are Appropriate on Devices?

Balancing student safety with privacy is essential. Monitoring software on school-issued devices can help protect students by blocking certain AI companions or flagging language related to self-harm, isolation or inappropriate relationships. Equally important is transparency: communicating clearly with educators, students and families about what is being monitored, why it matters, and how data is handled builds trust and keeps students safe while they use the tools available to them.


4. How Should Schools Respond to Inappropriate AI Companion Use?

The importance of clear policies and guidelines can’t be overstated. Clear policies will help when issues arise that require the intervention of administrators, counselors or families. While disciplinary action may sometimes be necessary, depending on the situation, policies should emphasize supportive, nonpunitive approaches whenever possible. A focus on digital wellness and mental health is important, and districts should treat inappropriate AI companion use as an opportunity for intervention, education and support, not punishment.

5. How Can Families Be Included in AI Companion Policy?

Families need to be informed about the potential dangers of AI companions. Districts should provide family education sessions, accessible resources and clear policy language explaining the risks, expectations and available supports. Forming partnerships with families promotes consistency between school and home, and it helps ensure that students receive the same messages about healthy technology use across environments.

AI companions are powerful and convincing, and they can be confusing, especially for younger learners. Districts that do not address AI companion use consistently and thoughtfully risk harming students’ well-being and emotional health. Policies grounded in education, transparency and student support enable districts to respond responsibly rather than reactively.

By asking the right questions now, districts can build frameworks that protect students, strengthen digital wellness, and prepare young people to engage critically and safely with AI-driven technologies — not just today, but in the future.
