About a third of teens report using AI companions for interactions such as role-playing, friendship and emotional support, and some engage in romantic conversations. According to the Center for Democracy and Technology, 1 in 5 teens reported either having a romantic relationship with an AI companion or knowing someone who has. Some teens said these conversations were more satisfying than interactions with real people, yet the Common Sense Media report says that one-third of teens have felt uncomfortable after interacting with an AI companion.
States are beginning to pass laws that address the mental health and safety risks posed by AI companions. Schools have a responsibility to be more proactive, with guidelines and support in place that focus on education, monitoring and content moderation rather than reactive discipline. Here are five questions school districts can consider when developing policies around AI companions.
1. What Do Students Understand About AI Companions?
Students may view AI companions as friendly, trustworthy confidants. What they often don’t understand is that these systems are designed to simulate emotional connection, not to provide genuine empathy, judgment or accountability. Districts should assess what students actually know, then deliver consistent classroom instruction that explicitly addresses how AI companions work, why they can feel so emotionally engaging, and the difference between simulated interaction and real human relationships. That grounding helps students develop a healthy, informed perspective.
2. How Can We Teach Students To Recognize Emotional Manipulation?
AI companions are programmed to sustain engagement, which can blur emotional boundaries and the line between what is real and what is not. Districts should provide instruction, and share information with families, that covers these risks. Students need to learn to recognize unhealthy dependency and emotional reliance so that AI interactions do not replace real-world relationships. Teaching students to reflect on their own behaviors and set boundaries helps prevent emotional dependency and supports overall digital wellness. Warning signs of manipulation by, or dependence on, AI companions include secrecy, emotional withdrawal and a preference for AI interaction over peers and adults.
3. What Monitoring and Safeguards Are Appropriate on Devices?
Balancing student safety with privacy is essential. Monitoring software on school-issued devices can help protect students by blocking certain AI companions or flagging language related to self-harm, isolation or inappropriate relationships. Equally important is transparency: communicating clearly with educators, students and families about what is being monitored, why it matters and how data is handled is essential to building trust and keeping students safe.
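For districts curious what flagging and blocking look like under the hood, here is a minimal sketch of keyword-based rules, assuming a hypothetical district tool that scans text on school-issued devices. The phrase lists, blocklist entries and function names below are illustrative assumptions, not any real product’s API; commercial monitoring tools rely on far more sophisticated classifiers paired with human review.

```python
# Illustrative sketch only. All names here (FLAGGED_PHRASES,
# COMPANION_BLOCKLIST, review_text, is_blocked_domain) are
# hypothetical, not drawn from a real monitoring product.

# Phrases a district might associate with each concern category.
FLAGGED_PHRASES = {
    "self_harm": ["want to hurt myself", "no reason to live"],
    "isolation": ["nobody understands me except", "only friend i have"],
}

# Example domains a district might add to its web filter's blocklist.
COMPANION_BLOCKLIST = {"character.ai", "replika.com"}

def review_text(text: str) -> list[str]:
    """Return the concern categories a piece of text matches, if any."""
    lowered = text.lower()
    return [
        category
        for category, phrases in FLAGGED_PHRASES.items()
        if any(phrase in lowered for phrase in phrases)
    ]

def is_blocked_domain(host: str) -> bool:
    """Check a requested hostname against the companion blocklist."""
    return host.lower().removeprefix("www.") in COMPANION_BLOCKLIST

if __name__ == "__main__":
    print(review_text("My chatbot is the only friend I have"))  # ['isolation']
    print(is_blocked_domain("www.replika.com"))                 # True
```

Consistent with the guidance above, the key design choice is what happens after a match: routing flagged text to a counselor for human review supports students, while automatic discipline undermines the trust that transparency is meant to build.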
