Dec 01 2025

How Digital Monitoring Became One School District’s Warning System for Student Mental Health

Littleton Public Schools in Colorado employs a full-time staff member to investigate alerts generated by its digital mental-health monitoring tools.

In a four-year period at Littleton Public Schools, we experienced six student suicides. Each one was devastating, and each left us asking the same questions: Could we have known sooner? Could we have done more?

Then the pandemic hit. We couldn't see students in hallways, notice changes in behavior or have casual check-ins that might reveal if something was wrong. But we did have their digital activity, and that became our primary way of reaching struggling students. That experience demonstrated the power of digital monitoring as a possible early intervention tool, and we invested in this resource.

Since the pandemic, we've had two student suicides, down from six. Suicide is a complex issue, and preventing it is equally complex. Two student suicides are unquestionably two too many, and while I can't attribute the reduction to any single factor, I can say with certainty that we're intervening earlier and more often than we did before.


What We're Actually Monitoring in Littleton

We use two primary solutions for technology monitoring. One allows teachers to see what students are doing online during class time. While teacher usage varies, we've had instances where a student made a concerning comment, the teacher checked the student's browsing history and found troubling content, and we immediately initiated a threat assessment.

The other solution is a more comprehensive content monitoring platform that covers our district's Google Workspace accounts, starting in third grade. We state in our code of conduct that we will be monitoring school email and Google accounts. Parents and students are aware of this, and when we asked our student advisory committees how they felt about the monitoring, we were told, “We understand the sacrifice of our privacy, knowing that it could help somebody in need.” It is not uncommon for students to write a quote from a movie or song lyrics in a Google Doc and add a disclaimer: “These are song lyrics, please don’t flag me for this.”

The Human Element: Responding to 2,000 Alerts Per Week

In an average week, we can receive more than 2,000 alerts. The alert fatigue is real, and we learned quickly that our IT staff isn't equipped to handle it. So, we did something that most districts don't do: We hired a full-time cyber safety technician who reviews every alert. The system flags content related to anxiety, bullying, hate speech, weapons, violence, self-harm, suicidal ideation, drugs and alcohol, medical concerns, profanity and sex. There will always be false positives. Suicide flags spiked when classes were reading Romeo and Juliet, and during Colorado's bomb cyclone weather event, "bomb" triggered countless alerts.

But within those 2,000 weekly alerts, several hundred are concerning enough to warrant follow-up. Some are lifesaving catches: goodbye notes, lists of peers that students want to harm and reports of abuse. We've found content we would have had no other way of discovering, often in Google Docs students aren't sharing with anyone.


The district employee responsible for reviewing alerts, a former Court Appointed Special Advocate with experience in the probation system, is the linchpin for us, not just in managing the response but in knowing students and recognizing patterns. IT's role is limited to providing technical support for investigations.

We've seen rapid response times and positive results. We typically pull a student aside within 30 to 45 minutes of an alert. In one real-world scenario, we identified a student who had written a goodbye note and scheduled it to be sent at 3:30 p.m., after they'd left campus. We were able to get that student help before the school day ended.

This system also lets us celebrate the good: counselors responding to student emails at 10 p.m. on weekends and teachers building meaningful relationships that help struggling students. We find kids reaching out for help with vaping urges, depression or family problems, and staff members respond with exactly the level of support they need.


The True Value of Early Intervention Is Hard to Quantify

Colorado has seen more than a dozen school shootings since 1999, which has made student mental health a statewide priority. Districts in other states may not receive the same level of support as they try to enact similar measures.

That said, as the number of interventions in our district has increased significantly, the number of high-level crisis interventions has stayed flat. To us, that is progress: We're catching students earlier in their struggles, before they reach that critical point.

When I think about the students we've helped, the crises we've responded to, and the downward trend in our most tragic outcomes, the vigilance is worth it. We can't prove what we've prevented, because success means not finding out what might have been. But we do know that kids are still here who might not be otherwise, and for Littleton, that's impossible to put a price on.

