How K–12 Schools Can Use Next-Generation Content Filtering to Keep Students Safe
Congress passed the Children’s Internet Protection Act (CIPA) in 2000, tying E-rate program discounts to a school’s internet safety policy. The FCC requires three elements for compliance: content filtering to prevent access to obscene, pornographic, or harmful images; monitoring of online activities of minors; and education in appropriate online behavior and cyberbullying awareness.
“The last time [CIPA] was reviewed was 2011,” says Ed Snow, a board member of the International Society for Technology in Education (ISTE) and director of technology for the School District of Milton in Milton, Wis.
“One-to-one programs really started birthing right around 2010–2012. It might be a good time to review the policies around CIPA at a federal level,” he says.
Even in the absence of a federal update, K–12 administrators can look carefully at their current internet safety policy. The internet of today bears little resemblance to the internet of 20 years ago, and website blocking software likely needs to change, too.
Content Filtering Is an Evolving Problem in K–12 Schools
As the internet evolves, administrators find themselves facing new vectors through which objectionable content can reach students.
“School districts now must contend with an increasingly mobile device-focused student body, a plethora of social media and other web apps, stealth attacks over encrypted traffic, as well as cloud-based SaaS applications,” notes Brian Patch of SonicWall.
CIPA’s narrow focus can also cause administrators to overlook other potential threats.
“CIPA requirements are specifically about access to obscene material, but schools now need to think about safety, bullying, radicalization, and more,” explains Amy Bennett of Lightspeed Systems.
Next-Generation Website Blockers Balance Online Access with Safety
Over its lifetime, CIPA has attracted criticism for more than just its narrow scope. In some cases, schools have faced accusations of internet censorship for overzealously blocking access to material.
“If CIPA did not exist, I believe most school districts, including mine, would still implement a content filter,” Snow says. “I still believe it’s the right thing to do.”
He emphasizes that striking the right balance between safety and access is a matter of thinking through a policy that takes into account the inherent abilities of the software. And those abilities have changed over time.
What will schools miss out on if they haven’t updated their approach to CIPA compliance?
“They’re likely over-blocking; that was the norm 10 years ago. And they may be using blanket policies instead of differentiating by grade or group,” Bennett says. “They’re also probably not taking advantage of AI to protect students against things like self-harm.”
CDW•G’s Amy Passow wrote about the potential of AI to act as a “multi-faceted digital assistant,” in part by using machine learning to contextualize internet searches and flag those that raise concerns. In that way, student searches can lead to mental health interventions and save lives — a phenomenon already documented in American schools.
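To make that idea concrete, here is a minimal sketch of rule-based search flagging. The category lists, context words, and skip logic are illustrative assumptions; production tools of the kind Passow describes rely on trained machine learning models and far richer context than keyword matching can provide.

```python
# A minimal sketch of search-term flagging, assuming a simple rule-based
# scorer. Real products use trained ML models; the phrase lists and the
# "academic context" shortcut below are illustrative assumptions only.

CONCERN_TERMS = {
    "self_harm": {"hurt myself", "self harm", "end my life"},
    "violence": {"how to make a weapon", "hurt someone"},
}

# Context words that suggest schoolwork rather than a crisis.
ACADEMIC_CONTEXT = {"statistics", "prevention", "history", "essay"}

def flag_search(query: str) -> list[str]:
    """Return the concern categories a search query appears to match."""
    q = query.lower()
    if any(word in q for word in ACADEMIC_CONTEXT):
        return []  # a real system would down-weight, not skip entirely
    return [cat for cat, phrases in CONCERN_TERMS.items()
            if any(phrase in q for phrase in phrases)]

if __name__ == "__main__":
    print(flag_search("ways to hurt myself"))        # ['self_harm']
    print(flag_search("self harm prevention essay")) # []
```

Even this toy version shows the core design tension: the same phrase can signal a crisis or a health-class assignment, which is why contextualizing searches matters as much as matching them.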
There’s more: older software may not be decrypting SSL traffic, which now carries the bulk of web activity and can hide objectionable content from inspection. Schools may be paying for hardware appliances whose work could be handled for less in the cloud; in some cases, they might even run multiple solutions to manage mobile devices separately from the network, Bennett says.
The current generation of website blocking software offers far more flexibility than its predecessors and can conform to the contours of a district’s acceptable-use policy. Patch points to SonicWall’s Content Filtering Service (CFS) as an example.
“SonicWall CFS compares requested websites against a massive database in the cloud containing millions of rated URLs, IP addresses and websites. It provides administrators with granular tools to create and apply policies that allow or deny access to sites based on individual or group identity — students, faculty, visitors — or by time of day, for more than 50 predefined categories,” he says. “To block objectionable and unproductive material more effectively, administrators can also create or customize filtering lists.”
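As an illustration of how such identity- and time-aware policies might be evaluated, consider the sketch below. The policy structure, group names, and category labels are hypothetical and do not represent SonicWall’s actual API; in a real deployment, a URL’s category would come from the vendor’s cloud-rating database rather than a hard-coded value.

```python
# An illustrative sketch of filtering by group identity, time of day,
# and content category, in the spirit of what Patch describes. All
# names and values here are assumptions, not SonicWall's actual API.
from dataclasses import dataclass
from datetime import time

@dataclass
class Policy:
    group: str                    # e.g., "students", "faculty", "visitors"
    blocked_categories: set[str]  # categories to deny for this group
    start: time                   # window during which this policy applies
    end: time

POLICIES = [
    Policy("students", {"adult", "gambling", "games"}, time(8), time(15)),
    Policy("students", {"adult", "gambling"}, time(15), time(23, 59)),
    Policy("faculty", {"adult"}, time(0), time(23, 59)),
]

def is_allowed(group: str, category: str, now: time) -> bool:
    """Deny if any policy matching this group and time blocks the category."""
    for p in POLICIES:
        if p.group == group and p.start <= now <= p.end:
            if category in p.blocked_categories:
                return False
    return True

# In practice, the category would come from a cloud lookup of the URL.
print(is_allowed("students", "games", time(10)))  # False: class hours
print(is_allowed("students", "games", time(17)))  # True: after school
```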
Lightspeed Systems has also made significant strides in minimizing overblocking by offering fine gradations of control across several axes, including grade, group, class and site type.
“We know that each school has a different culture and different needs,” Bennett says. “We make things like YouTube safe. We have social media controls that let schools block, allow as read only, or allow social media sites. We have a very granular and school-specific database that lets schools allow educational games but block mature games, and to allow sex education sites while blocking non-educational sites with sexual content.”
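One way to picture those gradations is as a per-group configuration table, sketched below. The schema and values are hypothetical, not Lightspeed’s actual policy format; the point is that actions such as “read only” sit between a blanket allow and a blanket block.

```python
# A hypothetical sketch of the granular, per-group controls Bennett
# describes. The schema, group names, and category labels are assumed
# for illustration; they are not Lightspeed's actual policy format.

FILTER_CONFIG = {
    "elementary": {
        "youtube": "restricted",      # forced into a safe/curated mode
        "social_media": "block",
        "games.educational": "allow",
        "games.mature": "block",
    },
    "high_school": {
        "youtube": "restricted",
        "social_media": "read_only",  # students can view but not post
        "games.educational": "allow",
        "games.mature": "block",
        "sex_education": "allow",     # while adult content stays blocked
    },
}

def action_for(group: str, category: str) -> str:
    """Look up the action for a group/category pair, defaulting to block."""
    return FILTER_CONFIG.get(group, {}).get(category, "block")

print(action_for("high_school", "social_media"))  # read_only
print(action_for("elementary", "social_media"))   # block
```

Defaulting unknown categories to “block” reflects the conservative posture most districts take; the granularity comes from carving out deliberate exceptions per group rather than loosening everything at once.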
Content Filtering Solutions Should Consider the Human Element
Content filtering, like many challenges in tech, is bound to a human element that shouldn’t be overlooked.
“Something that parents and educators should know is that it’s impossible to block every piece of unwanted material,” Snow says. “It’s called the world wide web for a reason.”
Filters whose databases update daily, as new content appears and schools around the world refine their block lists, are a great start, but a complete solution involves education.
“It’s about human infrastructure,” Snow says. “The technology only takes us so far.”
This is especially true for older students, who can often access the unfiltered web through the cellular plans on their own smartphones.
“Educating our students is the most powerful tool we have,” he adds.
He points out that most modern filters allow students to request access to specific blocked material. It’s an occasion for conversation.
“Is it a hurdle? Yes. But I think it’s also an opportunity to educate them on safe internet practices and surfing,” Snow says.
In the end, a holistic approach combines the best of tech with a team attitude.
“Content filtering and student safety are best deployed as whole school initiatives, so everyone knows why you’re filtering,” Bennett says. “When IT, administrators, teachers, parents, and students understand filtering and understand what safe internet access looks like, we can all work together to do the best job of keeping students safe while providing access to the rich educational materials that will help them learn.”