Content Filtering Helps Ensure CIPA Compliance
St. Andrew’s Episcopal School’s motto, Inveniemus Viam Aut Faciemus, translates to “We will find a way — or we will make one.”
Chief Information Officer Kevin O’Malley took this philosophy to heart when considering how to manage web content filtering at the pre-K–12 school in Ridgeland, Miss. When he began rolling out a one-to-one notebook program for grades 9 through 12 in 2000, he built a wireless network that could deliver anytime, anywhere access. But he understood that such freedom also brings the potential for student exposure to inappropriate websites during (and after) school hours.
To minimize the threat, O’Malley and his team evaluated several filtering solutions, “but they were designed more for business environments, where the line is less gray as to what’s acceptable,” he says. They chose Websense Web Security Gateway for its filtering flexibility, ease of management and robust reporting features.
Web filtering also has helped IT staff manage malware infections on student notebooks. O’Malley says students are vocal about their frustration when they run into this safety net. “They’ve made a verb out of Websense,” he explains. “They cry that they’ve been ‘Websensed’ when a site they want to visit is blocked.”
But it’s not for nothing that students cry foul. As social media sites have evolved to include legitimate educational content, “we’ve had to adjust our [access] policies,” O’Malley says. “The use of the quota time offered by Websense permits educational use of select sites while limiting abuse. We haven’t had a problem with over- or under-blocking. We do get requests from teachers to unblock certain resources, but not often.”
The Perils of Policing
The issues are similar, if not more intense, in larger school settings. Michael Tuttle oversees 4,000 in-district computers in seven schools as chief technology officer for the Enlarged City School District of Middletown in Orange County, N.Y. “It’s very difficult to police everyone’s daily actions on the Internet,” says Tuttle, who has relied on EdgeWave’s ePrism Email Security Suite and iPrism Web Security for the past six years. That difficulty extends to tracking sites visited by computer IP address and through override accounts, as well as those visited by 7,000 students and more than 1,000 administrators, teachers and staff. “Overall, the equipment is solid,” Tuttle says. “I like that the filters are updated daily for new digital threats.”
Tuttle is proud of the filtering approach he’s employed at Middletown — and with good reason. A few years ago, the Schools and Libraries Program of the Universal Service Fund, which oversees the disbursement of federal E-Rate funding, audited his district. The auditors tested a variety of areas, including Middletown’s Internet filtering systems. The district passed all tests without any issues.
Tuttle admits to wrestling with the over-/under-blocking conundrum, however. “This is the biggest headache for technology leaders,” he says. “Based on preset configurations, you can have everything or nothing blocked.” When appropriate, “we do try to accommodate employee requests to unblock sites,” he adds.
At the heart of this challenge is the Children’s Internet Protection Act (CIPA), which imposes certain requirements on schools and districts that receive funding for telecommunications technology via the E-Rate program. CIPA specifies that schools must have an Internet safety policy with technology protection measures “that block or filter Internet access to pictures that are obscene, child pornography or harmful to minors.”
Content filtering technologies help ensure compliance with CIPA while also protecting students and district resources from exposure to harmful content. Tuttle says iPrism keeps Middletown compliant by blocking the major categories of content outlined in CIPA. At the same time, he concedes, “I don’t think an Internet filter company could last if its products didn’t meet CIPA requirements.”
CIPA in Practice
CIPA is clearly necessary because of the enormous amount of objectionable content that can be found online. Some students will seek out such content if given the opportunity, but many others are inadvertently exposed to it while doing legitimate web searches for class assignments and activities.
The State of the Law
As the Consortium for School Networking (CoSN) notes in its 2011 publication, “Acceptable Use Policies in the Web 2.0 and Mobile Era: A Guide for School Districts,” a number of states have enacted legislation pertaining to Internet use in schools. Such legislation typically falls into one of two categories:
- The first type requires school districts to filter or block harmful materials. Failure to comply can result in the removal of state funding.
- The second type protects students from cyberbullying.
Find out more at CoSN.
To protect students, “schools absolutely must implement robust web filtering technology,” says Michael Osterman, president of Osterman Research, which specializes in new and emerging technologies. Developing policies about what will be filtered, what will be allowed and how objections will be managed should be their first step, he adds.
As network security administrator for the Fayette County (Ga.) Board of Education, which serves 30 schools in the metropolitan Atlanta area, Lee Bailey is responsible for enforcing FCBOE’s acceptable-use policy (AUP) regarding content filtering. Given the scale of the area FCBOE oversees, Bailey says malware attacks and attempts happen all the time. “It’s one of those never-ending battles for truth, justice and the American way,” he says wryly. “I look at the filter log every day.”
CIPA and the threat of lost federal funding make filtering a black-and-white issue, but policy enforcement must remain a dynamic activity, Bailey continues. “Humans help to assign the categories, so sometimes things happen by mistake,” he says. For example, the county chamber of commerce’s website once was blocked as pornographic even though it wasn’t. While manually building its blocked list, the filtering company entered the wrong IP address and erroneously blocked the site — an infrequent error, Bailey says.
Often, it’s the teachers who discover that sites have been blocked unnecessarily. When that occurs, Bailey unblocks them. In fact, district procedures specify that choosing to unblock a site should be a curricular, rather than a technology, decision. “That places the decision where it belongs: with those who support teaching and learning,” he explains.
Still, Bailey maintains that it’s essential for districts to write and regularly update AUPs for students and staff, as well as a discipline policy that spells out the consequences of noncompliance. “You can’t just have an understood policy,” he stresses.
This need for sound policies “is critical because schools are caught between a rock and a hard place,” Osterman explains. If the IT department filters too little, students can be exposed to objectionable content and the district becomes vulnerable to lawsuits, he says. But districts that block too much also can be sued. Two recent cases brought by the American Civil Liberties Union, for example, accused districts in Missouri and Tennessee of unfairly blocking access to websites catering to lesbian, gay, bisexual and transgender audiences.
The bottom line, Osterman says, is that district officials should consult with legal counsel, educational associations and others before they deploy content filters or allow web access using school resources.