Jul 05 2016

Districts Shift to Manage Data Effectively

Schools use new strategies for data management, including solid-state storage and data deduplication.

K–12 CIOs are seeing a shift in what it takes to manage data effectively. The good news is that there’s some relief from the nearly insatiable demand for larger storage capacities seen in recent years. That’s because more information, including content generated by students and faculty, now ends up in the generous storage allotments available from public cloud providers and Software as a Service–based applications.

“We encourage users to put personal data on Google Drive as much as possible,” says Shannon Degan, technology director at Jackson County Intermediate School District in south central Michigan. “Our long-term goal is to assign our networked drives to shared users or for information that stays onsite for data privacy reasons, while standard documents go to the cloud.”

The cloud and other factors create new challenges for managing information effectively and keeping data centers running efficiently. Fortunately, IT managers have new tools for addressing these hurdles.

It’s All about the IOPS

Rather than focusing on raw storage capacity, JCISD analyzes input/output operations per second (IOPS), a storage system performance measurement.

“Our primary concern isn’t what data is sitting out there in files; it’s what data is actively being utilized for daily tasks,” says Ben Muscott, IS manager. During school hours, the district currently sees an average of 17,000 concurrent IOPS from the 200 virtualized servers supporting core resources like Microsoft SQL and Oracle databases. This represents a sharp rise from about 7,000 IOPS three years ago, Muscott says.
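Measuring activity rather than capacity is straightforward on most systems. As a rough illustration (not the district’s actual tooling), here is a minimal Python sketch that estimates a single device’s IOPS on a Linux host by sampling /proc/diskstats twice; the device name sda and the five-second interval are assumptions:

```python
import time


def read_io_counts(device: str) -> int:
    """Sum completed reads and writes for one block device from /proc/diskstats (Linux)."""
    with open("/proc/diskstats") as f:
        for line in f:
            fields = line.split()
            # fields[2] is the device name; [3] reads completed; [7] writes completed
            if fields[2] == device:
                return int(fields[3]) + int(fields[7])
    raise ValueError(f"device {device!r} not found")


def measure_iops(device: str = "sda", interval: float = 5.0) -> float:
    """Estimate IOPS by sampling completed I/Os over an interval."""
    before = read_io_counts(device)
    time.sleep(interval)
    after = read_io_counts(device)
    return (after - before) / interval


if __name__ == "__main__":
    print(f"~{measure_iops():.0f} IOPS")
```

Sampling like this at intervals during the school day is one simple way to arrive at the kind of concurrent-load figures the district tracks.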

In the past, overprovisioning storage capacity was the go-to strategy for boosting IOPS capabilities in storage-area networks. “The more drives you have, the more spindles that are available for retrieving data,” explains Terri McClure, senior analyst at Enterprise Strategy Group.

The logic is that a large number of disks working in parallel share the load and in turn reduce the time needed to retrieve the requested data. “So organizations overprovisioned storage for years to get spindles instead of capacity to help from a performance standpoint,” McClure says.

JCISD found a better way. Last fall, it added 26TB of solid-state drive (SSD) capacity to its existing SAN, which until then relied solely on 15,000-rpm serial-attached SCSI (SAS) disk drives. The IT department determined that in addition to the bump in storage space, it would gain an additional 30,000 IOPS. By comparison, adding 24 standard SAS drives would yield only an additional 4,200 IOPS.

Even more revealing was the price comparison. “On a per-drive basis, SSDs are definitely more expensive. But since we were more concerned with IOPS than capacity, the numbers tipped in favor of SSDs,” Muscott says.
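The back-of-the-envelope math from the article’s own figures shows why the numbers tipped: the 24-drive SAS example works out to 175 IOPS per spindle, so matching the SSD tier’s 30,000 IOPS with disks alone would take about 171 drives. A quick sketch:

```python
# Figures from the article: 24 SAS drives add 4,200 IOPS; the SSD tier adds 30,000.
sas_drives, sas_added_iops = 24, 4_200
ssd_added_iops = 30_000

iops_per_sas_drive = sas_added_iops / sas_drives           # 175 IOPS per 15K SAS spindle
sas_drives_to_match = ssd_added_iops / iops_per_sas_drive  # ~171 drives

print(f"{iops_per_sas_drive:.0f} IOPS per SAS drive")
print(f"~{sas_drives_to_match:.0f} SAS drives needed to match the SSD tier")
```

Even at a steep per-drive premium, a handful of SSDs beats buying, powering and housing well over a hundred spindles purchased purely for performance.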

The district’s experience isn’t unique. McClure says industries of all types are running the cost/performance numbers for storage and coming to similar conclusions. “At first, it seems counter-intuitive, but solid-state storage can actually help reduce overall storage costs,” she says. “Organizations can buy a lot less capacity because they’re not buying drives for performance purposes. They’re only buying the capacity they really need.”

Lower total cost of ownership isn’t the only upside of SSDs. “We’re able to process much higher throughput than in the past because latency is essentially zero with SSDs,” says Greg Wade, network engineer at JCISD. “By adding that first tier of SSDs, our whole data center is performing more efficiently.”

Dedupe Endures

Combining SSDs and high-speed SAS drives isn’t the only way school districts are optimizing storage investments. Data deduplication, or dedupe, remains an important tool in efficiency arsenals. By eliminating redundant copies of data, dedupe helps school districts make the most of both their production and backup storage capacities.
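At its core, block-level dedupe means fingerprinting chunks of data and storing each unique chunk only once. Commercial products do far more (variable-length chunking, compression, replication), but a toy Python sketch of the basic idea, assuming a fixed 4KB block size, looks like this:

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; real products often use variable-length chunks


def dedupe(data: bytes):
    """Split data into blocks, store each unique block once, keep a recipe of fingerprints."""
    store: dict[str, bytes] = {}   # fingerprint -> unique block
    recipe: list[str] = []         # ordered fingerprints to rebuild the original
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)  # only the first copy of each block is kept
        recipe.append(fp)
    return store, recipe


def rehydrate(store: dict, recipe: list) -> bytes:
    """Reassemble the original data from the recipe."""
    return b"".join(store[fp] for fp in recipe)


# A backup full of repeated content stores far fewer bytes than it represents.
backup = b"A" * 40_000 + b"B" * 40_000
store, recipe = dedupe(backup)
assert rehydrate(store, recipe) == backup
print(f"{len(backup)} logical bytes -> {sum(len(b) for b in store.values())} stored bytes")
```

Backup sets are especially dedupe-friendly because successive backups of the same servers repeat most of their blocks, which is why the technique pays off most in the scenario described next.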

Dedupe is particularly important for JCISD’s backup strategies. “We’re trying to keep passive data like backup stores off our SAN and instead move it to low-cost bulk storage environments,” Muscott says.

To do that, JCISD uses an ExaGrid backup storage appliance, with a 7TB capacity and block-level dedupe capabilities. The district pairs this with Veeam backup software for VMware environments, which includes dedupe and compression algorithms. While the software approach costs less than the appliance, the hardware offers higher performance, Muscott says. “But prices for the appliances are dropping, and they’re now at a point where we’re relooking at this option for video storage and backups for our server farm,” he adds.

New Roles for CIOs

Similar storage trends are impacting the Public Schools of Northborough and Southborough, Mass., says Leo Brehm, director of technology and digital learning. He says the onus for dedupe and capacity now falls mostly to cloud vendors, but other challenges arise. For example, storing related data in various locations leads to duplicate content and makes it difficult for people to share information. “Much of our management effort is now focused on the delivery and retrieval of data,” Brehm says. “Instead of deduping data on hard drives, we’re concerned with deduping work in process.”

Brehm is working to keep student demographic data and authentication information housed in central locations accessible by any authorized user. To do this, the schools are tapping industry protocols, such as standards from the IMS Global Learning Consortium for secure information sharing among systems. The IMS’s Learning Tools Interoperability (LTI) specification enables single sign-on, which means students and faculty can sign in once and gain secure access to all related applications, he says.
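Under the LTI 1.1 specification current at the time, a launch is an OAuth 1.0a–signed POST: the receiving application recomputes an HMAC-SHA1 signature over the launch parameters and compares it to the one supplied, so a password never changes hands. A simplified Python sketch of that verification (it ignores duplicate parameter names and query-string parameters, and assumes the shared consumer secret is already known) might look like this:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote


def verify_lti_launch(url: str, post_params: dict, consumer_secret: str) -> bool:
    """Check the OAuth 1.0a HMAC-SHA1 signature on an LTI 1.1 launch POST.

    `url` must be the normalized launch URL (lowercase scheme/host, no query
    string) -- a simplification of the full OAuth normalization rules.
    """
    supplied = post_params.get("oauth_signature", "")

    # Signature base string: method, encoded URL, and sorted, percent-encoded params
    pairs = sorted(
        (quote(k, safe=""), quote(v, safe=""))
        for k, v in post_params.items()
        if k != "oauth_signature"
    )
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base = "&".join(["POST", quote(url, safe=""), quote(param_str, safe="")])

    # LTI 1.1 uses no token secret, so the signing key ends with a bare '&'
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, supplied)
```

Because the tool trusts the signed launch parameters for the user’s identity, one sign-on at the portal carries through to every connected application.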

“As data centers continue to shrink, the role of the K–12 IT staff is evolving,” Brehm says. “The traditional systems management role is becoming more of a data management position. Instead of caring primarily about what version of SQL is running on a machine, we’re more focused on where the data on the machine is going, how quickly it’s getting there, and how often it’s needed.”

Scott Stewart