By working together, Elkhart’s John R. Hill (left) and Daniel Rice have created a simple, but effective data-driven system all teachers can use.

Apr 11 2007

Case Study

Teaching by the Numbers
Every school district has more data than it can use. Learn how this rural Indiana district correlates the most important information to help students progress.

As the math department chair at Elkhart Memorial High School, Alex Holtz didn’t need to be convinced of the power of numbers. So he was pleased at the commitment and leadership district administrators demonstrated when they launched a data-driven decision initiative for teachers in 2004.

“Across the country there is a lot of pressure on schools to improve student achievement. What worked two generations ago in terms of instruction of 75 percent of the students isn’t good enough any more,” says Stephen White, a professional development associate at the Littleton, Colo.-based Center for Performance Assessment. The attention means school districts of all sizes are jumping on the data-decision bandwagon seeking answers to what works best and proof that a specific method is effective.

But what makes Elkhart Community Schools in northern Indiana stand out, White says, is its long-term commitment to get it right, even when that translates to still more changes.

Finding the Numbers

Fundamentally, the district has built a SQL database to sit beside its student management system — a repository that stores student demographics, test scores, grades, attendance and disciplinary information. It’s updated daily, and “our database manager spends a lot of time error-checking to make sure it’s clean,” says Daniel Rice, director of technology.
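The article doesn’t detail the repository’s structure, but a minimal sketch of a warehouse holding those five kinds of records might look like the following. Table and column names here are assumptions for illustration, not Elkhart’s actual schema, and SQLite stands in purely for convenience:

    import sqlite3

    # Illustrative schema only; names and fields are assumptions, not the
    # district's actual design. One table per kind of record mentioned above.
    conn = sqlite3.connect("student_warehouse.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS students (
        student_id  INTEGER PRIMARY KEY,
        grade_level INTEGER,
        school      TEXT,
        enrollments INTEGER         -- number of enrollment entries (mobility)
    );
    CREATE TABLE IF NOT EXISTS test_scores (
        student_id INTEGER REFERENCES students(student_id),
        test_name  TEXT,
        score      REAL,
        test_date  TEXT
    );
    CREATE TABLE IF NOT EXISTS grades (
        student_id INTEGER REFERENCES students(student_id),
        subject    TEXT,
        grade      TEXT
    );
    CREATE TABLE IF NOT EXISTS attendance (
        student_id    INTEGER REFERENCES students(student_id),
        days_enrolled INTEGER,
        days_absent   INTEGER
    );
    CREATE TABLE IF NOT EXISTS discipline (
        student_id    INTEGER REFERENCES students(student_id),
        incident_date TEXT,
        description   TEXT
    );
    """)
    conn.commit()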

Teachers log in to the data via the secure Web portal — which requires two servers, one facing in, the other facing out — that checks their clearance credentials and admits them to the section of the student management system that corresponds with their class. The screen initially presents top-level information to tease the instructor into digging deeper — for instance, she might see a child’s standardized math score sitting next to that student’s math grade and notice a discrepancy.
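How that scoping might work under the hood isn’t described, but a rough sketch is below: once the credential check passes, every query is joined against the logged-in teacher’s own roster, so only that teacher’s students come back. The table names and the teacher_id parameter are hypothetical:

    import sqlite3

    # Hedged sketch of the portal's scoping: after credentials check out, data
    # is filtered to the logged-in teacher's own classes. Names are hypothetical.
    def students_for_teacher(conn, teacher_id):
        return conn.execute(
            "SELECT s.student_id, s.grade_level "
            "FROM students AS s "
            "JOIN class_rosters AS r ON r.student_id = s.student_id "
            "WHERE r.teacher_id = ?",
            (teacher_id,),
        ).fetchall()

    # Tiny in-memory example so the sketch runs on its own.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE students (student_id INTEGER PRIMARY KEY, grade_level INTEGER);
    CREATE TABLE class_rosters (teacher_id INTEGER, student_id INTEGER);
    INSERT INTO students VALUES (1, 9), (2, 9), (3, 10);
    INSERT INTO class_rosters VALUES (42, 1), (42, 2), (7, 3);
    """)
    print(students_for_teacher(conn, teacher_id=42))   # [(1, 9), (2, 9)]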

“What we are trying to do is get data formatted in such a way that people see patterns,” Rice says. “If you’re looking at your class and see that one child has been absent 10 percent of the school days, that needs to be flashing in your face before it’s too late to do something about it.” Ditto the student who has six different school enrollment entries by the time he enters junior high.
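Rice doesn’t say how the flagging is implemented; a minimal sketch of that kind of check, with the 10 percent threshold and the record layout assumed for illustration, could be as simple as:

    # Minimal sketch of the "flash it in your face" check Rice describes: flag
    # any student absent for more than 10 percent of school days so far.
    # Threshold and record layout are assumptions for illustration.
    ABSENCE_THRESHOLD = 0.10

    def flag_attendance(roster, school_days_so_far):
        """Return (name, absence percentage) for students over the threshold."""
        flagged = []
        for student in roster:
            rate = student["days_absent"] / school_days_so_far
            if rate > ABSENCE_THRESHOLD:
                flagged.append((student["name"], round(rate * 100, 1)))
        return flagged

    roster = [
        {"name": "Student A", "days_absent": 2},
        {"name": "Student B", "days_absent": 10},
    ]
    print(flag_attendance(roster, school_days_so_far=90))
    # [('Student B', 11.1)]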

When a teacher digs into a curious top-level category, a clearer picture should emerge. Say a student boasts a 190 summary score on math assignments, compared with a second child at 180. Drilling into that 180 may reveal that his assignment scores were actually 200, 200, 200, 50, 200 and 200 — but for that one anomalous score, the student would be on par with the first. That’s a horse of a different color from a child who is struggling.

“The teacher’s goal now is to ask, ‘What did that child trip over and if I throw my coat over that mud puddle, can I help?’ ” says Rice.
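The summary scores in the example (190 and 180) suggest the district’s roll-up isn’t a plain average, so treat the following as a toy illustration of the drill-down idea only: one anomalous assignment drags down an otherwise strong record, and stripping it out shows the student on par with the top performer.

    # Toy illustration of the drill-down. Assume, for the sketch only, a simple
    # average of assignment scores; the district's real summary may differ.
    scores = [200, 200, 200, 50, 200, 200]

    overall = sum(scores) / len(scores)                   # 175.0 with the anomaly
    without_low = [s for s in scores if s != min(scores)]
    typical = sum(without_low) / len(without_low)         # 200.0 without it

    print(f"summary with the anomalous score:    {overall:.0f}")
    print(f"summary without the anomalous score: {typical:.0f}")
    # A large gap between the two numbers is the teacher's cue to dig into
    # that one assignment.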

Of course, none of this student information is new to Elkhart schools, but previously teachers and administrators searched through multiple filing cabinets to extract it.

Elkhart Community Schools chose to build its database in-house despite a cadre of vendors willing to tackle the job. “They’d say, ‘We can do A, B, C and D, and it will look really great.’ But E and F would cost more,” says Rice. Since he couldn’t look into a crystal ball and correctly anticipate the types of data the district might want in the future, it became more cost-effective to hire another programmer.

It also gave him the freedom to integrate data into standard spreadsheet formats. Now when teachers log in to view their data, they can click on the “export to Excel” button to work with the information in a familiar context.
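The article doesn’t say how the export is built. A minimal sketch of an “export to Excel” step, assuming the portal already holds the teacher’s rows in memory and using the common openpyxl library, might look like this; the column layout and file name are made up for the example:

    from openpyxl import Workbook

    # Hedged sketch of the export path; columns and values are illustrative only.
    rows = [
        ("Student", "Math grade", "Standardized math score", "Days absent"),
        ("Student A", "B+", 512, 2),
        ("Student B", "C", 498, 10),
    ]

    wb = Workbook()
    ws = wb.active
    ws.title = "My class"
    for row in rows:
        ws.append(row)
    wb.save("class_data_export.xlsx")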

Although such a system could track thousands of fields per student, Elkhart administrators are shooting to keep their records to fewer than 15 fields each. “Absolutely no more than that,” says John R. Hill, director of curriculum and instruction. “We are satisfied with what we have — the idea is to focus in tighter and tighter on smaller amounts rather than increasing the number of categories. That is exactly the wrong way to go. We can flood a system like this and incapacitate it.”

  • Full-time teachers: 945
  • Per-pupil spending: $10,690
  • Test scores
      • Elkhart SAT average: 1,444
      • Indiana SAT average: 1,493
      • National SAT average: 1,518

Source: www.elkhart.k12.in.us

The Hard Part

But data for data’s sake adds up to just numbers. “The secret to data-driven decision-making that I think really energizes schools and helps teachers find themselves in terms of their craft is this: When teachers work together on collecting data and then get around the table to analyze it and talk about the results, that’s where the magic happens,” White says.

So although step two in Elkhart’s journey isn’t as clear-cut at the rank-and-file level, leaders like Hill assuredly have a road map. His role early on was to design instructional workshops and insist every teacher and administrator attend. “This is not an if or when kind of thing,” he notes.

And Rice recognized that he’d have to sell the idea, too. “Because the reality is that schools are data rich as a matter of fact — flooded with data and always have been,” he explains. “But using it to make instructional decisions, that’s the new part. And actually refining out of all of the literally thousands of pieces of data that could be put into use which ones really make the most difference in terms of instructional decisions — that’s the part everybody has to learn.” With just under two years’ worth of practice at this, Rice admits some data teams are still more proficient than others, but he has succeeded in one sense: 980 out of 1,000 teachers have been through the data training.

He’s patient. He continues to ask teachers to meet in collaborative groups of four to seven members at least once a month — by grade at the elementary level; by subject area in the upper grades — and follow a five-step structured script:

  • Obtain the data the group needs around a particular standard or subject; for example, how many students are proficient at that point in time.
  • Look at student work to derive the traits of proficient work versus not-yet-there performances.
  • Set goals that determine what the teachers are working on with this age group, what they want them to do, how they will measure results and when. For instance, the group may want students to be proficient at word math problems by Oct. 31.
  • Assign instructional strategies to help students master the topic.
  • Determine how the group will recognize when students are proficient by setting exact expectations. “This is the most critical step,” says Hill. “This is how we know when students have arrived.”

He says the richness of such a collaborative dialogue outstrips any gains a teacher can make by poring over the data individually. Holtz agrees, which is why his advice boils down to this: stay committed to communication and collaboration. “I’ve been a teacher for 11 years and have seen a lot of fly-by-night ideas. This isn’t one of them,” he says.

The fourth-grade teachers at Bristol Elementary School used the data to determine standards they wanted to see their students clear in math and science. Jeff Antioho, one of those fourth-grade instructors, posts his students’ aggregate test scores on a chart inside the classroom and issues each child his own folder to keep track of data showing improvement or a standstill. “We tried to keep individual scores private, but the kids get so excited, they’ll start yelling, ‘That’s my score! That’s my score!’ when I put them up,” he says. The group also repeats the numbers on a chart in the hallway so parents and other administrators can see where the grade stands.

“The program has focused teachers’ attention on specific skill developments,” Antioho adds. “Obviously we want student achievement to be important, but this also focuses on what it is we as teachers are doing, too.”

When it comes to proof, Hill is cautious, saying only that he knows as the system becomes more well-oiled and consistent, teachers will know if their students are working at proficiency. Meanwhile, 11 of the district’s 19 schools showed enough academic growth on the 2005 state tests to be placed on Indiana’s “exemplary progress” list.

“We are not going to be satisfied until this data-driven decision model is in effect in every classroom,” Hill sums up. “We are going for consistent programming for students as they progress through Elkhart Community Schools.”

Proving Progress

Although Elkhart officials expect the big payoffs from the data-driven model to come in the future, here are some gains students have made recently:

  • 30% - fourth-graders who moved from non-proficient to proficient in math in the last year
  • 4.5 - the average gain in math scores from 2003 to 2006, from 57 to 61.5

Lessons Learned

Officials at Elkhart Community Schools in Indiana have run into a few roadblocks and bumps as they iron out how to implement data-driven decision-making in their district. Here are some of the lessons learned:

  • The database isn’t an everyday tool, even though it’s updated daily. Sure, the IT department sees that new grades, test scores and attendance records are up-to-the-minute, but “the data a classroom teacher uses every day is, ‘How is this kid behaving and performing in my room?’ The data we provide is more of a view from 20,000 feet, and although we do have people go in and look at it every day, I’d love to know what they’re looking at,” says Daniel Rice, director of technology. Once a week or every two weeks is more realistic.
  • A district can’t go at this project hammer and tongs 24 x 7. At some point you reach a level where you must stop and watch the program boil. For instance, “I don’t want to make it someone’s full-time job to tweak the data warehouse and then discover only 10 teachers a week are looking at some sections,” Rice explains. “If it’s not something they use, then we need to evaluate that.”
  • Teachers will push back. Teaching traditionally is a more private, individualized profession so many will feel uncomfortable in a collaborative setting, notes John R. Hill, director of curriculum and instruction. “Sometimes it comes down to forcing the issue and saying, ‘Yes, this is the format to follow,’ ” he adds.
  • Paper is merely a training tool. Rice admits he started bristling in early meetings when other administrators insisted on seeing the data immediately. He needed at least eight months from an IT perspective to pull off the necessary programming, formatting and clean data procedures. The compromise: a paper version delivered to teachers on a semiregular basis. “It’s two steps in the wrong direction, but it got the conversation about data-driven decision-making started, which was important,” he says.
  • Make this your sole initiative. Learning how to use data in decision-making is a complicated process, says Stephen White, a professional development associate with the Center for Performance Assessment, so don’t shoot it in the foot by making it one of 10 changes in the district this year.
  • Urban, suburban or rural — who cares? When it comes to data-driven decision-making, smaller districts have just as many affordable tools and consultants at their fingertips as larger ones. White works with a number of school districts around the country, and finds the concept itself is contagious.
  • Don’t hire a programmer for this purpose alone. Sure, Elkhart brought on another IT staffer when it launched the data-driven initiative. But Rice made sure this was just one of several projects he could assign to a full-timer. “These days we have to support library automation, food service — and all of these things get bigger and bigger. You get to the point where you say, ‘We’re pretty silly if we don’t get somebody who can stare at this big system and make sure they talk to each other, play nicely and don’t blow up,’ ” he notes.
Photography: Glenn Triest