Numbers, test scores, averages, percentiles. Everywhere you look these days, the data jump up and beg to be used, manipulated and understood. Teachers and administrators know that it’s easy to find numbers, but figuring out precisely what they mean and using them to help students learn is a lot tougher.
Some school systems have found a way to help teachers uncover the hidden values in raw data. Using these values, administrators adjust curricula, and teachers adapt lesson plans for classroom instruction based on actual student needs.
To understand test scores, educators must first know what’s being tested, then find the strands that are embedded in the testing standards, and finally categorize the information into student and standards subgroups.
ALIGN WITH STANDARDS
Illinois helps educators make sense of their data by issuing an assessment framework, the Illinois State Report Card. This tool lists test data, student demographics, grade-level results and instructional settings, as well as statistics showing improvements by school or district. The results are posted on the Illinois State Board of Education Web site: www.isbe.state.il.us.
Suzette Lambert, the director of curriculum and instruction for the Harrisburg, Ill., Community Unit School District #3, regularly sends teachers in her district to the Illinois Assessment Frameworks Web site and reviews the data with them as a training method.
“Educators can check their individual school’s data against this framework,” she says, “to see how they are doing and adjust curricula to correct areas of weakness.”
For instance, the report cards reveal whether a school is making adequate yearly progress (AYP), a measure of year-to-year student achievement on statewide assessments that is part of the federal No Child Left Behind Act. “At the current time, if a school has 47.5 percent of its total students or subgroups [45 students in a particular group] meeting or exceeding [standards on] the test in reading or math, the school is making adequate yearly progress,” Lambert says. “Anything below that number is not making AYP.”
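The threshold Lambert describes is a simple percentage check. Here is a minimal sketch of that calculation, assuming the 47.5 percent benchmark and the 45-student minimum subgroup size she cites; the function name and sample figures are illustrative, not part of any state tool:

```python
AYP_THRESHOLD = 47.5    # percent meeting or exceeding standards (per Lambert)
MIN_SUBGROUP_SIZE = 45  # smaller subgroups are not counted separately

def makes_ayp(meeting_or_exceeding: int, total_students: int) -> bool:
    """Return True if the group's pass rate meets the AYP benchmark."""
    percent = 100.0 * meeting_or_exceeding / total_students
    return percent >= AYP_THRESHOLD

# Example: 52 of 100 students meet or exceed standards -> makes AYP
print(makes_ayp(52, 100))  # True
print(makes_ayp(40, 100))  # False: 40 percent falls below 47.5
```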
Lambert gives an example of how schools can adjust their curricula based on the assessment framework. “If a school district found through the Illinois Assessment Framework that the math portion of the Prairie State Achievement Exam covered 32 percent algebra, 19 percent geometry, and 4 percent data analysis, statistics and probability, the district would adjust its curriculum to cover more algebra than geometry and put less focus on analysis, statistics and probability,” she explains.
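The adjustment Lambert describes amounts to weighting instructional emphasis by each strand's share of the exam. The sketch below illustrates that proportional split using the strand percentages she cites; the 60-hour total and the idea of allocating hours strictly by coverage are assumptions for illustration, not district policy:

```python
# Share of the math exam devoted to each strand, per Lambert's example
exam_coverage = {
    "algebra": 32,
    "geometry": 19,
    "data analysis, statistics and probability": 4,
}

total_hours = 60  # assumed instructional hours available for these strands
covered = sum(exam_coverage.values())  # 55 percent covered by these strands

for strand, percent in exam_coverage.items():
    hours = total_hours * percent / covered
    print(f"{strand}: {hours:.1f} hours")
# algebra gets roughly 34.9 hours, geometry 20.7, data analysis 4.4
```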
This type of adjustment, says Lambert, “is an ongoing process of trying to be better. Once the results of an assessment are seen, we look for ways to improve immediately. Of course, the best method is to take a proactive stance instead of a reactive one.”
TEST FOR INCREMENTAL GAINS
Teachers should understand how the individual strands are grouped together to create a complete standard. “For example, if the standard refers to long division in math, the individual components might be the student’s ability to add, subtract, handle decimals and estimate,” says David Davare, director of research services for the Pennsylvania School Boards Association (PSBA).
“Once teachers are trained to understand that multiple strands are built into standards, they are much better equipped to figure out the areas in which their students need help,” he continues. “The teachers can then design lesson plans to precisely target those areas.”
Davare prescribes in-service training similar to that given by the PSBA. The training should have teachers asking questions such as the following: “If the demographics of these classes seem homogenous but test results show differences between classes, are these differences to be expected? What explanations might skew the results? What kind of learning elements can account for the differences?”
By sorting out the answers, he says, “Educators can put the data results into perspective and act on them.”
EXAMINE NATIONAL AND STATE STANDARDS
Cindy Black, the director of media and technology at the Harrisburg Community Unit School District #3, advises teachers to examine national and state standards when writing lesson plans. “Teachers need to better correlate their individual grade-level assignments with the entire K-12 curriculum,” she says. “They should be sure their grade assignments mesh with the knowledge the kids will be expected to have at later grade levels.”
Truly understanding and judiciously applying test results may be a long-term ideal in many districts, but it’s clear that some schools — including Harrisburg District #3 — are leading the way toward comprehension and insight.
BARRIERS TO THE EFFECTIVE USE OF DATA
Lack of training and interoperability are the main barriers to more effective data-driven decision-making, according to a 2004 survey conducted by Bethesda, Md.-based Grunwald Associates on behalf of the Consortium for School Networking, which is based in Washington, D.C.
50% Lack of training
42% Lack of interoperability: systems that are unable to share or exchange data
39% Lack of understanding of what to do with the data
36% Absence of clear priorities on what data should be collected
35% Failure to collect data in a uniform manner
31% Outdated technology/legacy systems
24% Low-quality data: inaccurate or incomplete
24% Timing of data collection
22% User interface too complicated for understanding reports
TIPS FROM THE PROS
Just as the medical profession practices evidence-based medicine, educators can apply researched and tested best practices to deal with data bombardment. Here’s some advice from the experts:
• Don’t compare pineapples and mangoes: Be sure you’re looking at comparable data.
• If your district has a curriculum coordinator, get help in breaking out the specifics of the scores.
• Ask your administration to provide in-service training, such as analytics workshops.
• Look at the data from every angle. Do the figures, for instance, apply across the board or to specific subgroups?
• Administrators should provide sufficient time for teachers to write or rewrite lesson plans to correct any weaknesses disclosed by the test scores.
• Examine state and local standards to see how they may also be broken out into their constituent parts before comparing them with student test data.
Claire Meirowitz, a writer and editor who specializes in information technology and education, is based in Babylon, N.Y.