For schools with limited budgets, knowing which technology is worthy of investment is key. Why not use brain science to determine whether a tool is working for students?
The partnership led to increased engagement and improved test scores in three of Duval County’s K–2 schools. It provided a drastic change in some of the district’s most impoverished and low-performing classrooms, Kisya Johnson, a county official, tells eSchool News.
Using physical exercises and adaptive education games, C8 Science’s program, ACTIVATE, takes a holistic view of cognitive development and gathers data on student progress so that teachers can adapt each student’s program.
“When the students are on the brain training software, it helps them to focus, it helps them to be able to listen in class and do the work that they need to do,” says Wexler in the eSchool News article.
Tailored Research Supports Better Tech Use
For other low-income schools, knowing which ed tech tools actually work is a huge benefit. Evidence-based tools that account for cognitive development and how each student learns are one promising path to successful tech use.
Much like Duval County, the Starkville-Oktibbeha Consolidated School District in Mississippi rolled out a tech tool that used brain science to determine students’ reading levels and tailor lessons to help them improve, eSchool News reports.
But it often takes a while before the latest tech efficacy research hits the classroom. To avoid getting left behind on technology trends, many schools look to conduct their own research, suggests Theresa Ewald, assistant superintendent of Kettle Moraine School District in Wisconsin, in an EdTech article.
Ewald outlined a number of questions that schools should answer regarding new technology. The questions include:
To what degree does the content align with the national standards of the course?
Does the student data include success at the standard level?
Can the user data be manipulated to create small groups for classroom instruction?
Do teachers have frequent formative feedback about performance or suggested next steps?
Does the tool have an impact on learners who are at, above or below grade level, as measured by growth over time rather than pure achievement?
What is the level of engagement of learners after the tool has been used for several months?
Ewald also suggests that the best way to ensure a tech tool’s efficacy is to involve the stakeholder who uses it most: the teacher.
“Instead of a districtwide rollout of a tech device, teachers select the best digital tools for their learners and classrooms,” writes Ewald.