How to vet edtech tools

Eight questions you should ask when evaluating edtech tools and research.

Several years ago, Fayette County Public Schools, located just south of Atlanta, began implementing the Georgia Standards of Excellence, the state standards that frame how the district delivers math instruction. To meet the rigorous requirements of the GSE and to support students with an effective Response to Intervention pyramid, our math leadership team began searching for an instructional approach built on small-group instruction that would propel student achievement in math.

FCPS set out to find a tool that would make small-group instruction a fundamental teaching strategy districtwide. We wanted something that helped students develop a conceptual understanding of math while focusing on the Standards for Mathematical Practice. But with so many tools on the market, each backed by its own research studies, we had to think critically about the programs we considered.

To analyze learning tools and the research that accompanied them, we used eight questions developed by Tim Hudson, chief learning officer at DreamBox.

  1. Who conducted the study?
  2. What was the research methodology?
  3. Whose pre-test and post-test results were compared?
  4. Which students were included in (or excluded from) the treatment group?
  5. What were the school support requirements during the study?
  6. How large and representative was the researched population?
  7. How significant were the findings?
  8. To what extent did learning and achievement improve?
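Questions 7 and 8 deserve special attention, because a study can report a statistically significant result that still reflects only a small gain. The sketch below (in Python, using hypothetical placeholder scores rather than data from any real study) shows the two numbers a careful reader should look for: a p-value from a paired pre-test/post-test comparison, and an effect size that expresses how large the gain actually was.

    # A minimal sketch of the statistics behind questions 7 and 8.
    # The scores here are hypothetical placeholders, not real study data.
    from statistics import mean, stdev
    from scipy.stats import ttest_rel

    pre = [52, 61, 48, 70, 55, 63, 58, 66, 50, 59]    # pre-test scores
    post = [60, 67, 55, 78, 62, 70, 64, 73, 57, 66]   # post-test scores

    # Question 7: how significant were the findings? A paired t-test
    # asks whether the average gain could plausibly be chance.
    t_stat, p_value = ttest_rel(post, pre)

    # Question 8: to what extent did achievement improve? Cohen's d
    # expresses the average gain relative to its variability.
    gains = [b - a for a, b in zip(pre, post)]
    cohens_d = mean(gains) / stdev(gains)

    print(f"p-value: {p_value:.4f}, effect size (Cohen's d): {cohens_d:.2f}")

A tiny p-value answers question 7; only the effect size answers question 8.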

These eight questions provided a strong framework for reviewing efficacy studies and helped us narrow the field to the learning tools that would best fit our district's needs. We settled on two finalists and decided to run a yearlong pilot in some of our schools before committing to one solution long term. We then used Hudson's questions to design our own test.

We knew we wanted the research population to be large and widely representative, so we spread the study across a range of student populations, from Title I schools to more affluent ones. We also drew from schools in different geographic areas to ensure the sample was diverse as well as large.

We wanted the research to be consistent, so we used a uniform process to collect data from every sample group while also seeking narrative feedback from teachers and students. Gathering that teacher and student feedback is an essential part of any district leader's evaluation process.
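One way to keep such a process uniform is to have every respondent, at every school, rate the same criteria on the same scale alongside their open-ended comments. The sketch below (in Python) illustrates the idea; the criteria, scale and field names are assumptions for illustration, not the district's actual survey instrument.

    # An illustrative sketch of a uniform feedback record; the criteria
    # and 1-5 scale are assumptions, not the district's actual survey.
    from dataclasses import dataclass
    from statistics import mean

    CRITERIA = ["ease_of_use", "lesson_quality", "data_provided", "range_of_needs"]

    @dataclass
    class PilotResponse:
        school: str
        role: str            # "teacher" or "student"
        tool: str            # which piloted solution was rated
        ratings: dict        # criterion -> score on a 1-5 scale
        narrative: str = ""  # open-ended comments

    def summarize(responses, tool):
        """Average each criterion's rating for one piloted tool."""
        rated = [r for r in responses if r.tool == tool]
        return {c: round(mean(r.ratings[c] for r in rated), 2) for c in CRITERIA}

Because every response carries the same fields, results from a Title I school and a more affluent one can be compared directly, and the narrative comments stay attached to the scores they explain.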

When the results came in, it was clear that teachers and students favored one solution over the other. Ease of use, lesson quality, data provided, and the ability to address a wide variety of needs were among the reasons respondents gave for favoring the product.

There are hundreds of learning solutions flooding the market today. It’s critical that district leaders use third-party research to evaluate these tools. These eight questions will help you stay on track.

Mark Henderson is an assistant principal at Fayette County High School. Prior to that, he spent five years as Fayette County Public Schools' K-12 mathematics curriculum coordinator.

Tech Tips is a weekly column in SmartBrief on EdTech. Have a tech tip to share? Contact us at [email protected]
