5 Research Tips for EdTech Product Development

As part of the Innovation Workshop, the lead researchers in the Global Victoria EdTech Innovation Alliance provided their key advice on EdTech efficacy research. These top 5 tips can also be found in the whitepaper: “Designing an EdTech Efficacy Research Report.”

Undertaking research for education technology solutions is complex: there are hoops to jump through before educators and management come on board, and a positive user experience does not necessarily mean educational impact. The Global Victoria EdTech Innovation Alliance provides a support team to help education technology companies navigate the quirks of this kind of research.

The Alliance coordinates cooperation between education providers, education technology companies and academics to trial technologies that address learning challenges and enhance learning outcomes. Not only are the teams testing the real-world efficacy and impact of their solutions, they are also observing whether their products can be exported and prove useful in education settings outside Australia.

Researchers Drs Margaret Bearman, Michael Henderson and Phillip Dawson have provided these 5 tips for education technology companies evaluating and interpreting data.  

Be critical of the evidence itself — don’t jump to conclusions too quickly

Sometimes, what you find may be too good to be true. It’s important to take a step back and analyse the evidence in front of you. Is the evidence reliable? Would you get the same results again and again?
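As a rough illustration only (not drawn from the whitepaper), the Python sketch below uses made-up pilot score gains and a simple bootstrap to ask how much a headline result might move if a similar trial were run again:

```python
import numpy as np

# Hypothetical post-test score gains from a small pilot (illustrative values only).
gains = np.array([4, 7, 2, 9, 5, 3, 8, 6, 1, 5, 7, 4])

rng = np.random.default_rng(seed=42)

# Bootstrap the mean gain: resample the pilot data many times to see how much
# the headline figure would shift across repeated, similar trials.
boot_means = [rng.choice(gains, size=len(gains), replace=True).mean()
              for _ in range(10_000)]

low, high = np.percentile(boot_means, [2.5, 97.5])
print(f"Mean gain: {gains.mean():.1f}")
print(f"95% bootstrap interval: {low:.1f} to {high:.1f}")
# A wide interval suggests the result may not repeat in a second trial.
```

A check like this doesn’t prove impact on its own, but it is a quick way to see whether a promising number rests on a handful of learners or on a pattern likely to hold up.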

Whether the learner likes the solution is not proof that the solution is effective

Likeability can help your EdTech solution become more successful, but you still need the foundations of learner impact and alignment with the curriculum. While there are standardised expectations of learner comprehension, each education institution has its own varying needs when teaching its students.

Quantitative isn’t always better than qualitative — it depends on what you’re observing

What does your solution seek to achieve with learners and/or educators? Evidence may not be straightforward; it requires some consideration of which outcomes you’re looking for and how to measure and find them.

Each data point needs context to be understood

Context is key: data can never be understood on its own. Observe the contributing factors that may have made your solution successful or unsuccessful, and carefully consider the learner outcomes that result from using your solution.

Have multiple people and perspectives review the data

As individuals, we can become stuck in our own interpretations of the information available to us, shaped by our personal knowledge. Invite others on your team, or a paid analyst, to review the data you have. You never know: there may be a pattern or insight hidden in the data that changes the way you view your solution.
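One illustrative way to put this tip into practice (again, an assumption on our part rather than a method prescribed by the researchers) is to have two people code the same data independently and measure how often they agree. The sketch below uses hypothetical reviewer codes and Cohen’s kappa:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels: two reviewers independently code the same ten learner
# interviews as showing "improved" (1) or "no change" (0).
reviewer_a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
reviewer_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"Cohen's kappa: {kappa:.2f}")
# Values near 1 mean the reviewers read the data the same way;
# low values signal that interpretations differ and are worth discussing.
```

Low agreement isn’t a failure; it is a prompt for the conversation this tip recommends, where differing readings of the same data are surfaced and resolved.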

To find out more about the researchers’ recommendations, download the “Designing an EdTech Efficacy Research Report” whitepaper.

The Global Victoria EdTech Innovation Alliance program is funded by the Victorian Government as part of the $3.6 million International Research Partnerships program under the International Education Short-Term Sector Recovery Plan.

For more information about the initiative, visit Global Victoria EdTech Innovation Alliance.