Efficacy, Evidence, and Interpretation: Innovation Workshop

This was the first event of the Global Victoria EdTech Innovation Alliance. Two researchers led the workshop, exploring efficacy, evidence and data interpretation, followed by participant breakout discussions.

In the Innovation Workshop, participants heard from researchers Margaret Bearman, Research Professor within the Centre for Research in Assessment and Digital Learning at Deakin University, and Michael Henderson, Professor of Digital Futures at Monash University.

The workshop began with scene-setting and opening remarks by Emeritus Professor Beverley Oliver, Founder of EduBrief; Fiona Letos, Director of International Education and Study Melbourne at Global Victoria; and David Linke, Managing Director of EduGrowth.

In the core part of the workshop, Margaret Bearman and Michael Henderson explored three key themes:

Evidence
Efficacy
Interpretation

Evidence

Evidence can be observed at three stages: effects, outcomes and impact. Effects are the changes that occur. Some questions you might consider are:

What effects are expected/desired?
What are the possible undesired effects?

Outcomes are the specific and measurable effects. They tell us whether the changes have occurred. Some questions to consider are:

What are your objectives?
What effects will help you demonstrate progress in your objectives?

Impact tells the story of the effects and outcomes, focusing on long-term results or changes. Impact can be difficult to measure because it is tied to the experiences of people.

What are your impact goals?

If we aren’t careful about what we’re measuring or why we’re measuring it, this can have a negative impact on teaching and learning. For this reason, it is important to be critical of evidence itself. Be wary of jumping to conclusions or of treating correlation as causation.

Quantitative data isn’t always better than qualitative data; it depends on what you’re observing. What are the observable patterns? And which of them are measurable?

“Evidence is about building a compelling case”.

Consider the nature of evidence in relation to both reliability and validity. Reliability asks whether you would get the same results if you tested again; in a different context, with different variables, the results can change. Validity, in contrast, asks whether a measure actually captures what it claims to.

Efficacy

Evaluation is a key part of establishing the efficacy of a product or solution. Process evaluation describes how the project has been implemented, usually in relation to the intended process and expected milestones, effects or activities; it is what you deal with along the way. Outcome evaluation tells us what kind of change has occurred, typically in the target population and in reference to the stated objectives. Impact evaluation paints a picture of how a program might have affected participants’ lives on a broader scale.

There are four primary facets of efficacy:

Process
Outcomes are not the only measure of success. Rapid prototyping, for example, is a process-based evaluation that informs design. It not only records what you did, but also signals what you will do next. Consider how your product draws upon learning theory, aligns with good educational practice, and is ethical to use for both learners and teachers.

Intended outcome
Qualitative data can reveal contradictions. Intended outcomes are easy to write, but can be difficult to measure.

Acceptability
Whether users enjoy engaging with a product is not the same as whether the product is effective for learning and teaching outcomes. However, how much a student likes a solution is important: it can be a gatekeeper for whether the product is adopted in other departments or at other schools. For example, if a service nudged a learner each time an assessment came up, they might find this irritating. Even if the nudges had a positive impact on whether assignments were submitted on time, the product would not gain the desired acceptability.

Feasibility
This takes into account the cost, resources and timing of an intervention. Cost is not only financial; it also includes the time and effort demanded of educators and students.

Efficacy = P(rocess) x I(ntended outcome) x A(cceptability) x F(easibility)

“If any of the variables are zero, then the efficacy goes to zero. All are required to make a product efficacious.”
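
To make the multiplicative nature of this model concrete, here is a minimal sketch in Python. The 0-to-1 scoring scale and the example scores are illustrative assumptions, not part of the speakers’ framework.

```python
# Illustrative sketch of the multiplicative efficacy model.
# Assumption: each facet is scored on a 0.0-1.0 scale; the scale
# and the scores below are hypothetical, not from the workshop.

def efficacy(process: float, intended_outcome: float,
             acceptability: float, feasibility: float) -> float:
    """Efficacy = P x I x A x F: if any facet is zero, efficacy is zero."""
    return process * intended_outcome * acceptability * feasibility

# A product that scores well on three facets but is unacceptable
# to students (e.g. irritating nudges) still has zero efficacy.
print(efficacy(0.9, 0.8, 0.0, 0.7))  # 0.0
print(efficacy(0.9, 0.8, 0.6, 0.7))  # ~0.3024
```
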
Interpretation

Generally speaking, a data point does not speak for itself. Likewise, almost any piece of information needs to be understood in its context. We know intuitively that things can work differently in different places, used by different people. This is why we can’t cling too closely to any numerical formulas about evaluating evidence or efficacy. 

Even the very basics of quantitative data require some degree of interpretation. 

As EdTech is a business involving people, interpretation can be hard.

Having more people in the room to look at the data can be very helpful. As individuals, we can be stuck in our own interpretations of the information available to us, based on our own personal knowledge.

“Any form of efficacy or any form of evidence is about working with people. What’s feasible to one person isn’t to another, and what is acceptable to one person isn’t acceptable to another. There can be subjectivity to this.”

Speakers

Fiona Letos

Fiona Letos is the Director, International Education at Global Victoria. Global Victoria supports international education businesses and providers to diversify their global markets and products and to maintain Victoria’s position as a destination of choice for students, including through innovative student initiatives and the Study Melbourne brand. Fiona oversees a Global Education Network of 11 education specialists across key international markets.

Beverley Oliver

Emeritus Professor Beverley Oliver is a higher education consultant, speaker and researcher focussed on digital education, micro-credentials, curriculum transformation, quality assurance and graduate employability. Beverley is a past Deputy Vice-Chancellor Education at Deakin University, where she was awarded the title of Alfred Deakin Professor for her outstanding and sustained contribution to conceptualising the strategic enhancement of courses in the digital economy and furthering Deakin University’s research and scholarship in the field of higher education.

David Linke

David is the Managing Director of EduGrowth. Over 20 years, David has built a successful career in the education sector across Australia and the Asia-Pacific. He led the Asia-Pacific operations of Renaissance, a global education technology vendor. More recently David has established, scaled and exited numerous education technology and innovation businesses. The combination of David’s strong education sector experience and professional services background has seen him operate as a strategic consultant to EdTech startups, educational innovation consulting practices and venture capital partners.

Margaret Bearman

Margaret Bearman is a Research Professor within the Centre for Research in Assessment and Digital Learning (CRADLE), Deakin University. She holds a first class honours degree in computer science and a PhD in medical education. Over the course of her career researching higher and clinical education, Margaret has written over 100 publications and regularly publishes in the highest ranked journals in her fields. Recognition for her work includes Program Innovation awards from the Australian Office for Learning and Teaching and Simulation Australasia. Margaret’s interests include assessment and feedback, digital education, and sociomateriality.

Michael Henderson

Michael is Professor of Digital Futures in the Faculty of Education at Monash University. He is a world expert in the field of digital education, in particular the effective use of technology in internet-enabled teaching and learning. In 2020, Michael was identified by The Australian as the national Field Leader in Education research. Unique to his profile is that his research spans early childhood, schools, universities and professional learning contexts. Having attracted over $5 million in research funding from the Australian Research Council, philanthropic organisations and industry, his current research projects are generally aligned with three broad fields: assessment and feedback, risk (wellbeing and creativity), and effective teaching and learning with online technologies.

The Global Victoria EdTech Innovation Alliance program is funded by the Victorian Government as part of the $3.6 million International Research Partnerships program under the International Education Short-Term Sector Recovery Plan.

For more information about the initiative, visit Global Victoria EdTech Innovation Alliance.