Co-creating with Your Customer: Pedagogy and Piloting

In the latest Insight Seminar from EduGrowth, industry experts meet to discuss pilot programs for startups and the critical role of co-creating EdTech to meet a school’s individual pedagogy and goals.

Lyn Hay from the Leading Learning Institute brings together two of Australia’s experts on EdTech development, innovation, and success to explore the role of co-creation, pedagogy, and pilot programs in an EdTech startup’s success. Her guests are:

  • Lauren Sayer, Executive Director of Digital Learning, Research, and Innovation at Melbourne Girls Grammar
  • Herk Kailis, CEO and Founder of Cadmus.

Highlights of the conversation follow, along with the full transcript:

Defining and aligning pedagogy

Lauren Sayer and Herk Kailis begin by giving their basic definition for pedagogy:

  • Lauren Sayer: Pedagogy is what a teacher does, day-to-day, to influence learning. It is teacher practice.
  • Herk Kailis: Pedagogy is teaching and learning. It is meeting the learner where they are, and finding solutions to their learning needs.

For EdTech startups, it is important to understand the pedagogy of the schools they are targeting. To this end, EdTechs should learn as much as they can about the institution and align their products toward the user. Herk gives some advice:

  • Discover how the school assesses the success of an EdTech product. Is it grades? Student retention? The student experience?
  • Learn what the school needs first, and then tailor your product to fit that.
  • Schools also carry a philosophy of learning, whether it is explicit or not. If your product contradicts the school’s philosophy, it will not succeed in that context.

Lauren adds that schools across Australia, and certainly across the world, are not homogenous. What works for one school may not fit the pedagogy or the needs of another school. Product alignment needs to be a constant process.

“Schools tend to have a pretty strong standpoint as to their methods of teaching and learning and what they hold dear to them.”
Lauren Sayer, Melbourne Girls Grammar

Tailoring your product to the customer

Knowing that products need to be aligned to the needs of particular institutions, Herk and Lauren discuss how an EdTech startup can begin learning about a school and preparing to engage with them.

  • First, read all the publicly available information about a potential customer.
  • Universities often publish a great deal of information online, making them an easier target than K-12 in this respect.
  • Talk to more than one person in a given school. There are multiple perspectives in a single institution.
  • The EdTech ecosystem is a small environment and faculty move from school to school. Try to use these connections to your advantage.

In Lauren’s experience, small workshops tend to be better networking opportunities than large conferences. In the latter an institution may engage with dozens of EdTech organisations, but none of them has a tailored product for her school’s needs. In the former, EdTech leaders can learn exactly what a school needs and can begin to tailor a product that fits those criteria.

“A bit of a pro tip is that there’s more benefit in showing up to the small discussions and sitting next to a decision maker than there is standing in a trade show in the hope that a decision maker will walk up to you.”
Lauren Sayer, Melbourne Girls Grammar

Startups and universities

K-12 schools and universities require different approaches due to the difference in their composition, management, and overall goals.

  • Universities’ pain points are often easier to locate due to the large amount of information published online.
  • Universities are looking for ways to make money or save money, and may be more attracted to money-saving products than K-12 will be.
  • Universities are less numerous but older and larger than K-12 institutions. They have long memories. Burning a bridge with a university may have very long-reaching consequences.
  • K-12 institutions are numerous. So if a product doesn’t work in one school, it very well may work in the next one.

“Launching fast in a higher ed setting can backfire if not done correctly because you can get really bad press. You can have shocking results. They’ll tell everyone in the groups that they frequent, and then that can keep you out for a couple of years.”
Herk Kailis, Cadmus

Finding a champion partner

Early on, it can benefit EdTechs to partner with one or two “champion” institutions that have a vested interest in piloting the startup’s product, helping to develop it, and even marketing and presenting it at later stages.

  • IT is often not the department in which to find a “champion” contact. Try to communicate with school leadership as much as possible.
  • Institutions which pilot the program can receive the final product at a discount to encourage their participation.
  • To encourage an institution to pilot a project, show them how it targets their particular needs and benefits them in the long term.

“You’re really seeking out a champion, or a couple of champions, who see the need. They’re going to be able to push it through the different levels of decision making that are happening in a large organisation like a university. It’s because they want to be actually seeing it implemented with their academics.”
Lyn Hay, Leading Learning Institute

Measuring success in EdTech and institutions

Leaders in education have many EdTech products to consider and will often choose the one that scores high on their particular metrics.

  • Success metrics can include: 
    • Does it improve the life of students?
    • Does it improve the life of teachers?
    • Is it going to improve learning productivity and teaching productivity?
    • Is it easy to use?
    • Does it save us money?

  • Success metrics often do not include:
    • Grades
    • Viewership
    • Enrolment

In other words, the important metrics for an EdTech’s relationship with institutions are often more qualitative than they are quantitative. It’s about how well-received the product is and not necessarily how much it is used.

Herk adds that, while the metrics are important, there are several baseline requirements an EdTech product must meet before success metrics even come into consideration.

  • A school will often reject a program wholesale if it does not already include:
    • Single sign-on
    • A privacy policy
    • Technical due diligence and compliance
    • Easy integration into the existing ecosystem

  • So many base requirements create a high barrier to entry.
  • Speak to students and teachers directly to understand if their needs are being met or not.

“There are a number of things you need to get right just to come to the party, and often that makes it a very high barrier to entry, especially in universities.”
Herk Kailis, Cadmus

An increasingly complex EdTech ecosystem

Lauren Sayer expresses how much the world of education has changed in just a short number of years. Some considerations that are key today would not have even appeared on the radar several years back. She notes that, today:

  • Data stored off-shore is a deal-breaker. Institutions and individuals demand control over their data.
  • Students are more data-aware than ever before.

As a result, Herk cautions us to be constantly learning because the education ecosystem will continue to grow, develop, and change in the years to come. The best ways to stay informed are to:

  • Learn from your good customers. They are personally invested in your product and will guide you to what works and what doesn’t.
  • Bad customers teach little to nothing. If they don’t like a product, they will discard it. Little constructive feedback results.

“So find the good customers, listen to them, figure out your processes, fill in the gaps.”
Herk Kailis, Cadmus

Full Transcript

Lyn Hay:

Good afternoon everyone and welcome to the second EduGrowth LaunchPad Essentials Insights seminar. I’m Lyn Hay. I’m the director of the Leading Learning Institute and I have done a little bit of work with EduGrowth as the innovation lead at Charles Sturt University for the last few years. I’ll moderate today’s insight seminar on behalf of the EduGrowth team. EduGrowth is Australia’s education technology and innovation industry hub, and through connection and collaboration EduGrowth accelerates Australia’s EdTech ecosystem globally.

This afternoon you will be hearing conversation from Lauren Sayer, the executive director of digital learning and research and innovation at Melbourne Girls Grammar School, and Herk Kailis who’s the CEO and founder of Cadmus.

This afternoon we’re exploring the topics of pedagogy and piloting which is certainly something that has a great impact on those people who are trying to enter into both the K-12 and higher ed sector as startups.

Lauren Sayer has been a leading voice in the Victorian and international education community throughout her 30-year career. She’s now part of the executive of Melbourne Girls Grammar School, where she’s developing best practices in learning technology and digital literacy, and building an evidence base for modern digital capabilities. And also welcome to Herk Kailis, the CEO and founder of Cadmus, an EdTech company that’s improving the quality of education globally through a better assessment experience. Herk leads the executive functions of Cadmus and is responsible for commercial operations, including university partnerships.

I might start with you first, Lauren. Would you like to start with defining for us what you see as pedagogy?

Lauren Sayer:

I think pedagogy is an over-complicated term sometimes. If we look at what pedagogy is, it’s what a teacher does each day to influence learning. So it’s teacher practice at heart. Pedagogy sits really strongly in that teaching and learning ecosystem.

Lyn Hay:

Herk, from your perspective in an EdTech company, how do you view pedagogy, and how does that inform the work that you do?

Herk Kailis:

For us, pedagogy has always been an important consideration, and it’s something we often look to as a synonym for beginning with the customer. For us, it’s figuring out what problem we’re trying to solve, aligning that with the pedagogy, and then looking for a solution. Pedagogy to us means teaching and learning; it’s a fancier word for that, and we try to move away from the jargon. So how do we implement the best teaching and learning for the specific customer?

Lyn Hay:

It’s synonymous with starting with the customer’s needs. How then does pedagogy inform the user experience and user interface for your product?

Herk Kailis:

With a specific product or a specific need, the pedagogy might be different around what that specifically is. So for assessment, understanding what the relevant pedagogy is is always related to understanding what problem you’re trying to solve, and then finding the intersection between the two. It’s different for different things inside universities, and it’ll be different for different problems inside schools as well.

For us, figuring out what the right pedagogy is for assessment really aligns to what problems we are trying to solve around assessment. More broadly, what problems are we trying to solve and address in the institution, whether that’s student experience, graduate outcomes, student success and grades, making sure students don’t drop out of the institution, etc.? How does the function of assessment, and the pedagogy around assessment, help to improve those benefits and outcomes in a way that satisfies the different user groups in an institution? That runs from those who adopt it, whether they’re students or teachers, all the way to those who have an impact on the solution and what it might be. There are lots of examples of solutions that might work, but the pedagogy is wrong and they ultimately don’t make it to market.

Lyn Hay:

Lauren, would you like to add your perspective on that, particularly from a K-12 perspective?

Lauren Sayer:

I think the philosophies around teaching and learning in K-12 differ from context to context. It goes back to what Herk said of finding your customer’s needs and where they sit before you go in and offer a product. If you look at a school that has a strong focus on explicit instruction and being able to teach in that specific way, that will be a very different product offering than a school that’s very interested in a project-based learning or design thinking approach. I don’t think I’ve ever had an EdTech company ask, “what is the school’s teaching and learning philosophy?” or, “what is the pedagogy of the school?”, and I think it’s a really key question because that will allow you to match up.

One of the big pitfalls around that is not understanding the context well. I know I’ve worked in contexts where there’s been strong philosophical opposition to certain areas, and then a technology company comes in and goes, “well, we want to do this.” It’s never going to get off the ground because that is not the philosophy of that school, and schools tend to have a pretty strong standpoint as to their methods of teaching and learning and what they hold dear to them. The good thing about that is there are lots of schools and lots of different processes. From a market perspective, if School A doesn’t fit your product, don’t worry: there are a lot of schools. It’s really understanding that there’s no one fit for all schools. Especially in the Australian EdTech environment, schools are incredibly diverse. We don’t tend to have a homogeneous teaching and learning philosophy in our schools across Australia.

Lyn Hay:

How much does an individual teacher or principal inform the pedagogical approach or even the philosophy in a school? What is your philosophy at Melbourne Girls Grammar, for example, with respect to what we’re talking about here today?

Lauren Sayer:

I think your principal always has a strong influence on your school, but good principals bring all of their teachers along. It’s part of a global discussion, and it’s really interesting that you’ve put this in because it is exactly what we’re going through right now at Melbourne Girls Grammar. We are sitting down as a learning organization and looking at what we want as far as a learner profile and our ways of working for our girls. But we are also looking at what we hold dear in terms of our teaching. We strongly follow a science of learning and neuroscience perspective at our school, with a strong hybrid technology backbone. But we do see our teachers, and the formation of knowledge in an explicit sense, as being very dear to us.

Lyn Hay:

I think that’s really important because what you’ve shown is for any EdTech startups, if they were looking at approaching either a university or a school, they really need to understand where the context of that school or university is coming from in terms of that.

Herk, would you like to add anything to that because I know that you’ve worked with some different universities and I’m sure you would have seen or felt different approaches.

Herk Kailis:

It’s always different at different institutions, and part of that comes down to segmenting your market correctly and understanding who Melbourne Girls Grammar is, or who the University of Melbourne is, relative to other institutions and other schools. So I think universities, like schools, expect you to be prepared when you come and have those conversations: to have done your research, to have read the strategic plans, to have looked at what the strategy for that institution is, and to look at who the key stakeholders are and what they’ve been saying. Then come in really well informed around where their pain points are and what their issues are.

Proctoring has been a big one over the last 12 months, globally, as a piece for assessment and how proctoring mirrors invigilated in-place assessment tasks. From a pedagogical standpoint, that has an enormous number of drawbacks in terms of what it means, and the philosophy, and the principles, and the types of assessment, and the types of outcomes that can be encouraged through it. If you don’t understand that, and you go into an institution pitching that thing without knowing that it doesn’t align with them very well, and their philosophy is to move away from those types of assessments, then you can get lost in that piece.

You need to have done your research. You need to understand who you’re talking to and I think, more and more, it’s harder to get people like Lauren’s time, and other key stakeholders at institutions. You’re expected to be very well prepared when you come in for those conversations, and anything that’s publicly available you should have read, and been informed about, and be able to then offer a solution, which is having understood their pain points. Then working together to solve something that might actually provide a lot more benefit.

Lyn Hay:

That was a valuable insight there too, in really doing your research. Given that there’s a breadth and a lot of diversity within the K-12 and higher ed sectors, trying to target some individuals probably at an institution to get a sense of what’s happening behind the scenes, because often the public-facing information on a website may not give you the answers to the questions that you might have.

How can you uncover students’ and teachers’ needs within a school? And what are some of those pain points as well? Lauren would you like to provide some advice for some of our audience on that?

Lauren Sayer

I think in terms of professional networks that are around, it’s really important for EdTech businesses to know that schools are not necessarily the brave innovators everybody thinks they are. We like to jump onto things together sometimes and, looking at professional learning networks in Victoria, we have Vine, which is the Victorian IT network. There’s Mighty, which is an Australian IT managers’ network as well. But then there’s professional learning associations. I also think events like this where there are educators that are here, but really being able to offer opportunities for listening.

I came into this role at the start of this year, and I moved from one EdTech product to another, and I was absolutely blown away that one of the EdTechs invited me into their offices for the day and listened. We’re now an existing customer of theirs, and I’m one of the biggest advocates of how they work, because they sat down and said, “so tell us what’s happening with you at the moment and how that’s going?”, and it wasn’t about a product pipeline. It was really about getting to know who that person is. One of the things that I would be aware of in finding user stories is: speak to more than one person. Personnel change in schools, and they go from school to school. The schooling ecosystems are a very small environment. We go and we all talk to each other, and just because your product isn’t in one context right now doesn’t mean that teacher or leader won’t move to that next context the next day. I think that’s very important.

Actually, just spend the time to listen and not necessarily offer answers. One of the things James did was just sit and take notes, and ask questions, and he didn’t prescribe, saying, “Well, I think this is the way you should do things.” That really strong development of “what is the user story and context for you as a school” is a great way in. How to do that as an EdTech is to make sure you’re involved in the community and you’re part of those discussions. If you can get into small networked discussions, that’s where, as Herk said earlier, getting one-on-one time is an absolute privilege and very hard to get. Our diaries are incredibly full, but going to networking events where you can sit and have a discussion is incredibly important. It’s worth being aware that it’s these sorts of small networking events that matter. As a decision maker, I’m moving towards those, and I steer away from large conferences with a thousand EdTechs in a trade hall. I’ll be honest, as a decision maker I could think of nothing worse than attending that for two days and then all of a sudden getting 35 follow-up phone calls that I have to deal with. That’s not how I want to work as a decision maker.

So a bit of a pro tip in that regard is that I think there’s more benefit in showing up to the small discussions and sitting next to a decision maker than there is standing in a trade show in the hope that a decision maker will walk up to you.

Lyn Hay:

Herk, would you like to provide us with a little bit of insight about how you have worked at trying to uncover the sorts of needs of students and teachers within the higher ed context?

Herk Kailis:

I’ll mirror Lauren’s comment there on going to trade halls with a thousand EdTechs. We haven’t done that yet. It’s good to know we don’t need to do that. For universities, they’re so big, there’s so many people in there that they often do publish an inordinate amount of information online. So you can do a lot of research on who you’re talking to and understanding what they’re doing well at and what they’re not doing well. Then they publish that as well everywhere, so those numbers are all public information as to what their students’ experiences are, what their student success rates are, and how they get funded, and where their shortfalls are.

So there’s a lot of information you can unpack and understand, and often in institutions, especially in times like this, they’re either looking for ways to save money or make money. So it’s really that justification they need to make around a product. In some respects there is the pedagogy around it. There is the fit for the institution. And on the other side of that piece, a critical component, but in times where you’re struggling to get budget and you’re fighting for budget, you need to make an alignment around those pain points for an institution on how they make decisions and what the funding impacts of those decisions are, especially with the Australian context, and international students impacting the bottom line of Australian institutions, and redundancies happening.

The funding situation is really important, and understanding what that is, is freely available information to go and get. The pedagogical side of things, and what fits into the institution, can also be worked out in many respects by segmenting the market, reading what their strategies are, and then finding out about the key people that are involved. I can’t echo Lauren’s points enough here: the higher education sector is the same as K-12. People move around. It’s very much an ecosystem of 43 big universities who all talk to each other and who all change positions pretty regularly in one way or another. So you really need to make sure that whoever you get in with to start with, you do a wonderful job with, because of the referenceability of the market, and burning customers early is a really dangerous approach to take. Even more so, trying to learn on the customer is a very dangerous thing to do.

So if you’re trying to pull apart pain points and trying to launch fast and implement your product in a higher education setting, it can backfire if not done correctly because you can get really bad press. You can have shocking results. They’ll tell everyone in the groups that they frequent, and then that can keep you out for a couple of years.

So how you do that pain point discovery, and how you actually build your product, and your understanding of the pedagogy, and the pain points, and those different bits and pieces needs to be quite strategic. It’s not a consumer market where you can just launch, make mistakes, and iterate necessarily with big universities who have long memories of things that don’t work well, especially with EdTechs and technologies that promise a lot and sometimes don’t deliver what they’ve promised. So it needs to be done quite strategically. One way to do it is to partner with an institution or a number of institutions and ask them to put up with you for a few years as you bring this product to market. So we took a very long time to get our product to market because we wanted to make sure the student and the teacher’s experience are exceptional before opening the doors to other institutions.

There’s a theory in EdTechs, and in tech companies in general, which is “build a product your users love and then scale it.” Don’t do the two things simultaneously because you won’t build user love and you’ll churn out of everywhere. I don’t think that’s any more true than in this market where, if you build something no one loves, you struggle along for a while until you die.

Lyn Hay:

You’re really seeking out a champion, or a couple of champions, who see the need. They’re going to be able to push it through the different levels of decision making that are happening in a large organization like a university. It’s because they want to be actually seeing it implemented with their academics. Rather than speaking to the IT department about it, it’s about partnering with those who are looking at implementing it and integrating it into the curriculum.

Herk Kailis:

I think sometimes people get confused about who they need to go talk to in an institution, and it’s almost always not IT. It’s who’s got the most to gain out of this solution, who can look like a champion if they implement it, who wants to get a lead in a particular area. So often for us in teaching and learning, we look and it might be a deputy vice chancellor who can see themselves with the product or with a company to be able to get a large advantage in an area. If they’re the first mover they’ll put up with some pain to be able to deal with that. So identifying who those people are is often the first challenge. You can kind of knock on all the doors and find out who responds to the emails. That’s one way of doing it, because whoever responds and whoever’s willing to put up with some pain is usually someone that’s looking for an edge or an advantage.

There are a number of those universities in Australia, but they’re in every market, and there are a number of those schools as well. They will help co-create your product. They’ll put up with the pain of the product not being fantastic because they’re in it for a journey in which they expect to see some really big improvement that your product can actually achieve. They share the vision with you, not necessarily the product of today. If you don’t quite get those customers right, you’ll lose the ones that don’t want to put up with the pain today, and you’ll not quite get that to click as nicely. But that would be the gold standard: finding a reference group of customers prepared to put up with pain and to give product feedback to you, so that you can develop a product they can get a lot of benefit out of, and get an advantage by being first with.

Lyn Hay:

In a higher ed setting you might have a sub-dean of learning and teaching, or a sub-dean of learning technology innovation. They are the people bringing the technology and the pedagogy together. They’re using technology to solve pedagogical problems, and so often they’re the kinds of people that have a very pragmatic approach. They’d definitely be looking for those solutions.

Herk Kailis:

I’d also add that it’s almost the people higher up the chain you want to get involved as well. I think Deputy Vice Chancellors. BBCAS. The DVC Academic and even the PVC Academics are the people you want to get involved. They have more sway in terms of navigating an institution and pushing initiatives through. They have a budget they can rely on, and they have strategic project buckets as well that they can use to generate funding. Without that help in navigating an institution, EdTechs can get lost in universities. There are too many groups, too many stakeholders, and too many dead-end conversations. You need to make sure you get alignment with someone who can cut through the noise early on, to save you the time of figuring out that piece, and who can help you do the important stuff of getting your product to market.

Lyn Hay:

With each EdTech company having a different specialized product and schools having a fixed curriculum / objective to follow, where do you see the bridging happening?

Lauren Sayer:

I think this is a timely question. I counted up, because this last week I had 17 people offering me business propositions on LinkedIn to come into my school to fix a problem that I didn’t have. I’ve given up responding to them. If the first thing you’re going to do is approach me with a pitch, I don’t have time for that. The bridging needs to be listening to the schools and the networks. James Leckie put in the chat before that this is where people are talking. We’re social beings. Teachers don’t get into education because they don’t love to talk. We stand in front of classrooms and work with kids. We love to talk, and we love to broadcast.

We will tell you, via multiple networks, what the problems are. That’s when you provide a solution, not when you provide me a solution to a problem I don’t have. If I get another LinkedIn pitch, it’s almost to the point that I’ll remove the connection altogether, because they haven’t listened. But if I ask a question on Twitter about how something works, or whether anyone has solved it, and someone sends me a message and goes, “Well, I’m working on this, and I’m solving that,” then I’ll have a chat, because you’ve listened to my need. Then there’s a match, and it’s not just Twitter; it’s LinkedIn conversations or things like that. We will broadcast our problems and ask for those solutions. But I’m not sitting there looking for solutions to problems I don’t have. It’s again about having those conversations at a really ground level.

Then there’s different areas that this is emerging. One in a K-12 space is every Sunday night on Twitter there’s an Aussie Ed chat, and each week there’s a theme. The new one where lots of people are, and I see a real merge in teachers and EdTechs is via audio chat and Clubhouse. So I’m seeing more and more Clubhouse discussions of EdTechs and teachers talking together in that really low threat discussion forum.

I would say a new place to go and talk to teachers is Clubhouse because lots of teachers are on there, and lots of EdTech, and they’re going in there to have conversations together. It’s incredibly purposeful. It’s really these purposeful conversations of small groups that are incredibly important. But don’t try and solve a problem that we don’t have.

Lyn Hay:

I’d like us to move on to explore some of the quantitative and qualitative metrics that your organisations use to measure the success and effectiveness of EdTech.

Lauren Sayer:

I’ll start with what our metrics are not, and then I’ll move to what our metrics are. Our metrics are not viewership or enrolments. I think Herk might talk more on that; we had a conversation the other day around this. “You’re using this, you’ve got this many hits,” etc. I don’t care how many hits there are. I want to know if it is improving life for our students, number one. If it improves their life and ease of use for our teachers, number two. I think it’s a bonus if it saves money. It’s essential if it saves or streamlines time. But we’re a learning organization, so number one for an EdTech specifically is whether it’s going to improve learning productivity and teaching productivity, whether that be in time or in results. Then the other one is ease-of-use for teachers. I’ve worked in tech integration for a really long time. Everybody always expects there to be a bit of a hump. Someone might be able to tell me a product that has, but I don’t think any technology, when you first adopt it, has made a teacher’s life easier straight off the bat. There’s always this frustration gap, and I think one of the metrics in that space is ease-of-use and familiarity with other systems.

If it works like another system a teacher already uses, that’s a key metric. But really what we’re looking for is, “is it improving learning productivity?” or, “is it improving teaching productivity?” Those two key areas are what success looks like for us.

Herk Kailis:

I think Lauren’s spot on. The thing to add is that there are a lot of things you can get knocked back on by the university, or the school. There’s a base level of stuff you need to have which, if you don’t have it, makes you a red flag right away. If you don’t have single sign-on, if you don’t have privacy policies, if you don’t have compliance, if you can’t pass technical due diligence, sometimes if you don’t have SLAs, or if you don’t integrate into all the tools they use, then it’s just too hard. When you’re getting evaluated at a pilot stage or a trial stage, it’s often a very small group, and they extrapolate out the results. Even if you only have 10 flags that come through as support requests or problems with the tool, they’re going to extrapolate that out 100 times, or whatever the scale beyond the pilot will be.

There are a number of things you need to get right just to come to the party, and often that makes for a very high barrier to entry, especially in universities where, depending on where your tool fits into the ecosystem, you need to make sure you talk to all the other tools in the stack. So if it’s assessment, you integrate into the LMS. There are four LMSs; you need to make sure you integrate into all of them. Everyone uses TurnItIn, so you need to integrate into TurnItIn. Is the experience of working with TurnItIn and the LMS through your tool harder or easier? That speaks to Lauren’s point: if it’s harder, then they’re not going to use it, because it’s more work for everyone. It’s going to be too difficult. So you actually have to fit into all the things that they already use and need, and on top of that, improve the experience and decrease the cost. If it costs more to do it, then the benefit has to be much, much larger. To get large-scale numbers of people to change, it needs to be really easy. Think of the terms and conditions on Apple: it’s got to be that easy to agree and that easy to shift to. Otherwise people are going to struggle to take up your product, even if the benefits are there. People will go, “Yeah, but it’s too hard to log in,” or, “Hey, but I have to send the grades through an export function,” or, “Hey, I need three people to help me run it in my tools.”

So I think there are a number of hurdles you need to overcome to get to the party. Then it really comes down to what they’re measuring. How does it impact the users that are actually implementing it? Does it solve their problems? Often it’s just a conversation with the students or the teachers to say, “Hey, we implemented this because you identified these pain points. Did this solve it for you, qualitatively?” And if everyone comes back going, “Hey, this was a huge success. I’d love to use it again. I don’t have the numbers, but I felt like it really worked well, and everyone says the same thing,” then more often than not they’re going to take those opinions over even what the data says. Maybe there’s a lag, or maybe there are other factors at play, but they tend to trust people in the organization to give them the right information, so the data is less of a consideration early on. The quantitative piece becomes more of a consideration later on, when there are greater justifications needed for rollouts and pricing.

Lyn Hay:

Herk, you’ve covered a number of red flags and no’s when it comes to evaluating a product. Lauren, do you have any more to add to that list?

Lauren Sayer:

Understanding the technology and the human ecosystem within the school is essential. If you’re not integrating with it, then I’m not interested. No school in K-12 is interested in another standalone product that doesn’t sit within the ecosystem. Data housed offshore is a no-go in schools, so it’s now one of the key questions I ask: where is your data? I’m at the point with a couple of projects that are in-flight at the moment where I’ve stopped them because the data’s not in Australia. They will not restart until our data is in Australia. Lack of privacy is another. One of the interesting ones is data portability: what happens if I no longer want your product? How do I get my data in and out? How does that work? As I look at EdTechs, this is going to become more and more important.

If I go back six years, which isn’t that long ago, making the decision to move from LMS 1 to LMS 2 was not really an issue. There was nothing worth keeping in it. People weren’t using it because paper was more efficient. Now I look at previous schools and how they work and go, “oh wow, that product!” All of a sudden it’s hard to get out of those things; you can’t move to the next great product. So data portability goes both ways. As an EdTech, if you’ve got a really great product, then having the integration that lets people move their data from another product into yours is really important. And to keep people like me feeling safe in that space, being able to get our data out later is incredibly important. In terms of the product itself, it helps to understand that an EdTech offering is sometimes software and sometimes a service, and all of this stays true either way. Not all EdTechs are trying to sell me a piece of software; some are trying to work with us as a service, and we look at the same things. What are the privacy policies? Where are our data and our conversations kept? How do you sit within our ecosystem, and with our philosophy? But really, with those red flags of integration, data, privacy, and single sign-on — you’re not even getting off the landing spot these days without them.

Herk Kailis:

They’re the ones that you have to have to get through the door. The other ones that are coming about are important, but they’re not necessarily going to stop you from getting in, at least in higher education. Data portability: everyone wants to know about it, but if you don’t have the other things, then they just won’t even look at you, or you won’t even get a conversation. The data-in-Australia one — you can’t work with an Australian university if you don’t have that. I think more and more schools are also looking at data portability, and not just in terms of being able to move to another product. We’re moving from reports where you used to get your pdf, have it, and email it, to having micro-credentials and some sort of portfolio.

Lauren Sayer:

Our students want to take their data with them, and I think one of the interesting things is this new concept of “generation alpha”, where they are very protective of their data. Our students are incredibly data-aware, and I think people would be surprised at how many 15, 16, 17-year-olds I know in my school of girls who are actively not on any social media. They will not let their parents upload their videos. They will ask me, in my role, where the learning management system data is going and how it is stored.

I am not just getting those questions from governance and risk anymore. I’m now getting those questions from students, and that is really important. I think the next generation of students coming through are incredibly data-aware, and they do not want their data being everywhere. I think that’s important.

Herk Kailis:

And it’s what’s happening globally too. Once you get outside Australia, you have to tick off GDPR compliance anyway, so it’s not just in this context; it’s a global context. That’s the way everything’s going, so if you can’t give people their data, delete their data, and do those things, then it becomes harder and harder to win new customers and satisfy your existing ones.

Lyn Hay:

If someone was fortunate enough to be having conversations with a school or a university, and they were on the cusp of actually locking in some kind of pilot program, could we get a sense from both of you, from both of your contexts and perspectives: what are the key things that you would need to agree on before starting a pilot program? Herk, would you like to start, based on your experience in higher ed?

Herk Kailis:

It depends on what sort of pilot you implement and how you set up that relationship. One of the things to line up is expectations. Depending on where your product is up to and what outcomes you can achieve at that point in time, get some alignment with key stakeholders. If your product is early-stage, you don’t want them thinking they’re testing a final product. You’d rather align on “we’re testing the product for these capabilities, and if it achieves those, then we move to the next checkpoint.” So that alignment piece early on is really important.

Then also figure out what success looks like up front: they’re doing this because they think it can be a success, so if it is successful, how does it impact or change what they do as an institution going forward? That way you can effectively plan as an organization for what might happen if it works well. If it works badly, then you drop out anyway; there’s no value in planning for that, because the result is always that you fall out. But if it goes well, what does success look like, and how can you start lining up some of those next steps concurrently? Institutions and universities can be slow-moving places, so if you don’t pick up momentum, engage with the right stakeholders, and validate with the right committees and the right groups, then you might stall, and a stall at a university is usually a full semester, which is three or four months. You won’t get the uptake, and then you may miss out on budget. It might be a yearly cycle, or you may not get the same budget again.

I think the two things that are probably important to understand are: who you’re talking to, and what they actually hope to see as a success. Then, if it is successful, what the next steps will look like, so that you agree on those up front as opposed to having to come back and try to have those conversations at the back end.

Lyn Hay:

Lauren, can I hand over to you and get you to just provide a perspective from the school context? What would a pilot look like in your school context?

Lauren Sayer:

I think what Herk said around setting expectations, sitting around the table, and looking at what success will look like is incredibly important. So is understanding the K-12 environment. Lots of people will offer a pilot project for a term, without understanding that a term goes incredibly fast in a school. If I look at this week, we’ve got school athletics, and we’ve got another four events. Basically it’s a nine-week term. So to get a pilot off the ground in a term, or to offer a school a free term and then ask them to evaluate and buy a product, is not a great value proposition in the K-12 market. If we’re looking to pilot, then we plan things out year-on-year, working on a long-term pilot where we can have the time, because a term is far too fast for schools, even an 11 or 12-week term. By the time you put all the events in, teaching time is incredibly limited, and unless you’re in a school, I don’t think you can really appreciate how many events sit in schools. By the time you’ve got your guest speakers, your school sports, open house, competitive weekend sports, and all these different things that sit in schools, and events, and Mother’s Day…

So time and expectations are important. I would also have to ask, “what are the costs going to be when I’m expected to pay?” It’s great to have a free pilot, but there needs to be an indication of what the costs are, and whether they’re staggered. Because if I’m doing it free for the first year, and you’re going to be developing your product with us, then I’m going to ask for a discount, because we will be troubleshooting and beta testing. What is that agreement, and how does it look? That’s incredibly important. And how can it be unwound if the pilot’s unsuccessful?

Back to that red flag I had: what happens to our data after that year? If I’ve got 1,100 students across Melbourne Girls Grammar, and I’m putting all their learning data into something for a year, then I’m going to want to know, if it doesn’t work, how I get it back out, what you do with it, and whether you’re using it to develop your product. Back to those red flags. I have examples of really good pilot experiences, and I can give you an example of a really bad one if you’d like.

Lyn Hay:

Give us a really good example of a pilot and how that worked. Then if you’d like to give us the flip side of that, that’d be really valuable as well.

Lauren Sayer:

I’m not going to name companies, because they’ll end up hating me forever, and I don’t want to name and shame. But a really good one is a curriculum content company who sat with me in a previous role and said, “we’ve heard that you’re looking for a specific type of curriculum content and we’d like to work on and build that with you.” We met with that department, and the teachers all said, “well, we would like to be able to do that.” There were partnerships with other organizations too — they were partnering with other governmental organizations — and we said this is something we want to get into. There was a clear agreement on what our contribution to the product was, and therefore what we were getting out of it and what the company was getting out of it. Then over a period of a year we collaborated, presented together at conferences as a partnership, and we went on to purchase that product at a discounted rate in year two and year three. And we continued to be advocates for it.

I think one of the important things when a pilot goes really well, and you become a customer, is that it’s really wonderful to keep loving that customer. Even this morning I was invited to a business breakfast, which I couldn’t make, but I appreciated the invite. It was for all foundation customers that had signed up and been part of that process, year in, year out. So this is a company we’ve been foundation customers with for a long time. We’ve had a long-term relationship with them, but they don’t forget about us. I think that’s incredibly important.

On the flip side, where it doesn’t work is where you agree to things and there’s no exit strategy. You’ve got a discount, but by the time you’re looking to get out — and the fault here sits with people in my role, not necessarily the company — you’ve put all your data in, you’ve put in a big effort getting everyone to make their content and put it into product x, and product x hasn’t delivered at all… I’m now in too deep in that scenario to be able to get out easily. What you then have is a disgruntled person, and I don’t think you want disgruntled customers out in the marketplace. You don’t want that. When a school asks, “How is product x?” you don’t want them going, “oh, we’re in it but we can’t get out, and I’m not happy, and I’m not going to use it no matter what.”

Back to that small ecosystem: we all talk, and just because we’re using a product doesn’t mean we love it. We all ask each other. So in pilots, if a pilot doesn’t turn out to be a win-win, then being able to get out is incredibly important. You don’t want a disgruntled customer out in that marketplace.

Lyn Hay:

That was an education, itself, on pilot programs and partnering with people. Herk, would you like to provide some examples of pilot experiences that you’ve had, and any insights that will be useful to our audience?

Herk Kailis:

There are a few points to add to that. We found it valuable to work early on with people, at all levels of an institution, who are prepared to put up with some pain, or prepared to find a way the solution can work even if there are issues with it. It’s always easy to pull something apart, or to find a hole or a gap, and then make the whole story about that; it’s harder to find someone who’s capable of seeing the bigger picture, seeing how this could work, and then implementing it and achieving success. So identifying who those people are early on can make or break your pilot.

If you pick the wrong people to pilot your product in an institution, and you don’t have the right alignment between the academics or the teaching team, and they’re not prepared to do what it takes to move things forward as a team, then you’re going to get bad results. In some instances you need to learn what that looks like, who those people are, and what those experiences are, so you don’t have to make those mistakes again. Often it’s the people who are not really interested in piloting something. They’re more interested in “will this be perfect?”, and if it’s not perfect, then “I’m not going to ever use it again, and I’m not going to say good things about it.” So finding who that group is — whether it’s at the customer level, whether it’s segmenting the market, whether it’s the key stakeholders in the institution, whether it’s the teachers you go and find — makes an enormous difference.

We had an early client, a foundation client, where the Deputy Vice Chancellor showed the product to a group of people in the institution. We had a blue-screen-of-death situation where it just kept loading and nothing happened. She sort of smiled and laughed to the group she was presenting to, and said, “geez, I love this company. Aren’t they great?” It was a disaster. Everything was code red; there were alarms and fireballs going off at our place. But they’re the sort of people you need to find early on, who will put up with some of those issues and overcome them. Obviously you don’t want a situation where everything falls apart and goes wrong. But you do need to find people who are not going to say, “well, the login button didn’t work this one time, and I clicked it the second time and it worked, but I’m not going to use it again because it’s buggy.” So finding who those people are is really important.

And the other part is making sure you get that alignment piece right with your pilot customers around what it is you’re delivering. We had a situation early on where we pitched our product and thought it was definitely going to solve this problem around academic integrity. And it did. But that’s not what anyone wanted, and so we found we’d built a product that solved this issue for this customer, but no one at the user level actually wanted to use it because, firstly, the pedagogy was wrong, and then there were a whole lot of other issues associated with it. So with those pilot customers, it’s about trying to build a relationship. If you can set the expectations correctly, then you can get the feedback, and then you can go and iterate. When we iterated our product very early on, we didn’t lose those early customers. We lost some who weren’t right, but we were better off losing them anyway, because we were spending so much time trying to satisfy them. The other ones were just like, “okay, we’ll live through this pain. You’re working on this thing. You figured out the first thing wasn’t right. Great. We didn’t really love that anyway. But if you build this other thing which is going to be better, then we’re invested in that.”

So it’s about getting that alignment in really good pilot experiences. You always learn the most from your best customers and your best users, and you learn very little, or something negative, from your very bad customers and bad users. The good customers teach you what to do; the bad customers don’t even teach you what not to do. They just teach you all this other stuff you shouldn’t be doing anyway. You end up firefighting and wondering what the hell you’re doing. So find the good customers, listen to them, figure out your processes, fill in the gaps. It might be, “hey, we need data portability. Hey, we need a privacy policy. Hey, we need single sign-on. Hey, we need all this.” They’ll teach you what to do, and they’ll put up with it. The bad ones just won’t tell you anything. They’ll say, “Hey, it didn’t work,” and, “Hey, it’s a disaster,” and, “answer this support request,” and you won’t even learn enough to build the documentation or the integration. You go around in circles. So find the good ones, work with them, and try not to work with the more difficult ones until you’re more established. Crossing the Chasm is a really good book for this. It highlights how to segment the market, which customers to look for early on and what traits they have, how to actually work with those ones, and why they’ll work with you in the first place, as opposed to chasing the ones you might want who don’t want you, and figuring that relationship out.

Lyn Hay:

There’s a question here, in the context of pedagogy: I’m wondering if the panel can comment on the attractiveness, or otherwise, to teachers and schools of courseware / curriculum inclusions — that is, content to help teachers integrate a service into their teaching. Lauren, is that something you could respond to?

Lauren Sayer:

The attractiveness would depend on the school, the context, what the curriculum focus is for that school at the moment, and whether there is a need or a gap in what they’re doing. In terms of curriculum content or courseware, I would hesitate to say that all schools are looking for STEM, or that all schools are looking for Computer Science, because it’s very nuanced. It depends on the school’s strategic plan and their focus. It’s about doing the homework and understanding what that is. I’d be very wary of making broad generalizations about what schools or teachers need, because it is such a nuanced profession, and schools are very diverse in what they’re doing in the content space. Content and curriculum is very personalized; broader EdTech, like learning analytics or school management systems, has wider application, but in terms of content it’s very narrow.

I can see there’s a follow-on to that about just delivering content. Is there an appetite for that? If it matched the teaching and learning philosophy of the school, and a need, then yes. But is there a whole-school appetite for pre-prepped content? I’m not currently looking for that; I’m looking for tools that make curriculum content creation easier for my teachers.

Lyn Hay:

I’d like to thank Herk and Lauren. You’ve really delved deeply into this topic of pedagogy and piloting, and there’s been fabulous advice across both the K-12 and higher ed sectors. There were a lot of similarities in the advice you provided, but there were obviously nuanced differences as well. I hope the audience has found this conversation valuable; you’ll now have an opportunity, if you like, to explore this further in a networking session. So thank you, Herk and Lauren. It’s been lovely working with you this afternoon.