On Day 2 of the Victorian Global EdTech and Innovation Expo 2021, Sacha Nouwens and Marcus O’Donnell addressed how data analytics are being used to support and retain university students.
The morning session of the second day of the Victorian Global EdTech and Innovation Expo 2021 brought together three education experts to explore the use of data analytics in universities.
They discussed how analytics are currently being used to support student success, the impact of the pandemic, whether analytics have improved retention rates, and the future of analytics in higher education.
Belinda Howell, Strategic Advisor at LINC Education, moderated. She was joined by two panellists:
- Sacha Nouwens, Executive Director, Student Experience and Insights at Online Education Services (OES)
- Marcus O’Donnell, Director, Cloud Learning Futures at Deakin University
Below are the highlights of this session, followed by the full transcript.
How data analytics are being used to support student success
Sacha Nouwens explains that OES partners with universities to help them deliver their courses fully online or digitise parts of their learning experience. They use data analytics to:
- Assist universities in solving problems to increase student success and retention.
- Show educators the effects their course content changes have on students. Have the changes increased the amount of time students spend studying? Have they helped them achieve better results?
- Identify at-risk students and help universities understand the steps they need to take to support those students.
“We work with universities to help them make the best use of their data.”
Sacha Nouwens, Online Education Services (OES)
Marcus O’Donnell says they’ve been using data in similar ways at Deakin University. Their initiatives include:
- The Priority Students Program that brings together several different data sets to identify students who need personalised support ranging from phone calls to peer mentoring.
- The Teacher Dashboard Project that aims to collect data insights for every teacher to help them understand what’s happening with the students in their group.
The impact of the pandemic on the use of data analytics
According to Sacha Nouwens, OES’s way of operating didn’t change much because most of their programs were already online or blended. But COVID was a game changer because:
- Universities had to move online quickly, providing OES with an opportunity to assist them with analytics and digitising other elements of the student experience.
Marcus O’Donnell agrees that day-to-day operations didn’t change much at Deakin University because most of their courses were already online, but the pandemic did raise a major question around student engagement:
- How do we keep students engaged when there are things going on in both students’ and teachers’ lives that might mean education isn’t their number one priority?
“[The pandemic] brought a new sense of focus on what digital education really means.”
Marcus O’Donnell, Deakin University
The combination of data is key to student success
Both Sacha and Marcus agree that there isn’t one specific element of data that can be pinpointed as the most important in determining student success. It’s the combination of data or the interaction between data sets that really matters.
According to Sacha, the collection of data requires:
- A high-quality learning management system (LMS) to understand the interactions students are having in the systems.
- A great customer relationship management system (CRM) to know when students are reaching out to staff members and what they’re asking.
“[Within] a matter of weeks of observing their behaviour, we can quickly get a quite accurate prediction [of] which students are going to fly on their own and which ones need an extra helping hand. But we really find that it’s not one single data point, it’s the combination of all of it together.”
Sacha Nouwens, Online Education Services (OES)
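The signal combination Sacha describes, where no single data point decides the outcome, could be sketched roughly as follows. All field names, weights and thresholds here are illustrative assumptions, not OES's actual model:

```python
# Minimal sketch of combining LMS and CRM signals into an at-risk flag.
# Every field name, weight and threshold is hypothetical.

from dataclasses import dataclass

@dataclass
class StudentSignals:
    lms_hours_last_week: float   # time spent in the learning management system
    logins_last_week: int        # distinct LMS sessions
    support_tickets: int         # CRM: recent contacts with staff
    assignments_submitted: int   # assignments handed in so far
    assignments_due: int         # assignments due so far

def at_risk_score(s: StudentSignals) -> float:
    """Combine several weak signals into one score in [0, 1].

    It is the combination of LMS engagement, CRM contact and
    submission behaviour that matters, not any single data point.
    """
    score = 0.0
    if s.lms_hours_last_week < 1.0:
        score += 0.35  # barely present in the LMS
    if s.logins_last_week == 0:
        score += 0.25  # no sessions at all this week
    if s.support_tickets >= 3:
        score += 0.15  # repeatedly reaching out for help
    if s.assignments_due and s.assignments_submitted < s.assignments_due:
        score += 0.25  # falling behind on assessment
    return min(score, 1.0)

def needs_outreach(s: StudentSignals, threshold: float = 0.5) -> bool:
    """Flag students for a human check-in, not an automated message."""
    return at_risk_score(s) >= threshold
```

In practice the weights would come from a trained model rather than hand-set rules, but the shape is the same: many weak signals feeding one decision about whether a person should reach out.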
The invisibility of analytics to students
Sacha Nouwens explains that analytics are invisible to students and they generally wouldn’t realise they’re being used. Here’s how it works at OES:
- Their analytics identify when a student is struggling and a helping hand reaches out.
- They use a high-touch model that connects 25 students to one tutor for each class.
- A tool also serves up analytics to teachers, but OES doesn’t use automated responses because they want teachers to be able to overlay their knowledge. If they’ve already been in touch with a student, a message that’s not in sync with the previous conversation wouldn’t work.
“With our analytics, we’re really trying to focus on the group that’s not going to be successful without the extra help.”
Sacha Nouwens, Online Education Services (OES)
Marcus O’Donnell believes that automated systems can help, but teachers need to be able to modify the responses. At Deakin University:
- An intelligent agent can send automated messages to students when they do something unexpected, but their new Teacher Dashboard Project aims to provide teachers with a series of template responses they can adapt depending on their communication history with a given student.
- Their goal with the dashboard is to use analytics to make life easier for both the teachers and the students.
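The draft-then-adapt flow described above, where a trigger surfaces a template but a teacher always reviews and sends it, could be sketched like this. Template text, trigger names and fields are hypothetical:

```python
# Sketch of rule-triggered, teacher-mediated messaging: an automated
# trigger produces a draft, but nothing is sent without a human.
# All names and template wording are invented for illustration.

TEMPLATES = {
    "missed_assignment": "Hi {name}, I noticed {item} hasn't come in yet. Anything I can help with?",
    "no_login": "Hi {name}, we haven't seen you in the unit site this week. Is everything okay?",
}

def draft_message(trigger: str, name: str, item: str = "", history=None) -> dict:
    """Return a draft for the teacher to review and adapt, never auto-send."""
    draft = TEMPLATES[trigger].format(name=name, item=item)
    return {
        "draft": draft,
        # Flag prior contact so the teacher keeps the message in the
        # flow of the existing conversation with that student.
        "recent_contact": bool(history),
        "auto_send": False,  # a human always adapts and sends
    }

msg = draft_message("no_login", "Priya", history=["email, 12 May"])
```

The design choice mirrored here is the one both panellists stress: the analytics provide visibility and a starting point, while the teacher supplies the context the system cannot see.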
The challenges of developing analytics
According to Sacha, the adoption of analytics in universities comes with several challenges:
- Even though OES works closely with teaching staff to design systems and processes that work for them, what teachers think they need doesn’t always turn out to be helpful and they frequently have to start over.
- Building a bespoke system is a long iterative process and everyone has to accept that.
Marcus O’Donnell agrees and points out that:
- Analytics is a new area, so we have to be patient and give teachers time to understand how it can integrate into their work.
“Analytics is about individual personalised attention when you’re doing things at scale. That’s the new area.”
Marcus O’Donnell, Deakin University
How analytics have improved retention rates
Marcus O’Donnell says that while they’re still in the early stages of analytics at Deakin University, the foundations are in place and they have noted:
- Improved retention and other positive effects.
Sacha Nouwens explains that OES does slow rollouts of new technologies to be able to measure their impact. They’ve found that:
- Before OES used analytics in their interventions, retention rates improved by 2 percent for one of their partners.
- Now that they’re using analytics, the retention rates for that same partner are 5 percent better.
“When we got the data behind it, it [made] an even bigger difference.”
Sacha Nouwens, Online Education Services (OES)
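A staged rollout like the one Sacha describes is typically evaluated by comparing retention between a control cohort and an intervention cohort. A minimal sketch of such a comparison, using invented cohort counts rather than OES's figures:

```python
# Sketch of measuring a retention lift from a controlled rollout.
# The cohort sizes and retained counts below are invented.

from math import erf, sqrt

def retention_lift(retained_a: int, n_a: int, retained_b: int, n_b: int):
    """Compare retention between a control cohort (a) and an
    intervention cohort (b) with a two-proportion z-test.

    Returns (lift, p_value): the difference in retention rates and
    the two-sided p-value for that difference arising by chance.
    """
    p_a, p_b = retained_a / n_a, retained_b / n_b
    pooled = (retained_a + retained_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical example: 80% retention without analytics-driven
# interventions vs 85% with them, 1,000 students per cohort.
lift, p = retention_lift(retained_a=800, n_a=1000, retained_b=850, n_b=1000)
print(f"lift = {lift:.1%}, p = {p:.4f}")
```

With cohorts of this size, a five-point lift is comfortably significant; with much smaller cohorts the same lift could be noise, which is one reason slow, measured rollouts matter.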
The pros and cons of analytics that feed directly to the student
According to Marcus O’Donnell, giving students direct feedback about their performance has advantages and disadvantages:
- They get a really good idea of their self-regulating processes, motivations and engagement. This information could help them perform better in an online environment.
- At the same time, giving students the right feedback requires taking individual differences and psychometric data into account. We can’t give a student advice on what to do next without considering what type of learner they are and how they approach their goals.
“Matching the analytics we get and the feedback we give to students with specific learning designs… is really coming up. [It’s] the way that we can actually get better value out of the data.”
Marcus O’Donnell, Deakin University
Sacha Nouwens agrees that giving students direct feedback is tricky because:
- Students need to understand the data and the next steps they need to take.
- OES conducted an experiment early on where they told students how much time they spent in the LMS and how that compared to top-performing students. They ended up pulling back on it because it didn’t help the students who needed the most help and it didn’t tell students what they needed to change. OES is now researching next steps in this area.
The challenges of data analytics in universities
According to Sacha Nouwens, the main challenges are that:
- Universities have legacy systems.
- With all the different departments working independently, it’s extremely challenging for the central data analytics team to service the needs of the whole university.
Marcus O’Donnell believes that:
- Universities must educate staff to become analytically data proficient if all this data is to be useful.
- We have to accept it’s a long and slow process for universities to build their analytics and make the most of them.
Advice for universities about how to approach analytics
Sacha Nouwens says that:
- Universities are behind other sectors such as banks, social media companies and gambling companies when it comes to analytics.
- These companies invest a lot of money in analytics because it brings big returns and she would love to see universities make that same investment.
As for Marcus O’Donnell, he believes that:
- Students need to be brought along on the analytics journey.
“Data is an incredibly contested area because of a range of things that have been happening, so it’s really important to work directly with students, explain to them what we’re doing and get informed consent to move forward.”
Marcus O’Donnell, Deakin University
I would now like to invite the guest speakers for the plenary discussion of this morning's program. They'll discuss driving student success with analytics and personalized support. Now I'd like to introduce this session's moderator, Belinda Howell. Belinda has held multiple senior roles in international higher education over the last ten years and now actively supports EdTech companies. Following a successful corporate career, she led UTS College into a period of significant growth in its international student cohorts and expanded its in-market presence through Asia and the Middle East. Belinda coaches leaders in the education and nonprofit sectors and sits on the boards of several organizations. Welcome, Belinda.
Thanks very much, Sarah. I'm delighted to introduce the two panellists in this discussion. We'd like to welcome Sacha Nouwens, whose name has already been mentioned several times, from OES. Sacha is the Executive Director of Student Experience and Insights. Next, we'd like to introduce Associate Professor Marcus O'Donnell from Deakin University. Welcome, Marcus. Marcus has a wonderful title, Director of Cloud Learning Futures, which hopefully we can unpack a bit during this discussion. Thank you both for joining me. I should acknowledge that among the companies I'm working with, one named LINC Education operates squarely in this space of data analytics and personalized support, so we should have a good conversation together.
Let’s kick off by just outlining what we’re all exactly doing. Let’s introduce ourselves to everybody here by talking about what each of us is working on at present institutions or in other parts of our lives.
Could I start with Sasha and invite you to let us all know how you and OES are using data analytics to support student success. What are you working on, Sasha?
Thanks, Belinda. It's great to be here with you and Marcus this morning. As you just heard Claire share, OES partners with universities to help them either deliver their courses fully online or digitize parts of that experience, and analytics is critical to that. Analytics can help in many different ways. Universities come to us at different stages of sophistication, hunting for solutions to different specific problems, but very often they talk about their students' success and the desire to retain their students. We work with those universities and help make sure they're making the best use of their data. I wanted to share a couple of examples of the types of initiatives we do, to give you a bit of a flavour of analytics.
One type of thing we do is more about the everyday experience students have. Universities want to make sure students are experiencing things in the best possible way, and the key to that is the actual course content. As educators continuously change and improve their course content, we use analytics to show them what effect those changes have had. Have they produced the result intended? Have they increased the number of hours students spend studying? Have they helped students achieve better results? This is one way analytics can help in day-to-day education.
The other area where we step in with analytics is the odd times when things start to go wrong for students. The analytics we use help universities identify the students who are going to be at risk, understand why it is happening, and take the steps needed to support those students.
That’s great Sasha. That sounds so interesting, and we’ll delve into the details shortly. Marcus, could I get a snapshot from you on what’s going on at Deakin?
We have been doing very similar things at Deakin. For a long time, we have had a Priority Students Program, which is quite similar to the OES model that was flashed up on the screen earlier. It brings a whole lot of different data sets together to identify a group of students who might need some intervention and supports them in doing well. That range of data sets leads to a whole set of personalized support, from phone calls to peer mentoring and a lot of other facilities.
One of the other things we are working on at the moment is the Teacher Dashboard Project. In this, instead of the classic institutional approach, we try to bring together data insights for every individual teacher. This helps them clearly visualize the actual state of each student in their group.
Thanks a lot, Marcus. I'd like to add here that the company I am working with, LINC Education, does the same work as Sacha and OES; it is their sweet spot. It is all about personalized academic support, backed by the systems and analytics that enhance the experience. At this stage, they are working principally with online courses. It is something that started well before COVID, but it is now very topical.
This leads me to my next question: has your work been influenced by COVID, in the sense of a changed or increased focus following the COVID-induced move to online learning and teaching?
I feel that COVID has changed everything. For us, the biggest impact was that we previously used our analytics services internally, but the movement of universities online allowed us to reach out and help them with our analytics and with digitizing other elements of their students' experience. Apart from that, most of the programs that OES works with were already online or blended, so COVID didn't shift the way we taught or the way we did analytics.
What about you, Marcus?
Our experience is very similar: we already had an incredibly big online cohort, and nearly all of our courses were already online. So, as Sacha said, COVID changed everything but didn't change much of the day-to-day stuff we were doing.
I think COVID has brought a new sense of focus on what digital education means, because everyone is talking about it. The big question during COVID was how teachers who are not familiar with online education will keep students engaged. How do we keep students engaged at a time when so many things were going on in students' and teachers' lives that perhaps education wasn't their number one priority?
So, I am hearing that the goal has been around retaining students, giving them a better experience, particularly in the online arena, and helping universities understand their students' experience and how they are progressing.
Is there anything else that we should add in there?
I think one of the areas Deakin is not currently pursuing, but is interested in exploring, is the type of analytics that feeds directly to students. Most of the work in learning analytics has been focused at the institution level or the teacher level, but this is an area where the research is showing promising results. Feeding analytics directly to students gives them a really good idea of their self-regulating processes, motivation, and engagement. For now, this is certainly something we are very interested in experimenting with.
That’s a great segue Marcus into what I was going to ask about next which is; What is the important data? We talk about huge data, so what do both of you feel is the most important element of the data you collect that helps students in succeeding?
For us, it’s the combination of the data. We try to understand the interactions that students are having in the learning management system. Learning Management Systems is the key to that, but interaction with the staff is also critical. So, having a good CRM is important to us. This helps to know when students are reaching out or when they are talking to staff members and what they are asking for.
We have found that early on, when students first enrol with one of our partner universities, it is difficult to predict which ones are going to be successful. But within a matter of weeks of observing their behaviour, we can quickly get a quite accurate prediction of which students are going to fly on their own and which ones will need a little bit of an extra helping hand. We have found that it's not one single data point, but the combination of all of it together.
Marcus, anything from you?
I think that’s true. I too feel that the interaction between the different data sets is vital. Data is everywhere, now that we are using digital systems, we are capable of setting up things that produce data in a certain way. So, we like to carry out the student-focused work as I have talked about before. We can set up maybe psychometric testing or other things like that which can give us different kinds of data sets that can then create a framework for the usual data sets we have been working with.
Okay, so we are talking quite conceptually at the moment. I am wondering if you could both walk us through an experience from the student perspective, and also from the teacher or academic perspective, of what students are seeing?
Well, the great thing about analytics is that it is invisible; students often wouldn't realize that analytics has been used for them. For students, it's just potentially reaching a point in their studies, say completing a big assignment before the due date, where they are struggling, and then a helping hand reaches out to them thanks to our analytics. We try to connect these students with their tutor; we have a high-touch model under which twenty-five students work with one tutor in each class.
So, with analytics, we are trying to figure out when a student is having one of those moments where they need some reassurance or just a check-in, and then connect them with their teacher. We have built a tool that serves those analytics to teachers while letting them overlay their own knowledge. Teachers often have conversations with their students via email, so we choose not to use automated methods, where students might get a message that's not in sync with, or part of, a conversation they've already had. What they get is just their teacher reaching out at the time of need to provide the required help.
So, are students at present not conscious of what is going on?
I think to an extent we all know, and have plenty of examples of, companies using our data for different objectives. I'm sure people are aware that their data is being captured and used by universities, but we don't make it front and centre for students. It is just a timely intervention, or the way we communicate with students, like contacting them through the channels they prefer; we like to talk to students about things that are important to them. It is about providing the right information at the right time.
I think that is true. We have a system that comes bundled with our learning management system called Intelligent Agents. Through this, you can set up cues so that if a student didn't do something you were expecting, they receive a little message. But what we are trying to do in this new dashboard for teachers is exactly what Sacha just said: it's not relying on that automated message, although it still uses the same information. It provides teachers with the required visibility, but then the teacher chooses what to message their students and how. There will be a series of templates the teacher can use, so we are trying to make it easy for them. But it has to be in the flow of the conversation, so that they know what happened last time they interacted with that specific student and can modify the template accordingly.
I also think that, in a sense, this should be the bottom line of analytics: it should make life easier for both the students and the teachers. It is all about trying to make education serve what both the teacher and the students do.
I was reminded of our earlier conversation when we were preparing for this session, and Sacha said something I wrote down as a great quote: "Analytics is only good if it changes behaviour."
So, whose behaviour are you trying to change?
We have talked about both staff and students, and I think there are a lot of students who are going to be successful no matter what, even if things aren't great. COVID is a great example: some students were able to overcome the difficulties caused by the pandemic and adapt to them. But at the same time, there is a group of students who face more barriers along the way or need a bit more reassurance.
So, with our analytics, we are trying to focus on the group that's not going to be successful without the extra help. My background is in analytics, and the hardest part about it is still people; the adoption of analytics is still the challenge. We try to work very closely with our teaching staff to design a system that works for them. It took us years to get there. At first, we tried to start by asking, "What would you need?", and often people would tell us after the first pass that the analytics wasn't helping them.
So, it was an iterative process. I think it is very important to work hand in hand and acknowledge that it will take time. If you are building something bespoke, as we did, it will take time.
I think the iterative process is very important, and we had the same experience. Analytics is a relatively new area, and because of that teachers don't necessarily know what they want and what can help them. We have been on this journey for the last three years, and along the way we have gone through different versions of these products. So, it is about changing a culture and giving people a new understanding of how this can be integrated into their work. Our session is about personalized education, and education has always been personalized; that's simply what education is, that's what teachers are meant to do.
So, analytics is about individual personalized attention when you are doing things at scale. And since it is a new area, the question of change is important.
We have discussed changes for the students, the academic staff, and the teachers. But institutions are also an integral pillar of education.
So, what changes are going on in institutions, in your view? And how much change is possible?
Universities are in the process of becoming digital organizations, like every other business, and data is at the core of that. We have made a huge investment over the last four or five years in building big data warehouses. We have changed our learning management systems so that we can have better access to data. Now we have a data point for every single thing that happens, every time a student or a teacher takes an action in the system. That is the first step; next it is about where the data goes and how it is kept, including all sorts of privacy and policy issues.
So, data at every level of the institution is changing policy, behaviour and systems; it's changing everything.
What can we know about the outcomes so far? We are talking very holistically at the moment, and Marcus, I presume not every course at Deakin and every student is touched by your process at the moment. And Sacha, OES is working with different universities on different courses; LINC Education is doing the same, but with a selection of Australian and overseas universities, supporting particular courses.
So, what changes have we seen so far? What have been the outcomes according to you?
At the beginning, I said there are two levels. The Priority Students Program runs across the whole university; every single first-year student's data is analyzed in that system. So, we are trying to look across the whole university and ask who needs some extra support. We have only been experimenting with more boutique projects for the last few years, like the Teacher Dashboard Project, along with other iterative projects, just to find out what works, so in that sense we are at a much earlier stage.
So, we have a foundation that works across the university at some level, and that has improved retention along with other positive effects, but that is a very tiny fraction of what we could be doing.
We have been using analytics both to identify what things we can change and to measure those changes. We have done quite a slow but progressive roll-out whenever we introduce a new change. For example, when we introduce a new technology that lets students connect with other students, we set it up as an experimental or controlled trial, so we can measure its impact: did it affect retention, did it improve student satisfaction? We started in 2014, when we introduced this new technology and began our interventions for the first time. Before that, we had a great student service, seven days a week, public holidays, extended hours, but some students still needed help, especially those who were the most scared to ask for it.
So, when we introduced that intervention, we did see a change immediately: a two per cent improvement in retention, and back then our interventions weren't driven by analytics; we just called everyone we hadn't heard from in a while. Over the years we have increased the sophistication, done more propensity modelling, and become more targeted with our interventions. By employing analytics, we have seen much stronger retention rates for particular partners. So, you can still achieve success without analytics, but having the data behind it makes an even bigger difference.
These results are impressive.
Marcus, earlier you mentioned that you see a future in which analytics goes directly to the students. Can we explore this a little?
One of the issues in online education is that students have to self-regulate and be their own organizers. We know that students who are better at regulating their learning do better in an online environment. So, we can give students direct feedback about what they need to do next, what they might not be doing, how successful students in the past have acted, and so on.
But every student is an individual, so we make use of psychometric data that takes into account who that student is, how they might be approaching their studies, and what their goals are. Margot mentioned in her presentation the move to short courses and the move of career changers into education. There are different types of education and different types of learner, so it's all about how we take account of different people in different situations and give them the right kind of feedback.
I think it’s the key. Marcus can help students understand the data and along with the feedback, they have in the next step. One of the experiments we ran quite early on was we did surface data straight to students about the time they spend in LMS. So, we share with them the amount of time they are devoting to it, and what compared it to average students who were taking the top grades. But we later decided to pull back on that for two reasons:
- Some students were helped by it, but the students we most wanted to help were not resonating with the service.
- It also didn’t link to outcomes.
So, we pulled back from that pure delivery of analytics. Currently, we are researching our next step and where we take that.
That points to the question of what people are actually doing in the LMS. They might just have their browser open, so they might appear to be spending ages in there, but we don't know what they are doing most of the time. This is why such individualized event data is really important. Another area I take to be important is matching the analytics we get, and the feedback we give to students, with a specific learning design.
So, we should know the structure of the learning journey in quite a detailed way, so that we can give students advice based not just on what they are doing but on what they should be doing next. That, I think, is a precursor for a lot of institutions; OES is brilliant at it. We have also been doing it for a long while, but structured learning design is the way we can get better value out of the data.
It’s a really exciting picture that both of you are painting, and the whole area is just mind-boggling in the speed of progress. Some little red flags are coming up in my mind at the moment and Marcus you have mentioned data privacy and you are hearing about psychometric tests, and various other things.
So, I am wondering what the challenges are? There are also resources, I presume this is hugely resource-intensive, both from the analytics point of view and intervention from the academic staff perspective. So, now I would love to hear from you what are the challenges that stand at present in this whole move towards analyzing students and academic performance.
From my experience working with private and government-sector universities, I can tell you that they have legacy systems. They are big, and different universities and different departments have different structures. They work independently, and maybe there's a central data analytics team, but there is absolutely no way it can serve the needs of a whole university.
So, what needs to happen?
That’s a million-dollar question. I think it is easy to get bogged down in analytics, and the people who have the skill to work at it aren’t always the right people to be asking the question of deciding what analytics to do. So, I think that governance is really important, staying focused on the question that universities are trying to answer, what is important to them is vital. If you are making decisions especially if you are at a senior level and you are not using evidence and data then driving students to success will be very tough for you. So, this is one of the places I’d like to start.
I think educating our staff is very important. This is not something universities can simply skip over. Becoming analytically and data proficient is a new skill; we have talked about it in terms of our graduate attributes. We want our students to graduate with these new data literacies, and, as Sacha said, universities are often dealing with legacy systems and legacy behaviours. We are big organizations, and we do have the resources to help this grow. I remember a vice-chancellor once saying in a talk, "When I look at what I have for all of my technology systems and then I look at what the Commonwealth Bank spends on their app, it's about the same."
So, we are good at the slow, long game; we are dealing with things that do take big resources. A lot of universities have been putting a lot of effort into analytics, building it up slowly over the last five or ten years.
One of the things I have noticed just from talking to senior people at universities is that analytics has not really been the main game for many people at the higher echelons of universities. It is a very difficult and complex subject to get your head around, and many of them certainly sense that they have been thrown into the deep end of this pool. With COVID, it has come to the attention of the most senior executives in universities, and it is very hard for them to unpick all of the players in the market: to understand the differences, what services are being offered, what those mean for their universities, and how best to make their judgement.
We all need a massive education program around what all of this means.
I would like to let the audience know that there are only five minutes left in the session, so if anyone has any questions, please ask them in the Q&A section now. First question:
What metrics are you using, and what types of data points are you collecting to align with those metrics?
The two main metrics are the ones we have been talking about throughout this session on retention:
- Are we keeping students in courses?
- Are students scoring well?
So, completion and passing are our two big metrics.
I’d just like to add one more: we track progression. It can be just another way of saying retention, but sometimes universities will look at retention within a course and might not have the data or the ability to look at every student’s performance in that course. So progression is another key metric for us in such cases.
In all of these analytics, how much is still human intervention, and how much is driven by AI and machine learning?
In my team at OES, I have ten analysts and only one data scientist working in the AI area. So we have ten humans contacting students, alongside 400 teaching staff. For us, then, AI is just a small part of it, and it is often where universities start: they build predictive models using the data, but that is just the first tiny building block, and it’s the human side around it that’s a lot more difficult.
Since universities like ours are highly regulated, we have been using data forever; we have to provide extraordinary levels of reporting to various levels of government. So data has always been there and has always been reported in one way or another. I can see a question in the chat about real-time data, and I think that’s the difference: it’s not retrospective like reporting data; it’s having access to data in the moment, where it can make a difference.
I think we are coming close to the end of the session. I’d like to finish with a question for each of you.
If you were to advise the vice-chancellors of all the universities in Australia, what would you say to them about approaching this whole analytics area?
I would just say that universities are behind other sectors. Banks, social media companies and gambling companies are investing lots of money into analytics because it brings a return, and I would love to see universities make that same investment. I think it is exciting because universities are doing it for good reasons: they want to help people achieve their career goals.
I would advise universities to work with their students, because I think it’s really important to bring students along on this analytics journey. There have been times when universities tried to introduce new systems and new data points and faced a lot of pushback from students. Data is an incredibly contested area at the moment because of a range of things that have been happening. So I think it is really important to work with students, explain to them what data analytics involves, and get their informed consent to move forward.
So, a combination of going faster but also going slower, collaboratively. Go faster in your decision-making and your appetite for these new approaches, but take it more slowly when it comes to consultation with your students and staff. As Sacha said, it’s the people side where it all starts and finishes, and that is where the opportunity lies.
We have reached our time limit, so I would like to thank you both for this conversation. I’ve really enjoyed it and learned a lot, and I’m sure our audience has too. Marcus and Sacha, thank you very much for participating this morning.