
In this episode, Phil Hill, Jeanette Wiseman, and Kevin Kelly discuss the latest CHLOE (Changing Landscape of Online Education) survey report put out by Quality Matters and Eduventures Research, based on responses from more than 300 online leaders at US colleges and universities.
- CHLOE 5 Survey Report: The Pivot to Remote Teaching in Spring 2020 and Its Impact
- Tyton Partners / Every Learner Everywhere Survey Report: Time For Class
Hosts:
- Phil Hill
- Jeanette Wiseman
- Kevin Kelly
Transcription:
Phil: Hello, welcome to COVID Transitions. I’m Phil Hill, and I’m here with Kevin Kelly and Jeanette Wiseman. We’re talking about the CHLOE report, the fifth one, which comes out on Monday, when we publish this podcast. That’s the Changing Landscape of Online Education (CHLOE) survey put out by Eduventures and Quality Matters. There’s a lot of interesting insight in this survey that I thought was worth us discussing.
In particular, to me, this survey provides the most usable context of any, except for maybe the Tyton Partners – Every Learner Everywhere report; that’s the other one I would put in this category where there’s useful context behind the survey that could be actionable by schools. What I mean by that is they don’t just go with the simple narrative of everybody’s going to Zoom, students don’t like online, and so on. They actually [00:01:00] provide a lot of context: what percentage of faculty and students have never had an online course in the past, and how does that affect this?
What are the multiple tools that are being used? Almost all the data is broken out by two-year public community colleges versus four-year and private institutions. To me, it just has some of the most useful context of any of the surveys out there, along with the Tyton Partners report; I would put that in the same category. I think it’d be useful for us to discuss this. I guess I’ve already shared my lead: this has context, this is a very usable survey, and we’ve got a ton of surveys out there. So we’re starting to get to the point of, well, what can you do with these? Let me get your initial reactions. How usable is it, and what was your impression of the overall survey report, Jeanette?
Jeanette: Well, first, I think we need to point out who was [00:02:00] surveyed, because I think that makes an important distinction for this one. It’s a little bit different from the student or faculty surveys we’ve been getting. From what I’m reading, they mostly surveyed chief online officers for this report. This time they only had three hundred and eight respondents, not as many as they usually get, although they did break it down across the different types of institutions that you listed. I will admit I’m getting a little bit of survey fatigue, and I shouldn’t be, since you two are really doing more of the reporting on that, especially you, Kevin. I thought it was really interesting. I also think it’s important to read the report, because there’s some insight there that’s not just in the numbers; they call out some really important distinctions, which I think some people aren’t doing. I agree with you that it’s an important one. I think there is a lot of insight. I think what’s between the lines [00:03:00] in this report is sometimes even more important than the actual numbers.
Phil: So basically, I failed to provide the context of who was even surveyed in this, but thank you. Kevin, what was your impression?
Kevin: Well, if Jeanette hadn’t brought it up, I would have, because I thought that was a key factor, especially when you’re looking at charts about faculty and students’ feelings about online teaching and learning. It’s the chief online officers’ impression of how faculty and students are feeling. Let’s just call this second-hand information for some of these responses. With that in mind, I still think you have a good point, Phil, that it’s got some good background information. These officers probably do have access to the statistics in the student information system and elsewhere about who hadn’t taken courses online before and who had taught online courses before COVID hit. There are some interesting data, and I’m really happy that they paid attention to the plight of adjuncts [00:04:00] and other groups that typically don’t get this much attention.
Phil: So one way to look at this, then, is that we need to have a little bit of caution reading this report where it gets into purely impressions, particularly second-hand ones. Where there are implications about what students or faculty thought, we need to be aware that this is second hand and not as quantifiable, whereas other things, such as which tools are you using and how many courses did you convert, should have much higher reliability.
Jeanette: Right. That’s why I mentioned that reading the report, not just looking at the data, was important. When I read “how successful was your transition to online for your students?” I’m like, well, why are we asking these people? That’s my first impression. When you read the actual report, they recognize that that’s a bias they’re carrying. They also acknowledge [00:05:00] the subjective nature of this question. What they’re saying is: we did put these courses online, we did get through the spring, and students were able to complete their classes. That was the level of success for them. It wasn’t how much did they enjoy being online, it wasn’t how much did they learn online; it was that they completed their course online.
Kevin: Was it possible for them to complete their course? They didn’t actually talk about how many students completed the courses. I think they made a reference to what they called DFWI, the drop, fail, withdrawal, incomplete rates. I know for a fact, as an online teacher myself, that even as hard as I tried in a class that started the semester online, I still had around 15 to 20 percent of my students not finish. I’m guessing that the measure of success for chief online officers was: did they provide [00:06:00] the opportunity for students to complete their courses?
Phil: I hope that when the media coverage comes out around this report, and we need to look to ourselves and what we write on the blog as well, but I’m a little bit worried that there might be a lead of, oh, online officers think everything was peachy keen. Then you compare it to a survey of students and they’re saying we weren’t happy. In the report, they’re very explicit: their measure of a successful transition, as you’ve mentioned, Jeanette, was, quote, did they have the opportunity? So I hope that doesn’t get buried in the reporting as well.
Kevin: Well, I think there was one through line. If you look at student surveys, the faculty surveys like Every Learner Everywhere, and even some of the institution-specific surveys like University of Pittsburgh or George Washington University or even Penn State, they all came away with the feeling that we need to improve student [00:07:00] engagement in these online courses. The chief online officers felt that way. The 4,000 faculty from 1,500 institutions in the Every Learner Everywhere survey felt that way. It goes right down the line. Everybody agrees on that.
Phil: What this report provides, and I thought the wording was a little bit confusing, but I hope that people get it, is a comparison of online versus remote, which we’ve talked about quite a bit. The point being, remote is what happened this spring, where half of the teachers were moving from face to face and had never even taught online before. They’re calling that remote. Then they compare that to online courses that had already been designed and released as online courses at their institution. I think that was a very useful way to break down the data, as long as people understand what they’re saying. That captures your point right there [00:08:00] about the engagement. This report really shows the difference in engagement.
Figure 8 compares student engagement in online study, which means what was happening before the pandemic, versus remote study in spring 2020. In this case, across the board, across all institution types, the online officers say engagement was far superior in the online courses that had supposedly been thoughtfully designed ahead of time. It’s not just that engagement is a common thread; this report even says, let’s compare these two different situations and see what the difference is. Now, there is an impression here. It’s judgmental, if you will, not hard data. It is interesting to see that from the perspective of this audience.
Kevin: Well, from the writers themselves, my favorite line in the whole report is: “So it should be no surprise that little attention [00:09:00] to faculty-student and student-to-student interaction characterized many remote courses.” We had this statistic in previous CHLOE reports. Basically they’re acknowledging that faculty members in classroom situations often haven’t been trained how to teach. They’re not going to pay attention to the thing that people who are trained to teach online are at least made aware of: the importance of student engagement. I think that’s going to be the strongest takeaway. This is happy, optimistic Kevin talking here: the number of instructors that have been exposed to pedagogical training as a result of COVID-19 is possibly going to change higher ed teaching and learning forever.
Phil: Yes, it’s a good point. Jeanette, are you optimistic or pessimistic about this topic, about how [00:10:00] usable this is and how it points to the future?
Jeanette: How usable the survey is? I’m not sure. I think there were components to it that I felt needed to be pulled out specifically, around the things you said were more quantifiable, like the use of video conferencing. Seeing that Zoom is the most used video conferencing tool, which I think is somewhat fascinating, given that there are tools out there that have been created for education but are not being used. I thought that was interesting. I also thought the use, or lack of use, of textbook and online courses was interesting, especially since in most cases those are instructionally designed courses with content, and universities weren’t necessarily looking at them to use online. Those things were really interesting to me. I do hope that there is now some more teacher training [00:11:00] and professor training in how to conduct classes, both in person and online, which hasn’t happened for a lot of disciplines. We don’t do that in higher education. I think it’s something that’s been needed for a long time, and I think this hopefully will push a lot of instructors to look for that. I do want to make the point that, outliers aside, I think everyone wants to teach as well as they can. I think that push to online created a lot of self-reflection, which is also a tenet of educational planning: that you reflect on your teaching. Maybe that’s the best thing that’s come out of this, that there’s been some self-reflection on instruction and pedagogy that maybe wasn’t happening.
Phil: If we’re getting into self-reflection and moving forward, how do you two take figure 10 and beyond: campus [00:12:00] faculty attitudes towards online learning after the pivot to remote teaching? This goes against a lot of the narrative that you see in coverage. The popular narrative is that now that so many teachers have seen remote teaching, they’re even more against online in general. Figure 10 shows, at least in this second-hand interpretation, that faculty are very positive or somewhat positive about their improving attitude towards online learning after this. They have a better understanding of what’s involved and what’s possible. This gets back to the point of how much weight we put on this type of finding. It implies that this will improve online learning moving forward and that more faculty are aware and have a more positive attitude about what can happen. How much weight do you put on this [00:13:00] type of finding, particularly in figure 10, and then figure 11, which gets into the student attitude?
Jeanette: These are CIOs, and in a lot of cases these are executive-level people at your university. These aren’t the faculty.
Phil: Well, I’d say they’re executive level, but I don’t think, unless I missed that, they’re primarily CIOs. It’s more online officers. Quite often that will be more on the academic technology side or an executive dean of online.
Kevin: For a community college, they may not have an academic technology unit. Especially if you look at the statistics, they said the average number of FTE for instructional designers was three. If you looked at community colleges, the average was one, and we know from the California Community College system that 60 percent of them don’t [00:14:00] have any full-time staff for instructional design. With your question about figures 10 and 11, we can take it as somebody on a campus in a leadership role probably having a decent understanding of how people feel. I don’t think it’s as representative as the faculty and student surveys themselves.
Jeanette: I agree. That’s a question being answered by someone who wasn’t doing the teaching or doing the learning. I think they maybe put a more positive spin on it. It’s a very subjective question, and we don’t necessarily know the feelings behind it, nor were they the people experiencing it. I think it’s nice that they think that, but I don’t know if that’s really reflective.
Phil: Yeah, I’m more Goldilocks. I’m not saying we should take the opinion at full value here. From what I’ve seen at a lot of schools, it gives you more of an impression; the online officers, being on campus, [00:15:00] are hearing what people are saying. I think there’s some weight that we give to this. I wouldn’t reject it, but I would read it with a grain of salt and be careful about not over-interpreting it. This might be a little bit boring, but it is quantifiable: I think we should go back to a point that you were making, Jeanette. We actually have some better data now on what tools were used, and it’s consistent with the Every Learner Everywhere / Tyton Partners report and some others. While video conferencing, such as Zoom and Blackboard Collaborate, increased the most during the spring of 2020, the LMS, as described in this report, is the workhorse. It’s essentially ubiquitous; it is more used than any other system, even for remote teaching. That doesn’t [00:16:00] mean it’s in-depth usage or valuable usage, but we need to keep in mind that this counteracts a narrative that faculty just threw things on Zoom and there was no organization or structure to the course. It does imply that there was heavier usage of the LMS. How meaningful is that data?
Kevin: If you look at the Every Learner Everywhere information on that same question, where they break out existing users and new users in their table on page 19: for the learning management system, 78 percent of the population were existing users, whereas only 21 percent were using video conferencing before the transition to remote learning. Then if you look at that graph, for video conferencing 49 percent were new users added to that 21 percent, bringing it up to 70 percent total. That was a huge spike. Almost 50 percent of the teaching faculty began [00:17:00] using video conferencing as a result of COVID. Only 9 percent began using the learning management system as a result of COVID, but the totals reach almost 90 percent for the learning management system. It’s that new user total that really leapt out at me from that Every Learner Everywhere report, certainly.
Phil: Now, one thing in this data also helps confirm it. When you get to the fourth most used system, remote proctoring is another product category that’s become more and more important, and it shows a big increase. You’re reading a lot about schools trying to say, if we’re doing things remote, we have to have some way of knowing who the students are, some sort of academic integrity measure, and remote proctoring truly is increasing in importance across higher education. That comes with a cost, and it comes with real questions about student [00:18:00] privacy and even pedagogical usage: should you be doing tests that require proctoring? The data certainly confirms how important it has become in the usage of systems.
Kevin: I’m not quite sure, with 308 chief online officers predominantly overrepresented in private non-profits, if that’s representative of the whole country, because you just don’t know that every campus has the funding to support something like ProctorU or Proctorio or some other proctoring tool.
Phil: I’ve seen other studies saying there definitely is an increase, but yes, it has to vary between the different sectors and sizes of institutions. It certainly is becoming an increasingly common or important factor within higher education, and it’s got a lot of questions behind it.
Let’s jump ahead. There’s an interesting section looking [00:19:00] at online program managers (OPMs) and what their roles are. For example, figure 16 asks how schools managed the transition. 90 percent of chief online officers said they managed the transition entirely in-house, 8 percent said they did it mostly in-house with some OPM help, and 0.3 percent said mostly OPM, but some in-house. It certainly doesn’t appear that OPMs were a massive factor in the transition that we did in the spring. In fact, as schools look at what’s important, the number one factor schools are citing is that they’ve got to build up their internal capacity. That’s more important than expanding their agreements with OPM providers. I have to be careful about drawing too [00:20:00] much from it, but it’s interesting to see data about how schools perceive OPM players and their role in this type of transition that we’re going through. Jeanette, did any of these findings on the OPMs surprise you?
Jeanette: I don’t know if it surprised me. What I see is a real need from institutions, and maybe a role that OPMs or new vendors can step into, assisting with the creation of courses, not necessarily programs, at the undergraduate level, including content creation and instructional design and working alongside faculty. That’s typically a role that OPMs play. Right now there’s more of a need to try to build that out internally within these institutions, and to do that, OPMs or other similar vendors can step in.
One of the sponsors of the report is iDesign. That’s something you might see iDesign do. [00:21:00]
Phil: You do have to mention, though, that in figure 17, for schools that already have an OPM contract, almost one in four said they do plan on expanding their agreement with their OPM partner moving forward. It’s not that they’re a nonfactor, but it’s definitely a clear message that internal capacity is an even bigger factor, a much bigger factor, at these schools, even if they already have an OPM partner.
Jeanette: The other thing to add to that is what we’ve discussed in other podcasts: what we’re really seeing is the financial impact that the pandemic is having on everybody. It’s certainly hitting these institutions, in higher ed and in K-12, and those revenues are really needed right now by these institutions. OPMs take a lot of that revenue. I think there’s likely some hesitation from institutions to go forward with an OPM that’s going to be taking a majority of your profits.
Phil: Sure. The one thing I would caution [00:22:00] there: where the OPMs have been taking a majority has typically been master’s programs with full revenue-share agreements. I think a lot of the expansion we’re talking about is across undergraduate programs, and I don’t believe they’re looking at revenue shares of 50-plus percent there. I think it’s more of a fee-for-service model, where you’re getting your OPM partner to help with students they hadn’t been helping before. I agree, though; I think there’s a headwind, in that budgetary constraints have to be part of saying we’ve got to figure out how to do this ourselves.
Jeanette: Exactly.
Kevin: Well, also consider that figure 17 covers just the 10 percent or fewer of schools from figure 16 that had an OPM contract before the pandemic began. 25 percent of 10 percent is 2 and a half percent.
Phil: There’s certainly not an overwhelming story of OPMs, that this [00:23:00] is the time they move into undergraduate education and become fully established as a key part of broad-based undergraduate offerings. My interpretation of what we’re talking about right now is that that’s not happening. There are isolated cases where they’re getting involved and helping schools out, but statistically, it’s still a very small number of cases.
This has been a good discussion, and we’ll likely have a blog post out as well looking at some of these findings. Jeanette, you mentioned that you’re getting fatigued, and I guess I’m sort of in the same place. Maybe not fatigue, but I feel like I’m getting bombarded: every day I’m seeing a new survey, and it’s getting pretty hard to keep track of them all. That’s part of my impression: you have to spend effort to get value out of the surveys, because so many of them are coming out right now. [00:24:00] Any final thoughts that you two have on this or the general state of surveys?
Jeanette: My final thought, I guess, is that it’s going to be interesting. I think the things that are most valuable right now in these surveys are the quantifiable pieces of data. We are seeing these surveys over and over again, and they pretty much say the same thing: spring was heroic in how quickly everybody transitioned to online, but it was not ideal for either learning or teaching. Because there is so much more variety in how schools are going to go online, in person, hybrid, or HyFlex, I think the fall surveys, or the fall data coming out, are going to be more interesting, because it’s not going to be so homogeneous in how people went online. That is where we probably will find real variations in how education is being taught, [00:25:00] how education is being delivered, and how we are going to move forward. I think that’s where we’ll start seeing trends.
Phil: Kevin, any final thoughts you had on this or surveys in general at this stage?
Kevin: Yeah, I would say we still are not paying as much attention to students. I know these were faculty surveys, but if you look at CHLOE, over 50 percent of those institutions are going to require faculty development and only 35 percent are going to require student orientation. If everybody is acknowledging that student engagement needs to increase, we need to help students figure out how to do it. Also, just one small thing from Penn State’s survey that made me think of the NFL Buccaneers’ head coach, who said he thinks every player on the team is going to get COVID at some point during the season. Two thirds of the employees self-assessed that their COVID risk was moderate to high, but only 14 percent reported being unwilling to return to face-to-face teaching under any circumstances. It’s almost like they’re acknowledging that they’re at high risk. They need the workspaces [00:26:00] too: 74 percent believe the workspaces need some changes to support social distancing. They’re all basically accepting that they’re going to go back to some face-to-face interaction. I know that’s part of the challenge these faculty surveys bring compared with what we’re hearing in the news. I thought it was interesting that Every Learner Everywhere brought up how some of the faculty findings in their survey were very different from what’s being reported in the news. I think as we move closer to the fall, as Jeanette said, we’ll be getting data where people are a little bit further away from the urgency and emergency of the situation, and we’ll have more about what campuses are actually doing. I’m surprised that we’re halfway through July and we still have campuses that haven’t figured that out yet. I’m in an armchair quarterback position; I have the luxury of not having to make that decision myself.
Phil: If I combine those two [00:27:00] references, the Tampa Bay Buccaneers and you being an armchair quarterback, you are the Tom Brady of EdTech, so I’ll buy that. As long as we’re going dark, I do want to add one more point. This is a depressing statement, but it confirms what we’re seeing: little attention was paid to accessibility. That’s another area, particularly for students, that I think is a ticking time bomb. In the spring, the heroic efforts were remarkable and the transition went surprisingly smoothly, but I don’t think schools have come around to understanding how important accessibility is and the risk they face as an institution by not paying attention to it. The Department of Education and the Department of Justice are going to start cracking down on the accessibility front again, and students deserve to have more attention paid to this. As long as we’re going dark, I’d like to throw that one in there as well, moving forward.
It’s [00:28:00] been a great conversation. Expect this to come out on the same day the CHLOE 5 survey is released to the public. It’s great talking to you, Kevin and Jeanette.