Tech in EdTech
Tech in EdTech improves the dialogue between education leaders and the innovators shaping edtech. This is your go-to show for actionable ideas and solutions that make digital learning not just possible, but effective, practical, and inclusive.
From LMS Signals to Student Success: Learning Analytics, Data Governance & AI Ethics
Thomas Cavanagh, Vice Provost for Digital Learning at the University of Central Florida, breaks down how analytics should drive retention, graduation rates, and student success. He explains why adoption metrics miss the point and how LMS data can act as an early warning system for timely student support.
This episode offers practical insight on using data and AI responsibly while keeping decisions student-first.
00:02.39
David Brunner
Thank you so much for joining us today. Today is another episode of the Tech in EdTech podcast, and our guest is Tom Cavanagh, Vice Provost for Digital Learning at the University of Central Florida. My name is David Brunner, Relationship Manager here at Magic EdTech. So, Tom, to start us off, could you tell us a little bit about yourself and what got you into education?
00:31.64
Thomas Cavanagh
Yeah, sure. And thanks for having me. I'm at, as you mentioned, the University of Central Florida. I've been here for about 17 years now, which is crazy to think. Spent three years before that at Embry-Riddle Aeronautical University and was kind of in the industry before that. So, I kind of came sideways into higher ed through a series of steps. I worked in the corporate sector, mostly in corporate training and e-learning for Fortune 500 companies, web-based training, and that sort of thing. But, you know, after a while, I sort of felt like I wanted to have maybe a little bit of a greater impact as opposed to just teaching pharmaceutical sales reps how to get past the receptionist and get to the doctor and sell their drugs. And that led me into higher ed, where I feel like we’re just… I feel like we're having more meaningful impact than you can sometimes in the corporate sector. And yeah, been here ever since.
01:36.61
David Brunner
That's fantastic. Absolutely. And I think whenever we talk about success in the professional world, you know, you have to have a passion to be successful. And obviously, talking about what we do in higher ed later on, yeah, it's easy to stay motivated with the impact we can have on so many different people.
01:59.13
Thomas Cavanagh
Yeah, it's true.
02:01.21
David Brunner
So, getting into a little bit of what's going on and what needles might need to get moved in higher ed right now. My first question is, you know, what's the most important structural shift universities must make so analytics inform academic and operational decisions more directly?
02:28.25
Thomas Cavanagh
Yeah, I mean, it's a great question. And I think a lot of it would depend on where the institution is. So, for some schools who maybe have pretty advanced and sophisticated infrastructure, they may already have a lot of data, but they're just not doing anything with it as far as impacting the students. Whereas another school might not have the collected data to begin with. So I do think that it's important first and foremost to make sure that you've got a culture that's collecting data, that is trying to find those stories, the information that you need, so that you can make those strategic decisions. And then you need to have the next step, which is do something with that data. You can't just collect it. You have to be able to turn that into action of some sort, whether that's an intervention for a student who's at risk or identifying, like we've done some things here. I'll just give a quick example where we've looked at toxic course combinations. So taken individually, course X and course Y students do fine. But in a semester where they take them together, it kind of tanks their GPA. Well, that's only revealed by collecting the data and looking at it and then taking that next step of, okay, so what do we do about it? So in advising, we advise students don’t take these two courses at the same time. So I think you kind of have to have both. And it's not just a structural shift, but it's a cultural shift. You kind of have to have a culture that is curious about the data, and their decision-making is informed by it.
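The toxic-course-combination analysis Cavanagh describes can be sketched in a few lines of Python. The records, field layout, minimum-sample threshold, and course names below are illustrative assumptions, not UCF's actual data or pipeline; a real analysis would pull enrollments from the SIS or data warehouse and apply proper statistical controls.

```python
from collections import defaultdict
from itertools import combinations
from statistics import mean

# Illustrative records: (student, term, courses taken that term, term GPA).
records = [
    ("s1", "F24", {"CHEM1", "CALC1"}, 2.1),
    ("s2", "F24", {"CHEM1", "CALC1"}, 2.3),
    ("s3", "F24", {"CHEM1", "HIST1"}, 3.4),
    ("s4", "F24", {"CALC1", "HIST1"}, 3.2),
    ("s5", "F24", {"CHEM1", "HIST1"}, 3.5),
    ("s6", "F24", {"CALC1", "ENGL1"}, 3.3),
]

def pair_gpa_gap(records, min_n=2):
    """For each course pair, compare mean term GPA of students who took
    the pair together against students who took either course alone."""
    together = defaultdict(list)
    separate = defaultdict(list)
    for _, _, courses, gpa in records:
        for pair in combinations(sorted(courses), 2):
            together[pair].append(gpa)
    for _, _, courses, gpa in records:
        for pair in together:
            # Took exactly one course of the pair this term.
            if len(courses & set(pair)) == 1:
                separate[pair].append(gpa)
    gaps = {}
    for pair, gpas in together.items():
        if len(gpas) >= min_n and separate[pair]:
            # Positive gap: students did worse when taking the pair together.
            gaps[pair] = round(mean(separate[pair]) - mean(gpas), 2)
    return gaps

flags = pair_gpa_gap(records)
```

In this toy data, CHEM1 and CALC1 taken together show a large positive gap, exactly the kind of signal advisors could act on by steering students away from that combination in a single term.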
04:22.55
David Brunner
Okay. So, there at UCF, if you were to be asked, who should collect data? What would your answer be to that?
04:37.44
Thomas Cavanagh
Well, we all have our data, right? We all have our slices of the pie. Here, we do have a Chief Data Officer who sits on the cabinet, and she is kind of responsible. She's in charge of our institutional research office. And as such, she's sort of the owner of the data and of the authority, kind of the authoritative source of information. But, for example, I'm responsible for the learning management system and all the data that's contained therein. Well, that's enormously useful for other things at the university. And we work really closely with the Chief Data Officer and her team to make sure that the data from the LMS gets leveraged as best we can across the university to help inform decision-making and student success. So for us, there is a person, but there's also, you know, data governance councils and advisory groups, and sometimes you work with your vendors. At the end of the day, though, it can't just be in a bunch of silos. There does have to be some centralized coordination. Otherwise, you're going to end up with redundant data or gaps or data that's not being leveraged. So I think it is important to have somebody with a bird's-eye view of the whole thing. But that doesn't mean that person necessarily has control over all the buckets. They just need to make sure that the buckets are all talking to each other, if I can mix metaphors.
06:15.99
David Brunner
Now, there you go. The buckets are talking, and they're full as well, right?
06:19.67
Thomas Cavanagh
Yeah, that's right. Yeah.
06:22.46
David Brunner
You mentioned your title. You mentioned the different title there at UCF. But not all campuses have a chief online or a digital learning officer. We talked a little bit about data, but what other problems does having this role solve, and how should it connect tech and institutional strategy and other departments?
06:48.63
Thomas Cavanagh
Yeah, so it's not my title, but my role is the chief online learning officer or chief digital learning officer. And I work really closely with my colleagues, such as the Chief Data Officer, to make sure that we're sharing data, we're sharing it in the right format, we're reporting it accurately, and that we're trying to make sure these data get used in a way that's going to make a difference. I'll just give one example. Here in Florida, we are a performance-funded state. And so we have all these metrics that we are judged against from the state capital and the university system. There are actually a couple of different categories of those; I won't go into all the details. But we have performance metrics, we have preeminence metrics, and we track them all. And real dollars come from the state according to how you perform against those metrics. It's really hard to have an institution-wide initiative to, say, improve your four-year graduation rate unless you have a coordinated data strategy, because there are so many moving parts to that. There are so many ways you can try to impact it, from advising to pedagogy to faculty practice to admissions to other policies, like regular graduation policies around filing your intent to graduate. Or identifying students who are close to graduating and doing outreach to them, looking at the various reasons why they may or may not plan on graduating and what you can do to overcome that and encourage them to finish. Maybe you pay for their last class so that they take it and finish in time to count in this year's cohort, because it's a worthwhile investment compared to the return we're going to get from the state for reaching the performance-funding thresholds.
So I think you really do need this overarching role that allows you to use data strategically across the university. You listed a couple of different things, you know, between tech and teaching practice and strategy. But for us, in a lot of ways, it does boil down to what the institution is measured on: what's important to us, your graduation rate, student retention. These are all good things we should be doing anyway. But there's a certain urgency put on them because of the dollars at stake coming from the state to fund us, and that makes the data that much more important.
09:49.21
David Brunner
Absolutely. Kind of following up on that, you know, not every number tells the learning story. Leaders have to choose which leading indicators travel from a single course all the way up to the cabinet. If you could standardize a few leading indicators for digital learning success, what would they be? And which popular metrics would you retire?
10:18.72
Thomas Cavanagh
Which would I retire? Well, here's one; I'll start with the second half of your question. And it's more of a vendor suggestion than a criticism: touting adoption numbers isn't the most meaningful thing to be bragging about. Sure, adoption is important, but to me, that is a means to an end. The really important metrics are student success, impact, engagement, retention, speed to graduation, grade improvement within a course, all of those kinds of things. If we're adopting some solution, those are the things I ask for, and we'll measure ourselves against them when we do a pilot or other kinds of projects to see if this thing is worth continued investment or expansion. So touting adoption numbers only tells part of the story, and not the most important part. That's one I might retire. But as far as other, more meaningful leading indicators, I think the data that we in my division are collectively responsible for includes some of the most salient leading indicators in the institution, because they're coming directly from the learning management system. That means I know what grade David got on his test this morning. And in theory, we could be very proactive in reaching out if you didn't do well, to make sure you don't get off track. So maybe you need some tutoring. Maybe you need to go to the writing center. Maybe you need some supplemental instruction on study habits and good strategies for taking notes, whatever it might be. We can diagnose and then treat some of these problems in a very proactive, timely way, because we've got all of these almost real-time data indicators from the learning management system. That's not to say others aren't valuable. I know we look at things like, have you registered for next semester? That's an indicator of whether or not you will be retained.
If you haven't registered for next semester, you're at risk of not retaining. If you haven't filed your intent to graduate when you're eligible, you're at risk of not graduating within the four years or whatever it might be. But I think the ones that are the sharpest point of the stick are those in the learning management system because it's what's happening today in the course. It's our earliest opportunity to step in and do something to help.
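The early-warning idea described above, combining LMS signals like early grades and engagement with indicators like next-term registration, can be sketched as a simple rules pass over student snapshots. The field names, thresholds, and outreach suggestions here are hypothetical placeholders, not an actual LMS export schema.

```python
# Hypothetical weekly snapshot per student, joining LMS and registrar data.
students = [
    {"id": "a1", "early_avg": 91, "logins_last_7d": 5, "next_term_registered": True},
    {"id": "a2", "early_avg": 58, "logins_last_7d": 1, "next_term_registered": True},
    {"id": "a3", "early_avg": 84, "logins_last_7d": 0, "next_term_registered": False},
]

def flag_for_outreach(s, grade_floor=65, login_floor=2):
    """Return the reasons a student should get proactive outreach this week."""
    reasons = []
    if s["early_avg"] < grade_floor:
        reasons.append("low early grades -> tutoring / writing center")
    if s["logins_last_7d"] < login_floor:
        reasons.append("low LMS engagement -> advisor check-in")
    if not s["next_term_registered"]:
        reasons.append("not registered next term -> retention risk")
    return reasons

# Only students with at least one flag get routed to outreach.
alerts = {s["id"]: r for s in students if (r := flag_for_outreach(s))}
```

The point of the sketch is timing: because the grade and login signals come from this week's course activity, the outreach can happen while the problem is still fixable, rather than after final grades post.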
13:16.09
David Brunner
That's fantastic. And yeah, absolutely, we're able to do what you're saying because of the technology that's available now versus in the past, with real-time data as the LMS has evolved and things of that nature. Kind of leading into the next segment on technology and AI: if anybody's been to an education conference in the last two or three years, the main things people are asking are, how are you using AI? Where's AI at? What's the future of AI in education? So the question I have for you to lead this off is, you know, AI can amplify faculty and advisors' capacity as long as those humans stay in the loop. Where do you think AI is ready to support teaching and operations this coming year?
14:13.56
Thomas Cavanagh
Yeah, it's a really good question. And again, I think it is one that does vary. It's a little idiosyncratic based upon where an institution is as far as their AI strategy, and that seems to be a spectrum. But in general, I would say that we're a little better at teaching students how to think about and use AI ethically, because in many careers, especially white-collar careers, they're going to be expected to know how to effectively use AI as a workforce tool when they graduate. It would be irresponsible of us not to prepare them for that. So I think we're pretty good about that, as well as sharing the ethics: what is an ethical use of AI and what isn't, you can't pass it off as your own work, and you've got to check the citations because it still hallucinates. And I'm not even talking about some of the environmental ethics, that's a whole separate question, but just the usage ethics of it to solve whatever problem you're trying to solve. We're pretty good with that. And I think there are a lot of efforts underway right now to integrate AI skills into curricula, especially gen ed curricula, but also within some of the disciplines, which I think is coming next. Where we're maybe still struggling a little bit is in using AI to build and deliver courses, because not every platform, like a learning management platform or other course-building platforms, has really elegantly baked it into its process. I think we're getting there, and I think we'll be there soon. But right now, I see faculty just sort of doing that separate from the course-building process. They might do it in, I don't know, ChatGPT, Claude, or some other LLM: "Hey, outline a course for me." Great. But there will come a time when it's baked much more seamlessly into the platform where the course lives.
And then likewise, the course delivery side of things, where I don't think we're quite ready, nor am I sure we ever should be ready, to have AI-delivered exams where humans are just abdicating that to the machine. Even in multiple choice, there's a certain amount of human oversight needed, especially in the writing of the questions, where you still need a human in the center overseeing that. Because right now, with an LLM, you could have it write a test from a rubric, deliver that test, and grade that test with only a minimal touch from a human, and I think the human needs to insert themselves a little bit more into that assessment process. It's only fair to the students.
17:37.62
David Brunner
Absolutely. Kind of staying in the AI stream, shifting a little bit though, you know what would you say are some principles you believe should govern AI use in learning analytics? That way, we can avoid that precision bias or maybe overconfidence when models are wrong.
18:00.60
Thomas Cavanagh
Yeah. I mean, the risk of AI is that it tells you what you want to hear. I've had that experience so many times when I've put in a question, it's like, wow, that's the best question ever. You're brilliant. You know, like, no, it's not, but thank you. You know?
18:12.92
David Brunner
Yeah. I think
18:13.40
Thomas Cavanagh
Um, and I think it can do that with these, like learning analytics models. You're absolutely right. And you have to be careful, because it'll fit the data to whatever it thinks you want to hear if you're not really careful about it. So, maybe a couple of principles. One is obviously you have to abide by the ethical use of the data. So you can't put FERPA-protected data into an open LLM. That would just be actually a legal violation. You'd be breaking the law. So you have to just make sure you follow good ethical practice and the use of data. But then, from just a day-to-day logistical standpoint, I'll go back to what I said about the assessments. I think a human still needs to be in the middle of this whole process, checking, validating, and then, you know, running tests of these models against actual use cases and seeing what happens. Something that we've done over the years and various… like we built an analytics model with one of our statistics professors, and he tested it against previous semester's data to see if it worked, because we know what the outcomes were in the semester data. You can look at the whole thing. So you test, it was… this happened to be a predictive model to see if we could determine with any kind of accuracy how well a student would do based upon some early indicators in the beginning of the term.
And so you can run that against a previous semester's data against the first three weeks or something and see, did this predict this outcome or that outcome, and then look to see what actually happened because we know. So, that's a really good way to kind of test it and validate the models. But at the end of the day, not relying or over-relying on AI, and I'm a big proponent of it. I work at a big school. We have 70,000 students. Like, technology is sort of baked into everything we do because you can't support scale otherwise. But that doesn't mean that we abdicate all of it to the machines because it's too important. These are students' lives we're talking about, and we have to make sure that we get it right.
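The backtesting approach Cavanagh describes, fitting on a past semester where the outcomes are already known and then validating against another semester the model never saw, can be sketched as below. The single-feature threshold rule and the numbers are illustrative stand-ins for a real predictive model; the actual UCF model built with a statistics professor would use richer features and proper statistical methods.

```python
from statistics import mean

# (week-3 course average, did the student pass?) for two past semesters
# where final outcomes are already known. Values are made up.
last_spring = [
    (92, True), (88, True), (75, True), (71, True),
    (62, False), (55, False), (68, False), (80, True),
]
last_fall = [
    (90, True), (64, False), (73, True), (58, False), (67, True),
]

def fit_threshold(history):
    """Pick the cutoff on the early average that best separates pass/fail
    on the training semester."""
    best_cut, best_acc = 0, 0.0
    for cut in range(50, 100):
        acc = mean((avg >= cut) == passed for avg, passed in history)
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut

def backtest(cut, holdout):
    """Accuracy of the fitted rule on a semester it never saw."""
    return mean((avg >= cut) == passed for avg, passed in holdout)

cut = fit_threshold(last_spring)   # fit on one known semester
acc = backtest(cut, last_fall)     # validate on another known semester
```

The design point is the split: because both semesters' outcomes are known, the holdout accuracy tells you whether the early indicators genuinely predict the outcome, before you ever trust the model to drive interventions on current students.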
20:35.29
David Brunner
No, absolutely. You know, you hinted at ethics a little bit with AI. And the next question I have is, if we'd like to set a clear ethical floor that product teams and campuses can implement with growing data exhaust, what's your stance on data minimization for students and for faculty? Because we are at that point where we begin the conversation, "Data is important. Data is important." And we collect a lot of data. There is this idea of data exhaust happening. And we've talked about how you can format questions in AI to get the answers you want. So I'm just curious about your stance on data minimization and consent, for students and faculty.
21:31.38
Thomas Cavanagh
Yeah, it's a really good question. And I get it because, man, we're collecting a lot, and it actually creates some exposure for the institution if you're collecting sensitive data and there's a breach or something. Right. But I did say earlier that we're trying to collect a lot of data, and I have a colleague here, Chuck Jubin, an educational researcher and statistician, who likes to say that uncollected data cannot be examined or analyzed. So sometimes you don't know that you needed something until two years after you've collected it. And if you don't have it, you can't go back and get it. It's too late. So I'm of two minds. One, I agree, we're collecting an awful lot of data that is maybe none of our business. But at the same time, you never know, assuming we can put the appropriate protections on those data and that it's properly secured so there's not going to be a breach or whatever. Storage is not the most expensive part of the data ecosystem, as I understand it. I'll caveat all this by saying my role is not as a professional data expert, nor am I a programmer or a database person. But my understanding is that a lot of storage can be gotten for relatively affordable prices now, and you can kind of park some of that out there. It's the processing and analysis where it gets expensive. So maybe I'm a bit of a hoarder, but I would err more on the side of trying to collect as much as you can in the interest of trying to be as helpful as you can.
23:36.65
David Brunner
And, you know, absolutely. And yeah, it makes perfect sense. With all this data, how do you communicate the analytics to the students and faculty in plain language? So they can, you know, meet it instead of ignore it or not collect it, you know. What's the trick, so to speak, in having all this data that you have and utilizing it for the average student and average faculty member?
24:16.22
Thomas Cavanagh
Yeah, it's a good question. And I guess your previous question did touch on consent; I'm not sure I really answered that part, but I do think it's really important. None of this should be a secret. So I think we need to be really transparent with anyone we're collecting data from, students and faculty, so that they know it's being collected. And there are ways to do that. It can be part of onboarding for faculty and students. It can be part of a website: "Here's what we collect, just so you know," and "Here's how you opt out if you don't want your stuff collected, and here's what you can't opt out of, usually for legal reasons." So I think there are ways to make sure people know, have that informed consent, and have the opportunity to opt out of what they're able to opt out of if they want to. But, you know, it's interesting when we've talked to students in the past about data. We've built some dashboards in the learning management system for advisors, and we're pulling course data. And when we've asked students, you know, how do you feel about that? They all assumed we were doing it anyway. They all kind of assumed we had all these data and were using it, and they were unbothered by it. I think the faculty are a little more concerned about what we're grabbing and collecting, but they also want as much data on their students as possible, because they want them to be successful, and that can help. If we're collecting data on faculty themselves, they start to have opinions about that. But by the same token, you just need to be completely transparent about everything we're doing and not hide it. Because otherwise narratives get made up, and nefarious reasons that don't actually exist can be invented, when it's just easier to tell people the truth.
26:16.23
David Brunner
Yeah, absolutely. Another question that I have is about edtech teams wanting their analytics to matter for campuses. We talked a little bit about a few, and you mentioned one metric that you would retire, but which data fields or feeds should edtech companies standardize to plug into institutional decision cycles?
26:49.29
Thomas Cavanagh
Hmm. So, by edtech teams, do you mean like from the supplier vendor community, or do you mean within an institution?
26:58.74
David Brunner
Yeah, the vendor, third-party side, where those companies can walk alongside institutions and help standardize some of what you mentioned. You know, UCF is a larger campus with a large number of students, and not everybody has internal resources like you do. So, for those that have to rely on vendors or third parties, where would that come into play?
27:25.18
Thomas Cavanagh
Yeah, I'll go back to what I said when I was talking about adoption metrics, which are important, but not the most important. The most important are the ones that point to impact. So things like grades and grade improvements, and especially retention and time to graduation, metrics you can actually put a dollar figure on. Those speed-to-graduation and retention numbers make a big difference. And something we like to do with our commercial partners is collaborations where they have access to certain data in the platform that we don't have access to, and we have access to certain student performance data that they don't. We try to work together to figure out the best path moving forward. And then we'll present on these at conferences jointly with our partners, or we'll publish with them in academic journals. It's a great partnership, but at the end of the day, it's all grounded in those impact metrics.
28:47.22
David Brunner
Absolutely. I have a couple of lightning round questions for you, Tom. One faculty development move that pays off the fastest.
29:05.41
Thomas Cavanagh
Yeah, I'm going to have to give you more than one. I'm sorry. Uh, it's not so lightning after all, I guess, in the lightning round.
29:10.17
David Brunner
That's okay.
29:14.97
Thomas Cavanagh
Um, well, one very practical, mercenary thing that pays off is paying faculty. If you can pay them to go through formal faculty development, you will get a much different response than you would otherwise, if you're not mandating it. So paying faculty for their time so that they can become better instructors is a worthwhile investment. And that's what it is: you're making an investment in them, their career, and ultimately the quality of the institution. But then, from a curricular standpoint, two themes always recur, because we do a lot of faculty development; we train all the faculty who teach online here in our division. One is course structure, a consistent weekly module structure. I'm talking more specifically about online and blended courses right now. Having that structured format is enormously helpful for students who are really busy; they know what to expect and when things are due, and they can plan around it. It makes a huge difference, actually, in student success. And then being as responsive as you can. This is not limited to any particular modality, but responding to emails within 24 or 48 hours, and just doing the things that a teacher who cares about his or her students does, makes a huge difference in a student's perception of that course, their perception of their own abilities, and their ultimate performance.
31:16.09
David Brunner
Thank you. And again, lightning is super relative. So how about one AI use you'd greenlight this year and one AI use you'd defer?
31:28.95
Thomas Cavanagh
What I would greenlight: we've got some amazing tools we're building here in the division, but I would greenlight more development in the use of AI in course development and delivery. Especially, and this is my holy grail, I know it's common, an AI teaching assistant that lives in the course, is available 24/7, is kind of infinitely scalable, and can answer 70-80% of the main questions a student might have. For big schools like ours that have some larger class sizes, that would be a game-changer, just having that presence. My mental model is what Ashok Goel did at Georgia Tech with Jill Watson, building a virtual AI teaching assistant. I would like to see that scale nationwide. And then what would I defer? I don't know. I'm a little technologically determinist on this. I think we should continue to push the boundaries on the use of AI: how can we leverage it in a lot of different domains and areas? So right now, I'm not telling my team to stop anything except in areas where it might be redundant. Like, if we have two platforms that do sort of the same thing, we don't need them both; let's pick one and standardize on it. But I honestly think it's so early in AI's development that it might be too soon to say, stop doing this. Although you did say defer, not stop. So, yeah, it's a question of prioritization.
33:26.65
David Brunner
Sure.
33:30.01
Thomas Cavanagh
So yeah, to be determined, I think right now we're just trying it all.
33:36.89
David Brunner
Yeah. Well, fantastic. Well, Tom, my last question for you, kind of closing it up, you know, is as data and technology evolve, what gives you the most hope about making learning more personal, fair, and effective?
33:58.39
Thomas Cavanagh
You know, it's the faculty. The folks we work with care so deeply about their students' success. They want them to do well. They advocate for them. For example, when we buy a platform, if we can't afford an institutional license, which is often an issue for us at our size, sometimes they won't adopt it if it means students are going to have to pay for it themselves. They care that much about the cost to students. Even if it's a $20 fee or something, they're like, no, I'm not going to add any more financial burden to my students; I'm going to figure out something else. And I think that carries over into all of these other aspects of edtech in higher ed, where faculty want to make the right choice, OER versus non-OER, this platform versus that platform, using AI or adaptive learning. They're using the right yardstick for making these in-the-trenches, classroom-level decisions, which is ultimately what's in the best interest of the students. And as long as we stay there, I think we're going to be okay.
35:21.17
David Brunner
Well, Tom, I think that is the absolute best way to sum up this conversation with what you just said, which is student first. We've talked so much about technology, data, and AI. And again, University of Central Florida being such a large school, but everything you've said has boiled down to student first.
35:44.76
Thomas Cavanagh
Yep.
35:44.73
David Brunner
And because of that, I know the students and the faculty you work with definitely have that benefit, which is always assumed but maybe not always practiced. You've showcased it through this whole conversation. And I will just say it's been a pleasure to get to know you and what you do there at the University of Central Florida. We'll continue to follow you on LinkedIn and your posts, and as you continue to grow your footprint, we're excited not just for the students, but for you and the university as well.
Thomas Cavanagh
Thank you. Thanks for having me on the podcast. I really appreciate it.
36:30.01
David Brunner
Well, thanks so much, Tom.