Tech in EdTech

Are We Measuring the Wrong Things About Retention?

Magic EdTech Season 1 Episode 85



Some students log in, submit assignments, and still drop out. So what are we really measuring? In this episode, Shaunak Roy explains why most engagement metrics fail to predict retention and what institutions should be looking at instead. From early risk signals to faculty workload and AI hype, this conversation challenges how higher ed thinks about engagement and what it takes to keep students on track.

00:01.59

David Brunner

Hello and welcome to another episode of Tech in EdTech. I'm excited to be here today with our guest, Shaunak Roy, CEO and founder of Yellowdig. Today's core theme is that engagement drives retention, and it deals with accountability for outcomes. So with that, I want to welcome Shaunak Roy.


00:30.94

Shaunak Roy

Good to be with you, David. I'm the founder and CEO of Yellowdig. It's a social learning engagement platform for higher education. We have been around for about 9 to 10 years now. In terms of my background and how Yellowdig came about: I grew up in India, then moved to the US about 20 years back, where I came for my master's. After that, like most immigrants, I got a job and started working in corporate America. Had a good experience through the process. But about 10 years back, I was in a phase of life where I had to make a choice: continue to grow in my corporate career, which was going fine, or do something new and drastic based on my passion and personal interests. I chose the latter and founded Yellowdig. The founding story of Yellowdig is always interesting. I always say that I've been very fortunate in my life: I had a great family growing up in India who valued education, and I ended up going to some very well-recognized institutions. I got into IIT Bombay for my undergrad, then came to the US and went to MIT for my grad studies. On that side, I've been very fortunate, for reasons I think we all understand: when you're in the right environment, the right things happen. But I always call myself a non-traditional learner. What I mean by that is I'm not one of those students who are the first-benchers in a classroom. I typically would sit in the back benches and never quite engaged in the formal learning environment. And I always used to wonder why it is that when I go to a class, and it has happened plenty of times with me, all of us have been in these classrooms.
We almost have to just get through the class, somehow make sure we have done the homework, and get the grades. It was always a painful experience, even though I love learning. That has never stopped: I always love to do projects on the side, pick up new things or new hobbies, try to do things in different ways. I've always been curious. So I always tried to bridge that gap: I love learning, but I just didn't enjoy the whole learning experience. That's where Yellowdig came in. We came up with this concept of social learning, an idea that came from the early versions of social media, even though we are clear that we are not a social media platform. It borrowed some of those concepts: people can connect with one another at any time and really create those human connections that are often missing. Through those connections, we try to engage students in different ways and make them excited about what they're learning. That was the whole idea. Of course, we have evolved over the years. We have been fortunate to work with some very respected institutions, both in the US and abroad, and with many talented faculty and administrators who have helped us design the experience we have today. It's been a long journey, and I would say quite a rewarding one to come to this point.


04:23.47

David Brunner

That's fascinating. You've mentioned the journey, and something I'm curious about is: since you first started, how has the conversation around student engagement changed in higher ed?


04:42.69

Shaunak Roy

When I started, and that was mid-2015, 2016, I would say the big difference is that engagement was a nice-to-have back in those days, whereas now, in most of the conversations we have, engagement has become a must-have. It has something to do with the massive pandemic we went through in 2020 and the behavior change at that time, with a lot of growth in online and hybrid learning, where it's more difficult to engage students. As well, we all hear about how difficult it is to admit or find new students for most institutions. Not for the Harvards of the world, but the majority of schools and colleges that cater to regular students are having a harder time recruiting. All in all, the need for engagement has grown substantially in the last 10 years. I would say it has become not just a must-have; without it, it is very hard to actually run an institution, because if students are not engaged, there are so many ways of getting disengaged. And now we have AI, which is also causing all sorts of problems. You can do many things on your own, and you don't have to go to a classroom to learn from your peers or instructors. So, without a proactive engagement strategy, it's very hard to run an institution these days. It has become a very important topic.


06:27.03

David Brunner

And you mentioned having a plan. When you're talking with institutions today, what is the retention problem that keeps coming up the most? You've talked about how some schools and institutions aren't dealing with the retention problem, but for the ones that are, what keeps coming up the most, especially with the enrollment pressures out there right now?


06:53.14

Shaunak Roy

You can think about the retention problem right from the day a student decides to come to an institution, any institution. The moment they make a deposit, they are essentially deciding whether they want to show up in a classroom. Typically, between the day they make a deposit and the time they show up in a classroom, there is a three- to four-month gap. And we are hearing more and more that there is substantial melt, where students decide not to attend that institution, go to another option they have available, or maybe delay their studies. So that's one challenge. Then, once they actually show up in a class, meaning they are in that institution physically or virtually depending on their program, it is getting harder to retain them through the first-year experience. Many students, when they start taking their first course, may decide that this is not the right program for them, or they may have questions about the value they are going to get from that experience, or they start to do different calculations about whether it's the right investment for them. That is also becoming more of a challenge for many institutions, so measuring that first-year experience is quite important, and we do see sizable retention challenges there. Then after that comes term-to-term retention: once you get through the first-year experience, how do you go from the first year to the second year? Every institution is different in how they measure these things. So I would say retention is not just a one-time problem. It has become a lifecycle problem for the students. Really understanding the students from the day they decide to come to the institution until they graduate is an entire funnel, and we have to work at every phase of that life cycle.


08:52.66

David Brunner

No, absolutely. And we talked about engagement: there are a lot of tools out there that can generate clicks, but few create actual momentum that carries students to completion. In your view, what separates engagement that looks busy from engagement that actually predicts course completion?


09:15.29

Shaunak Roy

Yeah, that's a great question. The biggest factor we have seen is whether the engagement is self-directed. We often use the term agency: students actually have the option to go in and participate, to contribute to the learning environment or whatever they are learning in that moment. That type of engagement, which is inherently more motivating, is what drives the right kinds of behaviors, including retention and persistence. The wrong kind of engagement is clicks. It's easy to create busywork. We often talk about how discussion boards are disliked in higher education, and the reason is that they are a forced engagement strategy: the instructor has to write an assignment the students must complete by a certain time, or a group project they have to do. If students don't feel motivated to do that, they just go in and get the work done. That counts as an engagement point, because they logged into the system and completed the process. They can even sometimes cheat using AI; that's a growing problem now. But that kind of engagement does not lead to the right outcomes, because it doesn't change behavior. It's not inherently motivating for the students. So for us, when we talk about engagement, yes, we do track clicks, because that's important. But we are also looking at other factors: how often students contribute on their own to the learning experience, how often they reach out to their peers and have conversations, how often they interact with the faculty, and what type of content they're writing. Is it just an “I agree”, or is it something a little more thought-provoking?
Being able to really understand some of those behaviors on the platform, and nudge students toward the right kinds of behavior, is what I call good engagement.
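[Editor's note: as an illustration of the distinction Shaunak draws between clicks and substantive contributions, here is a minimal sketch of how a platform might flag low-substance “I agree” replies. The phrase list, threshold, and function name are invented for the example; this is not Yellowdig's actual scoring model.]

```python
# Hypothetical illustration: separate low-substance replies from substantive posts.
# The phrase set and word-count threshold are invented for this example.
LOW_SUBSTANCE_PHRASES = {"i agree", "me too", "great post", "+1"}

def is_substantive(post_text: str, min_words: int = 25) -> bool:
    """Return True if a post looks like a genuine contribution rather than busywork."""
    normalized = post_text.strip().lower().rstrip(".!")
    if normalized in LOW_SUBSTANCE_PHRASES:
        return False
    # A crude proxy: substantive posts tend to carry enough words to develop a thought.
    return len(normalized.split()) >= min_words
```

A real system would go far beyond word counts (topical relevance, replies generated, and so on), but even this toy version shows why raw click totals and post counts overstate engagement.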


11:22.39

David Brunner

Yeah, absolutely. One of the things institutions are talking about right now is, obviously, improving retention. You've touched on this, but how should institutions define retention for engagement initiatives so that everyone is aligned on the same outcome?


11:43.93

Shaunak Roy

Yeah, I would say the first thing is to think about how to measure retention. One of the challenges, as I'm sure we have all seen, is that a student can drop out of a program for a variety of reasons. Some of those reasons may not be in the hands of the institution: maybe there's a life change, they move somewhere, or there's a financial challenge or a family issue. Things happen, and that's what sometimes causes students to drop out. But oftentimes the final decision to drop out is not just the financial challenge, right? There are stories we hear where a student decided to drop out because of a financial challenge, and the immediate question becomes: have you really looked into other options for financing your education? Have you looked into all the resources available at the institution? And the answer sometimes is, “oh, maybe I should have.” The question is, how do we make sure that whoever is making that decision is supported in the right way? That they have the right information available and the right connections with their advisors? Those kinds of challenges are sometimes not visible, because we only look at the dropout data. Truly understanding student behavior from as many different systems as possible is quite important if we want to really answer the question of retention improvement. The other thing we like to do is studies with our partners, where, if they are open to it, we will implement our platform and try to see how it changes student behavior, and whether those students are more likely to complete their courses or be retained.
Those kinds of studies are more difficult to do, but they are the best way of really understanding whether a solution is actually having a retention impact. What's not good is that it's very easy to think something is working, but when you really dig into the data, it might turn out that it's not. Just because something seems like a good idea doesn't mean it changes behavior; that's the reality. The other thing that often happens is that institutions have exhausted every strategy to improve retention, such as having a retention team, having an advisory team, and making sure students are always attended to. Those strategies are expensive, and they've all gone through that, but they haven't really looked into digital strategies. Most students these days, whether they go to an in-person institution or are fully online, are digital natives. They are on their phones for hours and hours, maybe 7-8 hours a day. So thinking carefully through their digital experience, and seeing whether that experience is conducive to an environment where they are more likely to be retained, is a very important factor. And as you know, in higher education that has been a challenge for quite some time, because most polls will show you that the net promoter scores of the various tools schools use are not very high, and that's a problem. So for us as a tech company, we try to dig into that kind of data, understand what's working and what's not working for them, and then target parts of the student experience, which could be fully online or hybrid, and really try to engage students through those processes.
For example, a student may not show up to see a faculty member or at a meeting with advisors, but they are more likely to respond to a nudge on their phone, as long as the nudge doesn't feel like an intrusion but is a very natural, good experience. Really focusing on those experiences is what truly drives some of the retention numbers up.


16:12.26

David Brunner

Yeah, and I don't want to keep talking about retention, but it's just so important in the higher ed space, with enrollment numbers the way they are today and moving forward. One last question, very specific to this: you've leaned into the indicators across different institutions. When you think about leading indicators, what are a few behaviors that correlate most reliably with persistence, not just participation?


16:46.04

Shaunak Roy

I think the leading indicator that is probably hardest to measure, but probably most indicative of likely retention, is motivation. If a student is not motivated to be in that environment, whether it's a classroom or a program, and they're just checking the box to get through, being able to identify that behavior early on is probably the best indicator of retention risk. On the other hand, if the student is motivated because they are excited about being in the program, they love their peer group, they love their friends, and they're already forming bonds with their faculty or whoever is helping them through the process, we see that as a strong leading indicator of that particular student's future retention.


17:40.98

David Brunner

Yeah, and obviously motivation changes, just like the students do, throughout their collegiate career, so I'm sure that's always a bit of a moving target. One of the things we hear a lot is that faculty are willing to support engagement, but we also understand that faculty are sometimes very overloaded. So what does a low-lift faculty role look like, one that still boosts community quality without turning instructors into full-time moderators or motivators?


18:19.61

Shaunak Roy

Yeah, so I think this is where technology can play a role. One thing we talk about is to think about education, say, 10-20 years back, when it was fully in the classroom. When I went to college, for example, or to high school or middle school, we did not have any digital technology. At that time, once a faculty member closed the classroom door, it was essentially all on the faculty or the teacher to motivate the students and get them excited enough to really drive outcomes. Which is why all of us remember our best teachers, right? They had a disproportionate impact on our lives because they did a great job purely based on their own talents and their own interest in being a great teacher. But if you look at modern times, more than half of classrooms are now hybrid; they do some parts fully online and some parts in class. A lot of these courses and programs use resources built by companies like us, or by big publishers creating digital content. So right now there is a heavy reliance on technology, which could be good or bad. The point I would like to make is that the role of technology in making sure the faculty experience is good, that faculty are not overloaded, and that the experience is designed for the right outcomes for students, is quite important. For example, in our case, we use gameful learning to reduce faculty load and improve student engagement. You might ask how that happens. It's not that different from having a great group of people: when students are able to really engage with one another in a meaningful way, you don't need a faculty member to instigate the conversation, because people are engaged already.
The faculty member becomes a guide on the side; they act as another student, an ideal student who is part of the community, as opposed to being responsible for driving that community, which becomes a lot of work. So to that point, I think this is where technology has a big role to play, and it is playing it. That's what really gets me excited: seeing what we can do to make the faculty experience better so that they can do what they love to do. Every faculty member we meet got into the profession because they love to teach. Helping them do that, and supporting them in the most effective ways, I think is quite important, and possible.


21:05.66

David Brunner

Absolutely. Faculty sign up for a passion, a career they have an interest in, and then that shifts throughout their career. So giving them that support through technology, allowing them to keep up with their passion, makes so much sense. Now, a lot of engagement systems can sometimes amplify the loudest voices and leave the quieter students behind. How do you design community mechanics so participation is equitable and not just dominated by a small subset of students who happen to be the loudest?


21:47.48

Shaunak Roy

Yeah, no, 100%. What you're describing is very possible. Imagine a real classroom or a real seminar: we've all seen the scenario where five people speak for half an hour and the rest never get a chance; there is no equity built into the system. Before answering your question on how to do it, I would say that technology actually has a chance here, because it is not bound by what I would call time and space. If you think about a classroom, you are bound by time: you only have an hour, and within that, everybody has to speak. We know that the quietest student, or the student who is not quite comfortable or doesn't have the confidence, is not able to raise their hand and make their point, while somebody who is a native speaker and already quite confident might take the whole space. So the time constraint actually makes it less equitable. Technology, by contrast, is by design much more asynchronous. In Yellowdig especially, you can interact at any time; there is no time bound. And there's also no space bound, meaning that where a classroom can only fit, say, 30 students, a Yellowdig community or any other technology can have 3,000 students if you design it well. So just by design, you can include a lot more people and be a lot more inclusive in designing this experience. The potential is there with technology, but that doesn't mean it will naturally happen, which is where design thinking and designing for that outcome become important. In Yellowdig, for example, one thing we pay attention to is that everybody is engaged in the environment.
If somebody is speaking or contributing too much, we have mechanisms in place, I won't say to reduce that, but to create less incentive for them to participate more, and more incentive for the people who are not participating. We use our point system, our badges and accolades, and various other mechanisms to make that happen, and we give instructors tools to do the same. One of the metrics we look at is what percentage of the whole class, treating the classroom as a community, each individual student is interacting with. To give an example, in a typical learning environment we find that one student interacts with, say, four or five students who tend to sit in the same part of the classroom or be in the same group. That could be maybe 5% of the class. So even though they're in the classroom, they're only interacting with 5% of their peer group. In Yellowdig, by design, we aim for at least 50% coverage, meaning every student interacts with at least 50% of the classroom, and we try to incentivize them to go even higher, like 60-70%. So yeah, there are ways of doing it, and absolutely, it's a very important topic.
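[Editor's note: the 50% coverage metric described above could be computed from an interaction log roughly as follows. This is a sketch under assumed data shapes (pairs of interacting students); the function names are hypothetical, not Yellowdig's implementation.]

```python
from collections import defaultdict

def interaction_coverage(interactions, roster):
    """For each student, the fraction of classmates they have interacted with.

    interactions: iterable of (student_a, student_b) pairs, e.g. from posts/comments.
    roster: list of all students in the community.
    """
    peers = defaultdict(set)
    for a, b in interactions:
        if a != b:  # ignore self-interactions
            peers[a].add(b)
            peers[b].add(a)
    n_classmates = max(len(roster) - 1, 1)  # everyone except the student themselves
    return {s: len(peers[s]) / n_classmates for s in roster}

def below_target(coverage, target=0.5):
    """Students interacting with less than `target` of their classmates."""
    return [s for s, c in coverage.items() if c < target]
```

With a metric like this, the students a nudge or incentive should focus on are exactly the ones `below_target` returns, rather than the already-loud contributors.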


25:07.90

David Brunner

Yeah. So, moving on: oftentimes institutions will try pilots, and they often work really well in the beginning because of the enthusiasm, but few sustain that enthusiasm over time. What's the most common reason an engagement pilot fails to sustain past the first term?


25:34.88

Shaunak Roy

Well, there are many reasons pilots fail, but I think the biggest one is poor implementation. Technology can only do so much; there are certain roles each of the users has to play. For example, training up the users, meeting the instructors if they're part of a pilot, and making sure the instructional design team is on board. If it's a pilot outside of courses, there are enrollment leaders, admission leaders, and others who might be involved, and they need to be trained up so they can use it well. Those are the biggest failure points I see. Sometimes what happens is, like anything else, a pilot gets started and the people who were supposed to be part of it get reassigned to another job, or something else happens and they don't have the time to invest. In those scenarios, pilots can go sideways very easily.


26:36.95

David Brunner

Awesome, well, thank you. Staying on pilots and the first few weeks of a rollout: what are some early warning signs in the first few weeks that a rollout is starting to drift? And what's the simplest intervention to correct course?


26:58.42

Shaunak Roy

I would say it all starts with the relationship with the champion who is responsible for the pilot. Typically, in our case, when we are launching a pilot, we have an executive sponsor who is sponsoring it and who has a very clear business case we want to address and measure as part of the pilot. Then we have an implementation team responsible for training, for making sure onboarding has happened properly, and for getting access to the right data feeds to measure the impact. Having that in place is probably most important, and especially early on, if issues come up, being able to reach out to the right people and have a conversation about what's going wrong. The challenge sometimes is that a school is balancing multiple priorities. On one side, they have to run their existing operation: the classes cannot stop, the students have to be taught, and they have to do the right things across all their priorities. At the same time, they're working with a vendor who has, say, launched a new tool into that environment. The show has to go on while the new tool is being tested. So being cognizant of the complexity most of our clients are dealing with is quite important, as is being able to talk to them when something is not going well. It's not that difficult for us, or for any technology vendor, to pinpoint the problem, because we tend to see issues early in the data. The challenge is always communicating that to the right stakeholders and making sure we can course correct without creating more complexity or damage. That's important. And another thing: as a company, one of our guiding principles is making sure the student experience never suffers.
Whether they use our platform or not, at the end of the day we want to make sure that anybody who is using our platform has a better experience; even if they, say, decide not to work with us, we don't want to harm them in any way. Really keeping that in mind as a guiding philosophy means that if we have to have a difficult conversation, if something is not going well and we have to step in and fix it, we will do that, with the idea that we are ultimately catering to the students and their experience. That definitely helps.


29:31.00

David Brunner

And speaking of that, of owning the outcome and mutual accountability: when a campus says they want engagement, how do you translate that into an outcome that's measurable, realistic, and worth being accountable to?


29:49.21

Shaunak Roy

Yeah, in terms of getting mutual alignment, in my experience most of our clients love it. They love the fact that a vendor is willing to put their money where their mouth is, be accountable for the results, and not just cast blame when something doesn't work. They welcome that conversation whenever I've had it, and we are having more of those conversations. We recently published a release where we said we are going to guarantee a 1% improvement in persistence if they decide to implement Yellowdig. That sort of guarantee is definitely leading to more conversations for us, which is great. But actually making sure we can stick to it, and that it doesn't fall through, goes back to having that strong alignment. For example, we set up regular check-ins with our sponsors and make sure they're aware if there are issues or things change. If something has happened in the last couple of weeks and we are seeing issues, we want to have a conversation with them and make some decisions: okay, if this is not going well, how do we fix it as soon as possible? Being able to have those conversations is what I think is quite important, and that's where the biggest change is happening. Traditionally, vendors in higher education were seen as third parties who were not accountable in many cases, right? They were only accountable for providing the solution, not for the results, because in many cases they were not even expected to interact with stakeholders like faculty or staff. But I think we are changing that paradigm by essentially owning the results.
And as a result, we are forming deeper partnerships with clients. I do see that as a win-win, but it's a cultural change in many cases, and it's something we are happy to do. It's important for us as a company, it's important for the space, and it's where we will continue to make investments.


32:05.18

David Brunner

Yeah, mutual accountability is key for success at these institutions, as is working with them to diagnose whether something is an implementation issue or a fit issue. The key is, like you said, having those regular check-ins and aligning with each other on what's best for the student and the institution. That's fantastic. Now, transitioning a little to something that has been underlying this whole conversation: AI. Setting aside the hype, where in the engagement and retention space is AI genuinely useful right now, and where is it mostly just noise?


32:49.24

Shaunak Roy

Yeah, I would say there's a lot of noise right now; that's what I'm seeing, because clearly we are in a hype cycle with AI. There are clearly some advantages to using AI, but there's also a lot of hype around how AI can transform learning in a week, or months, or years. We all know there's nothing new about technology in education. Even some of the earlier technology innovations, web, mobile, social, and the many other pieces of the modern IT stack we've invented, still haven't had their full impact in higher education. It is a slow process, and the reason is by design: education, at the end of the day, is not just an adoption of technology; it's about creating the right environment for better learning outcomes and better retention through an experience. Those things are a lot more complicated because they are not technology; they are pedagogy and other deep ways of impacting students. That is much more complicated than just having AI solve a problem. So my view is that AI is impactful, but we are definitely in a hype cycle, and I think that's going to play out over the next year or so. In terms of AI being impactful, I see the biggest impact in efficiency gains for faculty, administration, and others in the institution. But introducing AI into the learning process itself is probably the hardest and riskiest right now. I have many examples of pilots that had pretty high expectations around providing really good support on the learning side of things, but those pilots are rarely successful. I've been in rooms where I've heard that maybe 98% of those pilots have failed, and only one or two percent have actually succeeded.
But if you look at the buzz and the funding, probably 50% of it is around exactly that right now. So, you know, I'm not a person who would say that AI doesn't have an impact there. It's going to have an impact, but it's going to be a much slower process, because there's a lot of hard work to be done to design experiences that have a real, high impact. Meanwhile, we are focused on AI as well, but our focus has been primarily on efficiency gains: helping our instructors and administration, creating AI dashboards that merge data from a variety of systems so they can see the impact much faster. Those kinds of applications, I have seen, have a lot more success.


35:50.01

David Brunner

So, you talked a little bit about some of the successes in AI. Here's maybe a hot take. What's one AI-related assumption you think higher ed is getting wrong that's going to become obvious, let's say, in the next year?


36:09.11

Shaunak Roy

I think the biggest assumption, and this might be a hot take as well, is the hype around AI chatbots as virtual TAs. I think the hype started with Khanmigo; Khan Academy launched their Khanmigo chatbot, which students can ask questions and get answers from. What I find is that a lot of the chatbot implementations are so basic that they really do not augment learning. They are essentially what I would call a tier-one support structure. So I think a lot of them might actually move toward just supporting basic Q&A for students. And a lot of the learning is going to move over to much deeper engagement strategies, like simulation or social learning, really building deeper learning experiences, as opposed to a chatbot that gives you answers or helps you with your learning. I think those may not turn out to be as transformative as we think they might.


37:13.76

David Brunner

No, I think you're definitely onto something. So, the next phase of this discussion is something we like to do towards the end, which is like a lightning round. And so I'm going to ask questions, you know, kind of rapid fire, and you give me your best answer. So, one engagement metric that you would demote immediately.


37:40.21

Shaunak Roy

Activity.


37:41.88

David Brunner

Fantastic. One design habit that consistently boosts persistence.


37:52.26

Shaunak Roy

Designing for student agency and motivation.


37:57.85

David Brunner

Ok. And one rollout mistake that kills engagement initiatives fast.


38:08.73

Shaunak Roy

You know, I would say not focusing enough on training the educators and others who are using your technology, and trying to get it out too fast.


38:21.85

David Brunner

Okay, my last lightning round question. One thing vendors should stop saying about AI in higher ed. And, you can take a little while on this one because I think you might have some opinion.


38:34.90

Shaunak Roy

Well, I think, as you know, there's a huge hype cycle right now, and I think the risk is just this over-promising. Essentially, there's a lot of promising happening, that AI tools are going to make learning faster, easier, and ubiquitous. I think there is potential, but I think we should tone down the promises, because a lot of these are going to be experiments. I think more than 90%, maybe 95%, of these technologies will fail. And what we don't want to do is create a huge amount of distrust between vendors and institutions, because that trust takes a long time to build. So that's something I think we should tone down.


39:19.86

David Brunner

Yeah, that's fantastic. And great job with the lightning round questions. In closing, I just have a couple more questions for you. This has been fantastic, and I really appreciate your time and your in-depth knowledge of retention. The first of my last two questions is: if a student success leader wants to improve retention this term, what's one action they can take to make engagement more reliable?


39:53.27

Shaunak Roy

I would say the first thing is to really think about why students are going to choose them over the variety of other options that are exploding right now, right? For example, students now have options to pursue a low-cost program, or maybe take a gap year, or maybe start a startup on their own. So the optionality has gone up, and I think for enrollment leaders or retention leaders, the question becomes why students would choose them, because the dynamics have changed. And then really think about ways to improve the student experience so that students keep on choosing them. So I would start with student experience; I mean, there's a long answer to that. And especially focus on the digital side of things, because that's where students are spending most of their time, and do rapid piloting of different technologies.


40:55.58

David Brunner

Awesome, and in my last question, if you had one piece of advice for EdTech product leaders who want to be judged on outcomes, not features, what would it be?


41:07.73

Shaunak Roy

I would say working with the educators, the users, directly. Sometimes in edtech we create our own hype by chasing the shiniest object, which is always coming, especially with AI; there's something new every week now. And we can build this wonderful tool that we think is going to change lives. But the real feedback on all of this comes from people who have done this for decades and decades, and there are many of them available, and they're actually super eager to work with us. So I would say: work with your users and try to see what is actually moving the needle. It takes time, and it's a little more humbling, but I think that's the right path.


41:48.95

David Brunner

Awesome. Well, Shaunak, thank you so much for your time today. It's amazing to hear your perspective and what Yellowdig has been able to do and what it's doing moving forward. And so, I appreciate your time, and thanks for joining us.


42:07.07

Shaunak Roy

Yep, David, I really enjoyed the conversation with you, and thanks for the invitation. Have a wonderful rest of the day.


42:16.81

David Brunner

Thank you so much. You too.