Tech in EdTech

Balancing Innovation and Practicality in Curriculum Design

Magic EdTech Season 1 Episode 56

Sam Nelson, Director of Technical Curriculum at Multiverse, discusses the intersection of technology and curriculum design in the context of apprenticeships and workforce training. Sam offers valuable advice on balancing industry-specific skills with pedagogical best practices, the importance of using AI and new technologies thoughtfully to enhance learning, and the critical role of curriculum design in driving successful workforce development.



00:01.20

Eric Stano

Hi, this is Eric Stano with Magic EdTech, and this is our podcast series, Tech in EdTech. Today I'm joined by Sam Nelson, the Director of Technical Curriculum at Multiverse. Welcome, Sam.


00:16.04

Sam Nelson

Thank you, it's great to be here.


00:18.57

Eric Stano

So why don't we just get started and give our listeners a chance to get to know you a little bit? So could you share a little bit about your background and your journey through edtech? And then bring us up to your current role and what you do at Multiverse.


00:35.03

Sam Nelson

Sure, I'll start at the beginning, when I first got interested. I started my career as a data analyst for a consulting firm. We did analysis for big corporate litigation cases where we estimated damages. I had studied economics, and I went to a company that did economics. Then I had to learn how to code and do a bunch of other things, and I thought, man, I spent all this time in school and now I'm still going through a painful training process at a new company. There should be better ways to do this. So after a brief stint at a small B2B software company, I decided to go back to school. I did a joint MBA and Master's in Education program at the University of Virginia and then started working at Udacity. I got really excited about the alternative higher education, or lifelong learning, space, given my experience entering the workforce and having to learn how to code. I worked on their data programs for about three and a half years, and went from one data program to seven programs, and from a few million dollars a year to a 20-plus million dollar book of business in that area of Udacity. Now, that was a tumultuous time there. I survived something like five rounds of layoffs and seven bosses and decided it was time to move on to something else. I joined Chegg's bootcamp arm, Thinkful, to run the instructional design side of their product team, and there we built out new bootcamps and expanded on their model. Then I was recruited away. At heart, I love data, so I was recruited to lead product at a small data science learning company called Dataquest. Very cool company. I was there for a few years before a brief stint at a company called Ziplines Education, where I consulted on some new program creation, before I landed at Multiverse, where I'm one of their three directors of technical curriculum. Here at Multiverse, I'm working on building out their US curriculum offering.
So it's a UK-based company that's done a lot of apprenticeships and that's how they've built their business. And now they have a much smaller presence in the US and so we're trying to figure out how to go to market in the US and address the unique needs of this geography versus the UK market.


03:13.65

Eric Stano

Very nice, very nice. One of the reasons we've come together today is conversations you've had with my colleagues, specifically in your current role, but drawing on your experience across multiple companies, around the challenge of integrating platform, curriculum, and support. Can you give us some insight into how you maintain consistency in learning experiences across different platforms and different support systems? What sort of strategies do you recommend for ensuring learners, from your vantage point, have a cohesive and effective learning experience regardless of the delivery method?


04:00.38

Sam Nelson

Yeah, there are a couple of frameworks I use to guide my decision-making here, and they're helpful when I talk to other people because they lay the foundation for how to integrate these things. The first is understanding the difference between a true technology company that sells education technology versus what I would call a tech-enabled school, which doesn't actually sell technology; it sells a learning experience. The approaches for those two types of companies are quite different. If I'm a true education technology company, I'm selling technology, and I'm selling it to education institutions, so the focus is on that technology being the product. If you're a tech-enabled school, which is still often just referred to as education technology, you're selling a learning experience, and the technology is not the end goal; the technology enables the learning experience. I call these learning products. A learning product is, essentially, a learning experience. And that's where you want to integrate the platform, meaning the technology part of the learning experience, with the curriculum, meaning the learning goals, the content, and the other learning methods you use to achieve those goals, and then the support: what learners do when they get stuck. That support can be automated, it can be in person, it can be remote and live; there are lots of different support models. When you're thinking about your learning experiences, it's the integration of those three things. So when you're thinking about your product roadmap, you need to think about how those three things work together and ensure your roadmap reflects all three. Let me give you an example.
If I'm going to create a new program, for example, that often sits just on the curriculum team's roadmap, or the learning team, as we call it; I've seen it called the content team, the instructional design team, the learning team, all the same thing. But if it's on the learning team's roadmap, it often also has to be on the platform team's roadmap and the support team's roadmap. I can't really make a roadmap of things I want to build on the learning team without integrating with those teams. For example, say I want to create a new program on data engineering and teach a new technology. That means on the platform side we need to integrate that technology into the platform so it enables the learning, and on the support side we need to hire a different type of coach, or, if we have an AI, upload a different set of content and train that model to support this content. So you really have to have a joint roadmap. Where this breaks down is when these teams have separate priorities and separate KPIs to hit. I see this challenge especially with platform and curriculum. Often, because at a tech company, or what I call a tech-enabled school, the product team is really in charge of the platform, we forget that the curriculum team is also a product team and operates like a product team. So both teams say, well, we're the product. No, you both are the product. The most important thing there is that leadership is bought into that idea, that it's the combination of the two, and that teams aren't territorial, trying to secure resources for themselves, but instead think about what we're really trying to build as a company and how we get the right people working on the right things.


08:09.02

Eric Stano

Right. And you invoked the fact that each team has its own KPIs and its own roadmap, and certainly having the leadership of each team bought into similar goals, the same goals, is important. Are there other tips, beyond leadership and aligned KPIs, that you would give to others to foster collaboration among those teams?


08:40.82

Sam Nelson

Yes, absolutely. So leadership alignment is very important. At the next level, think about what a product manager does, or the equivalent of a product manager on the curriculum side; we call them technical curriculum leads, and at previous companies we called them product leads. When you think about your role as leading a product team, think about how support plays into it. If I'm a product manager on the platform side, I've primarily worked with designers and engineers. But when you have a project you're working on, ask: how should I incorporate learning science and learning design into this? Right now, for example, our product team doesn't have a ton of learning expertise, but we have a ton on the learning team. So how do we incorporate that? Where do we need them? And how do we also involve the support side of things? One of the products we're working on right now for our new subscription product is group coaching: groups of two to six people in small-group coaching sessions. How we design that product should certainly be influenced by the capabilities we have in our support team, or our delivery team, as we call it. And how do we structure it from a learning science perspective? What content should we cover in there? How do we structure open-ended questions? How do we enable a really high-quality coaching session, and what does that even mean? The product team doesn't actually have that expertise. So as a product manager, make sure you're no longer working with just engineers and designers; you're also working with learning designers and coaches. Same thing on the learning side: if I'm building a new course, what platform features would enable the ideal learning experience here?
And how do I involve our engineering, product, and design teams to enable that? So think about a product pod as no longer just the typical product manager, engineer, and product designer; you also have a broader group of people.


10:55.27

Eric Stano

Right, and you invoked something here; I want to shift gears ever so slightly, but I think you gave us a good jumping-off point. You mentioned those who actually design the curriculum, or curricula, rather. One of the challenges we know, I've seen it from my own perch and I'm sure you've seen it too, is that curriculum designers often face challenges in balancing industry-specific skills training with maintaining pedagogical integrity. How do you address the specific challenges and requirements of each sector, such as data engineering versus product management? How do you thread that kind of needle?


11:41.46

Sam Nelson

Yeah, that's a challenging one. Just to reiterate the challenge I've seen: each of the education companies I've worked at, and I've seen many others, starts off in one area where they see a lot of success. A typical example is a coding bootcamp. They find a model for a coding bootcamp: learners do XYZ, and at the end they have a portfolio of things they present and get jobs. Then they say, well, let's expand to the next thing, and the next thing, and you slowly get less similar to your original product. So now let's say you're doing product management. Product management is incredibly collaborative, so the approach you take to a good learning experience needs to be very different, because you can't just have an engineer coding things. I mean, engineering is collaborative too, but it's a lot easier to build something valuable by yourself in that context.


12:33.49

Eric Stano

Sure.


12:35.30

Sam Nelson

But you can't be a good product manager unless you learn how to work with other people, not just engineers and designers, but also other stakeholders. So thinking about that balance: one part is being okay with saying, you know what, we don't have the capabilities internally to do this. We did that at two companies. We said, we should do cybersecurity, then started to explore it and decided we couldn't do it well, because you have to integrate these kinds of labs and simulated environments, and we'd have had to make a big engineering investment that just wasn't worth it.


13:17.43

Eric Stano

It's a tough call to make, too, when you're committed to the enterprise, to say, OK, that's maybe not for us.


13:25.00

Sam Nelson

Yeah, especially in a field as big as cybersecurity, with as much money as is getting invested there.


13:28.43

Eric Stano

Certainly. Right. It's growing and growing, of course. Yeah.


13:32.80

Sam Nelson

Yeah, so that's one tip: be okay with saying no if you can't do it well. For example, say you want to do product management, but you're on a completely asynchronous platform. Then be clear on the learning outcomes you can actually help people achieve. You can teach people a lot about product management in an async environment, but you can't really say they're going to be job-ready at the end. Whereas if you're doing something with more social learning, live cohorts, and things like that, you can put people in groups and have them do group projects that better simulate product management work. I think this comes down to what I'll call a learning product manager, somebody who's a product manager over a learning experience; again, we call them technical curriculum leads.


It's ultimately their role to understand: this is my audience, and these are the needs of this audience; here are the learning objectives. They'll work with learning designers on the requirements of this type of learner for these types of learning objectives, and on choosing the right learning activities for those objectives. So it's even less about data engineering versus product management, because there will be certain learning objectives or outcomes they want to achieve that overlap. It's down at that level. If you want to learn how to conduct a retro meeting, that's going to be tough to practice while watching a video. Whereas if you've got to learn how to build a data pipeline, you can do that pretty easily in an async environment. So anyway, those are some thoughts about that.


15:25.17

Eric Stano

Yeah, and you've given some examples of the different types of learning that are more easily executed, or harder to replicate, in that kind of context. I want to drill into technical skills, broadly speaking. I began to invoke this a little already: technical skills are best learned in context. What methods do you use to integrate real-world scenarios into your curriculum? And if you have examples of areas where you consider that really successfully executed in the past, it would be great to hear them.


16:10.57

Sam Nelson

That's a great question, because if you look at any learning platform, they all talk about their real-world stuff, but it's very difficult to do in a true real-world context. Projects are huge for this. One of the core principles of learning design is backward design; you've heard the term, right? You say: here are your learning outcomes, here are your learning objectives. Now, how would one assess whether someone's achieved that? Let's create the assessment first, and that's the project. You can also do quizzes and exams and things like that, and they're helpful, but ultimately a project is a demonstration of someone's skills. And projects across sectors can look very, very different. When I work with people who build projects, my steer is always: think about what someone would actually do on the job, and just have them do that. You can actually scope your whole course by asking, what do we want them to learn? What would they do on the job?


Well, let's have them do that. Broadly speaking, that seems straightforward enough, and then as you get into it, it can be kind of hard. Some examples of where I think this has worked well for technical skills: for data analysis, or really data science in this case, building models. Often, when you learn how to build a machine learning model, you get a very, very clean data set. It's ready to go; you just throw it into Python and create a predictive model. But a real-world context is usually some messy data set that lives in 16 different places, and you've got to put it together.
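As a minimal sketch of the kind of exercise Sam describes, here is what assembling incomplete data from different places might look like before any modeling happens; the datasets, columns, and cleanup steps are all hypothetical, not from an actual Udacity or Multiverse project:

```python
import pandas as pd

# Hypothetical fragments of customer data living in different places
accounts = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "plan": ["basic", "pro", "pro"],
})
usage = pd.DataFrame({
    "customer_id": [1, 2, 4],          # incomplete overlap with accounts
    "monthly_logins": [3, None, 20],   # missing values, as in real data
})

# Step 1: put it all together (a left join keeps every known account)
features = accounts.merge(usage, on="customer_id", how="left")

# Step 2: clean up before any model sees it
features["monthly_logins"] = features["monthly_logins"].fillna(0)
features["is_pro"] = (features["plan"] == "pro").astype(int)

print(features[["customer_id", "is_pro", "monthly_logins"]])
```

The learning point is that the join and the cleanup, not the model call, are where most real analysis time goes.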


18:07.51

Eric Stano

Of course, I'm familiar.


18:10.08

Sam Nelson

Yeah, and so in the past at Udacity, we tried to do that with a couple of projects where, yes, here's data across a few different sources, and it's incomplete. First you've got to put it all together and identify the right features to use in your predictive model. One thing I think Multiverse does really well, because they built their whole business on apprenticeships, is that their projects are very different from the other ones I've worked on in the past. Rather than saying, okay, here's a scenario, maybe tied to a real company or a make-believe one, where you get to practice and demonstrate your skills, at Multiverse it's: here's what you've learned, and here's a project brief that you can use to go find a project at your company that your boss is going to be happy with you doing.


19:09.21

Eric Stano

Interesting.


19:10.51

Sam Nelson

That's going to use the skills you just learned. Now go do that, come back to us, show us what you did, and reflect on it. In that type of project, you might not have the same level of very specific rubric criteria to pass off, but it helps with learning transfer, which is one of the most important things in learning: you can learn it,


19:33.63

Eric Stano

Of course, and apply it.


19:36.60

Sam Nelson

internalize it, apply it, and transfer it to a new context, right? A lot of the time we apply it to these real-world scenarios that aren't necessarily our context, and we go back to work and think, well, how would I do that here? One of my favorite learning experiences, which is why I got into this apprenticeship space in the first place: I did my first year of grad school in the full-time residential program at Darden, UVA's business school. Then I joined Udacity during my summer internship and stayed on, so I did my second year in their executive program. What I loved about the executive program is I'd be learning something on a Friday. I took this pricing class, and we learned a specific survey technique to estimate willingness to pay; essentially, you can plot a demand curve and pick the right price. A very cool technique. So I learned it on Friday, applied it on Wednesday, and presented the results and pricing suggestions to my boss the next day.
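One common version of this idea, not necessarily the exact technique from the class Sam describes, is a Gabor-Granger-style analysis: survey respondents say yes or no at each candidate price, the share of yeses traces out a demand curve, and price times demand picks the revenue-maximizing point. The prices and acceptance rates below are made up for illustration:

```python
# Candidate prices shown to survey respondents
prices = [10, 20, 30, 40, 50]
# Fraction of respondents willing to pay each price (hypothetical data);
# this declining series is the sampled demand curve
willing = [0.90, 0.70, 0.45, 0.25, 0.10]

# Expected revenue per respondent at each price point
revenue_index = [p * w for p, w in zip(prices, willing)]

# Pick the price with the highest expected revenue
best = max(range(len(prices)), key=lambda i: revenue_index[i])
print(f"best price: {prices[best]}, revenue index: {revenue_index[best]}")
```

With these toy numbers, $20 wins even though far more people would accept $10, which is exactly the insight the plotted demand curve makes visible.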


20:41.20

Eric Stano

Oh, nicely done Udacity.


20:44.76

Sam Nelson

So it was great that I had learning I got to apply in context, and that really solidified the learning, because I had that opportunity to transfer it. So I think if you can help people say, all right, I learned this in the course, and here are the tips and tricks and ways you can go find a project at work to apply it, that's really powerful.


21:08.44

Eric Stano

Right, and the learning had a little just-in-time nature to it, too, which is great; the pace at which you were applying it was impressive. Actually, speaking of pacing: technological change is quick. And when you have a course that's all about technical skills, gaps in those skills can emerge quickly if the technology is outpacing you. How do you ensure that your curriculum is agile enough to address any skills gaps that might emerge without compromising the quality of the training you're giving?


21:52.12

Sam Nelson

So this is a challenging problem, and I don't think any company I've worked at has totally nailed it. One thing we're working on right now, and we tackled this at past companies too, is: when you're building a roadmap, how do you incorporate maintenance? Maintaining your content, especially as you expand your content portfolio, becomes a challenge. It's always easier to say, I can build this new thing and I know there'll be some revenue and growth that comes from it, but I can't really predict how maintaining this content is going to help. So which one are you going to prioritize? You're always going to prioritize the new thing. It requires some organizational discipline to make sure you're investing in keeping your content up to date. I think that's part of the question you're asking: it's like, look, I have a


22:48.90

Eric Stano

Precisely, yes.


22:50.82

Sam Nelson

Yeah, so it's like: I have a data analytics curriculum; how do I make sure that it's teaching the latest and greatest?


22:58.68

Eric Stano

So there's a nurture element to the existing portfolio that you want to make sure is funded and privileged to a degree, but the center of gravity is always going to shift toward the new thing, and you have to balance the two. Is that a fair restatement of what you're suggesting?


23:17.77

Sam Nelson

Yeah, exactly. You just have to be willing to carve out some resources to do that, especially if you can align your new curriculum to supplement your existing curriculum. For example, we have a data analytics curriculum, and now we're working on an AI for data analysts add-on to it. It's a way we can both capture new revenue and invest in our current programs; when we can make those things align, that's a good thing. But then you have to keep track of your metrics, particularly your outcomes. The way I think about metrics: you have your learning outcomes, which is how well they achieve the goals you set out for them in the course, and then career outcomes, which is what happens after they take those skills to market. We track that by what happens to our apprentices after they complete the program: do they get promoted, do they stay on with their companies, and so on. That's a lagging metric, but keeping track of those metrics can help you understand when you need to make investments in your programs.


24:24.75

Eric Stano

No, no, very persuasive. And you mentioned outcomes; this is a related question. Obviously, for most people, a personalized learning experience is going to be the most effective one, but that's hard to scale in any large organization, or with a product, platform, or course that's meant to speak to many, many people. How do you approach personalized learning, if you do, and how do you do it at scale?


25:04.37

Sam Nelson

Yeah, this is actually pretty exciting, because it's core to what we're investing in at Multiverse, especially in the US market. I think about this in two ways. First, when we work with an organization, we want to customize, or at least configure, a program to meet the goals of the company. We have a process we call diagnose, prescribe, treat. The diagnose part is: what are your goals? We ask a leader their goals and have them go through a few questions. Then we recommend a pathway of learning modules, essentially, that they would have their learners go through based on those goals. They can look at the pathway and say, oh, we want to add this, remove this. So they get their own pathway that's specific to their company goals, as opposed to, here's our generic data analysis pathway. You could take data analysis and add some AI to it, or a little data engineering, based on what you need and your tech stack: Excel versus Tableau, say. I refer to that as customization. The next level is personalization. Each individual learner has a different set of skills, and rather than having everybody go through the same thing, we have them go through what we call a skill scan. It's not just a pre-assessment in the sense of testing their skills; it also asks them how comfortable they feel with these things, whether they have experience with them. It helps craft and personalize the pathway to them. Everybody has the same set of goals, but then you basically say, well, I already know X, Y, and Z.
So the focus is just on what they need to learn. That's one level: customization for the company's goals, personalization by learner. And then, back to the project idea I mentioned, learners get some say in the type of project they do and take to work. We're also working with one of our early customers who said, we want everybody to go through this AI curriculum, but we have 14 different business units; can you do a project for each business unit? And we said, sure, let's do that. So each person has the same learning objectives, but a slightly different project for their business unit that's going to apply most to them. Again, that learning transfer is the goal.
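The skill-scan pruning Sam describes can be sketched in a few lines; the module names, the 1-to-5 comfort scale, and the skip threshold here are all hypothetical stand-ins, not Multiverse's actual implementation:

```python
# The company's configured pathway: an ordered list of modules
pathway = ["Spreadsheets", "SQL Basics", "Data Visualization", "Intro Statistics"]

def personalize(pathway, comfort, skip_at=4):
    """Keep only modules the learner isn't yet comfortable with.

    comfort maps module name -> self-reported comfort (1-5);
    unscanned modules default to 1, i.e. they stay in the pathway.
    """
    return [m for m in pathway if comfort.get(m, 1) < skip_at]

# One learner's skill-scan responses
learner = {"Spreadsheets": 5, "SQL Basics": 2, "Data Visualization": 4}
print(personalize(pathway, learner))
# → ['SQL Basics', 'Intro Statistics']
```

Every learner shares the same end goals; only the path is trimmed, which is the distinction Sam draws between company-level customization and learner-level personalization.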


27:47.77

Eric Stano

And that's a perfect jumping-off point for another question I had on the personalization front. I presume you have a global audience, and you've talked about customization. How do you go about customizing, if it's any different than for different departments or companies, for different regions of the world, particularly when there are cultural differences that might impact how the material is received or understood? Is there a play that you have there?


28:24.95

Sam Nelson

Yeah, you know, Multiverse is primarily in the UK and now entering the US market, so those audiences aren't as different as they could be if we were in more regions.


28:39.23

Eric Stano

Technically, that is global, though.


28:41.08

Sam Nelson

Yeah, it is global, though. One funny thing I learned here: originally we wrote our curriculum using British English. Customization, for example, had an S instead of a Z. People in the US can read that, but it turns out people in the US don't like it very much.


29:00.81

Eric Stano

No, and they'll cut that S out, swapping in a Z, in a variety of words.


29:10.95

Sam Nelson

Yeah. And it turns out people in the UK don't care if it's US English, I think because people around the world are used to it from all the tech companies and products they use. So we've changed everything to US English, because it's more broadly accepted. That's kind of a specific, funny example, but the broader idea is that we try to remove unnecessary cultural references, especially pop culture; we stay away from that to make sure people understand the material, especially people for whom English is not their first language. That's definitely a principle. We have this major framework: Measured, Applied, Guided, and Equitable. Part of Equitable is using language that is accessible and meaningful to everyone. We are looking at different regions of the world, including localization of language, which should be an interesting next step for us. But right now that hasn't been a big problem for us to solve, so I can't say much more on it.


30:21.36

Eric Stano

Sure, sure, understood. Switching gears a little: we talked earlier about sometimes privileging the new course or product over the existing one, the one we spoke about nurturing and updating as you go. On the one that gets privileged a bit, the new one: when you're pitching a new curriculum initiative to the various stakeholders, you've got to demonstrate an ROI to them. How do you go about communicating and measuring the return on investment for your training programs? You've already invoked a bit about tracking people's placement and careers after these training programs. Could you talk a little about how you make that pitch for a new initiative and how you reflect on the ROI?


31:23.88

Sam Nelson

Yes, absolutely. I mentioned this earlier, but I'll reiterate it because it's important. I think there are two types of outcomes. First, there are learning outcomes, which is how well they do in the learning program; then there are career outcomes, which is what impact that has on their career, or in this case, with a company, what impact it has on the business. Learning outcomes are easier to measure but slightly less important. They still really matter, though, in terms of: look, you're going to put 500 people through this program, and this is a program that 80% of people typically finish, versus one of the MOOCs, the asynchronous online courses, which may have 5% completion. If you're going to invest in something, you want it to be something people finish. So that's an important one: how many people engage with it and finish it, how they do on their assessments, those types of things. Completion rate ends up being the most important one clients want to see, because they want to know realistically what impact they can expect. The next one is: what's the impact, essentially the learning transfer? How does this affect the next step? Every company has different goals with their learning, so it's tough to say across the board, we saw this. So we use a lot of case studies: companies that talk about the impact of the program, what they saw in their apprentices, what that drove for their business. And we have some other metrics, like the percentage of apprentices that stay on with their companies, which is above 90%.
The percentage of apprentices that get promoted or get a raise within a certain number of months after the program. And just to show that, hey look these other companies, they put people through and then they really value these apprentices after they finish their programs because you know presumably they're having an impact and doing a good job. We also have and some estimate of the number of like revenue dollars that we that our programs have helped drive for companies. To be honest, I'm not exactly sure how we get to that number, no but they I know it is something that we're, and that's that, you know, when I just mentioned MAGE, Measure Applied, you know measured is that that one, where we really are trying to get into the ROI of what these programs do for people. To your point, it is pretty tough to quantify sometimes, but we we do our best to dig into it.


34:17.34

Eric Stano

Sure.


34:19.46

Sam Nelson

Because that's part of the whole "diagnose" step of diagnose, prescribe, treat: okay, we don't want to just give you a generic program or sign you up for a platform. It's, well, what are your goals? Let's start there. And then we're going to put together the best learning experience we can that will help drive your goals.


36:40.11

Eric Stano

So we've invoked this a couple of times throughout the course of the conversation, but change is always present and always increasing in its rapidity. How do you stay up to date on technology changes and innovation? Are there specific go-to sources you use that you'd recommend to our listeners for staying up to date, particularly when it comes to workforce learning?


37:16.59

Sam Nelson

Yeah, that's a good one. Honestly, one of my favorite resources for this is the Slack channels at my company.


37:28.82

Eric Stano

Invoking the conversation we had prior to recording wherein I expressed my displeasure with Slack, but go ahead.


37:35.84

Sam Nelson

Yeah, you know, people are always sharing really great articles. And I would expand that to my network on LinkedIn as well. As funny as it is to say that that's the primary source, it actually is a really good filtering mechanism, versus just having a bunch of different publications that I go to regularly. I have some people I follow on LinkedIn that I think do a good job, and I could actually pull up a couple of their names. One in particular does another edtech podcast, so I'll pitch edtech podcasts as a good source. But yeah, I honestly think the best way is to follow some individuals who post good content, because it's a natural curating mechanism for finding what's newest and what's the latest and greatest.


38:34.47

Eric Stano

Right, and as aggrieved as I am to hear that Slack is the answer to that. Nothing against Slack; it's a wonderful product, a wonderful tool, a wonderful company. I just personally get a little overwhelmed by it, but I think it's actually a very practical suggestion for folks.


38:50.90

Sam Nelson

Yeah. Well, it's just like email. Sometimes it can be a necessary evil, because you need it. It has value, but it can be noisy.


39:01.87

Eric Stano

Right. And actually, just relative to staying up to date, and again, we've touched on this a bit, how do you ultimately strike a balance between incorporating new tech in your curriculum while ensuring that the content itself remains practical and, as you've invoked, immediately applicable for your learners in their day-to-day roles? How do you go about striking that balance?


39:34.84

Sam Nelson

So I try not to chase the shiny thing just because it's the new shiny thing, right? You see a lot of that right now with all the new AI technologies: everybody has an AI tutor, but not everyone is thinking intentionally about how it would actually be integrated into the learning experience. When would they prompt someone to use that AI tutor, and what should they ask it? How should they be using it? I think every learning experience should have one, but should also think about, okay, great: instead of just writing a reflection piece, write a reflection piece, post it to your AI tutor, and ask it something. I'm going down a rabbit hole, but the idea here is thinking about learning first, right? What is the objective of your learning? And then what are the right learning activities to achieve and assess that objective? For example, a really cool tool we're using right now that is a new shiny thing is a tool called Udily. You can have a conversation with an AI, give it a prompt about the role it plays, have a full conversation, and then it will grade your conversation based on certain criteria. The example I saw in the demo was somebody giving feedback to an employee who was kind of hostile. They had to handle this angry AI and figure out what to say to calm it down, and then it gave that person feedback on how they did. Basically, you get feedback on how you provided feedback and handled a difficult situation, which I think is a huge unlock in terms of helping people develop durable soft skills in an asynchronous environment, in a scalable way.
So that's an example of when you do want to lean into the new shiny tool, because it's an unlock. Versus, if you're going to teach somebody coding, you actually probably don't want to use AI autocomplete coding at first. That's a cool tool, but it may give them a crutch that gets in the way of learning. You probably should just stick to the basics. So really, it's thinking learning and objectives first: what's going to help them with learning transfer? And then saying, all right, what is the best tool? It might be new tech, it might not.


43:04.50

Eric Stano

Right. And actually, you mentioned hostility, although it was relative to the AI's responses. But speaking of hostility, introducing new technologies and methods can sometimes be met with resistance or hostility, particularly from learners or instructors who are used to employing more traditional methods, or methods that are well-worn for them. How have you managed that kind of resistance, or even hostility? What strategies have you found effective in encouraging the adoption of new technologies, in whatever way is meaningful?


43:49.99

Sam Nelson

Yeah, so I've hired and trained dozens of different content creators, and they come from all sorts of backgrounds. Sometimes they're a subject matter expert who's never taught anything; sometimes they're a teacher who doesn't know as much about technology; and everywhere in between. Everybody brings their own perspective, so you need to meet them where they're at. There are really two main methods I've used. One is helping them understand the science behind what we're doing. For a teacher, that might be helping them understand the importance of engagement and the user behavior approaches you would typically see in a technology product, and how we integrate that with learning. They may say, well, that's not best practice for learning. But if we don't get people to really invest in the course, they're not going to learn anything either. So it's this balance of learning and engagement. And vice versa: if you're talking to somebody who is teaching people how to code, you need to talk to them about the expert blind spot and say, I know you feel like everybody should already know this, but they don't, and we need to think about scaffolding. So it's kind of a teaching thing, helping them understand the principles of learning science. The second part is helping them understand that they're not just solving needs for a learner, they're solving needs for a business, and that when you're building a learning experience, we can't provide it unless we provide it in a way that people will pay for. And so that's things like not making the program too long. We need it to be a little bit shorter, because otherwise people won't buy it.
So how do we cut out any unnecessary content, valuable but not crucial, to make this thing shorter? That's less about new learning technologies and more about helping them understand and be on board with our approach.


46:25.25

Eric Stano

I think you just led us to a pretty good place, which is really your parting thoughts. Sam, I want to thank you for joining today and reflecting on all of your experiences and best practices, and how you navigate an ever-changing landscape. Are there parting thoughts you'd want to leave with our listeners on the future of learning experiences within edtech, and your sector of it in particular?


47:13.29

Sam Nelson

Sure, yeah. The elephant in the room is that AI is going to continue to have a massive impact, and we don't know exactly where that's going to go, or how deeply and how much it will change everything. But I think it's going to continue to unlock new opportunities for learning. So my advice is similar to what I mentioned earlier: think about the learning first and the goals first, and how AI enables that, versus "we have a cool new AI tech, we have a solution, let's go find a problem to solve." Stick with problems first; that's big. And that goes for industry too. When you're working with customers or individuals, think about the problems they have. I like the jobs-to-be-done framework, right? What job are they hiring you to do? Do that really well. Part of that with industry is really understanding the needs of the industry and staying close to that. I think it's important to think of yourself as a partner with your customers, and to really understand their needs when you bring solutions to them. So, it's been a pleasure to be on the podcast. I really appreciate the invite.


48:33.23

Eric Stano

No, absolutely. So thank you again, Sam Nelson, Director of Technical Curriculum at Multiverse. I appreciate your time and your insights, and there were many. This has been Tech in EdTech. Thank you for listening.