Tech in EdTech

The Secret to Building an AI-Ready Workforce at Scale

Magic EdTech Episode 75

What does it take to shift from degrees to skills-based hiring while tackling the AI skills gap? Lydia Logan, VP of Global Education and Workforce Development at IBM, joins host Laura Hakala to discuss how IBM’s SkillsBuild program and its culture of lifelong learning are helping close global skills gaps. She shares what it takes to build responsible AI guardrails, maintain meaningful digital credentials, and design learning that actually drives employability. A sharp and practical conversation for anyone shaping university partnerships or workforce development programs.



00:09.29

Laura Hakala

Welcome to Tech in EdTech, the podcast where we cut through the hype and get real about the future of learning and work. I'm your host, Laura Hakala. We're going to be diving into some really interesting topics today. We'll talk about the sometimes messy transition from degree-first thinking to skills-based hiring. We'll also touch on some of the real pain points in AI readiness and why you can't afford to ignore these shifts. So if you're a university leader, a learning and development professional, or really anyone who's trying to future-proof your talent pipelines, you're in the right place. Our guest is Lydia Logan, Vice President of Global Education and Workforce Development at IBM. IBM isn't just a company that talks about skills gaps; they're actively trying to narrow them. And Lydia's team is on the front line of what works and what doesn't when it comes to preparing people for jobs that didn't even exist three years ago. Lydia, welcome.


01:00.91

Lydia Logan

Thank you. Thank you for having me.


01:02.88

Laura Hakala

Of course, we're happy to have you here. So could you share a little bit about your background and what led you to focus on education and workforce development at IBM?


01:11.82

Lydia Logan

Sure. I've been working in education, edtech, education policy, and workforce development for the last 25 years or so in different roles, whether that's association leadership, working in private philanthropy, running corporate social responsibility programs, et cetera. So I've been working on these issues for a long time. And economic development is very, very closely tied to talent development, so I've always been interested in how those two things come together. I'm fortunate enough to be in this role at IBM, where I lead our global education and workforce development programs, sitting in our corporate social responsibility function. Everywhere IBM has an ecosystem of communities and clients, we have programs offering free skilling. So it's a great place to be, and it's really exciting work.


02:06.67

Laura Hakala

It sounds like it. I can't wait to talk more about that as we get into it. And thank you for sharing those details about your background and for joining me today. So let's begin by talking about some of the pivot points that can bring the momentum needed for a large organization such as IBM to invest so heavily in a free upskilling program.


02:27.70

Lydia Logan

Sure.


02:27.89

Laura Hakala

For example, IBM launched SkillsBuild, which provides free learning and support programs for high school, college, and even adult learners. So what was the market signal or inflection point that really brought IBM to launch such a large-scale, open-access skilling initiative?


02:48.05

Lydia Logan

Well, IBM has really been engaged in skilling for decades. I'm not sure if people are aware that, going back to the 1940s, IBM created the first computer science curriculum with Columbia University. Investing in learning and investing in communities has been a corporate value for quite some time. The impetus for SkillsBuild was that there has always been a gap between learning institutions and industry, and we really wanted a program that helped close that gap and brought the industry perspective, but also made learning accessible, affordable, and high quality, not only in the U.S. but around the world. And through the SkillsBuild program, we're able to do that. We offer technology for free. We offer mentors. We offer content that covers technical training but also workplace skills, because we know that technical skills may be the ones that get you a job, but it's really those workplace skills, the durable skills, that will help you keep it. So we offer all of that through SkillsBuild, and we really think this is part of IBM's role: to invest in the communities where we do business. It also helps to invest in that longer pipeline, right? It's not just about coming to work at IBM, but about these communities, our clients, and the economies where we live, work, and play.


04:25.48

Laura Hakala

That's really interesting. What I think I'm hearing, and what would be very useful for our listeners to remember, is that a strategic initiative this large, involving such a big investment, really has to be aligned with your values at a corporate level if it's going to be successful. It sounds like it's embedded within the very fabric of what IBM is and what the company values.


04:49.55

Lydia Logan

Absolutely.


04:51.26

Laura Hakala

So let's talk about one skills gap in particular, which is the AI skills gap.


04:56.42

Lydia Logan

Sure.


04:57.26

Laura Hakala

Where do you see the AI skills gap hitting the hardest today? Is it at the entry level, or for those who are mid-career and looking to upskill? Or is it higher up, in leadership roles? Are you seeing any trends or patterns like that?


05:13.36

Lydia Logan

Absolutely. And I would say it doesn't hit hardest at any one of those three; it hits differently in each of them. At the entry level, I've been thinking about it this way: it's almost like starting in the middle. The entry-level job is leveling up. What you would have needed to know three years ago has changed. I have friends whose kids graduated from college when generative AI was not an expected skill. Using it in an applied way in your first job was not really expected, and now it really is. How can you use it to become more efficient and more productive, to do more faster? It also frees up time to spend with your teammates. So if you're looking at that entry-level role, it's: do you know how to effectively use generative AI in the workplace with effective prompts? It may even be building agents or creating apps or efficiencies, and then regaining that time to spend on the things where humans are uniquely valuable: creative thinking, collaboration with teammates, and strategy. That's really where things are starting to trend at the entry level. At the mid-level, you see more trends around efficiencies at the team level. How are you deploying generative AI technologies to get higher performance and different kinds of projects, maybe distributing work differently among your team members and deciding where you put decision points? And then at the leadership level, our Institute for Business Value interviewed CEOs and found they know that their workforce needs to be upskilled over the next three years, largely due to AI's impact on jobs. So there's culture change that needs to happen. How do you bring people along? How do you look at large-scale upskilling initiatives? At IBM, we had a watsonx challenge. We were all expected to use our technology. We took a few courses, each about an hour, then created projects using our own technology, our own AI, and thought about how to use it in an everyday application to make our work more effective, better. It might be doing research, creating drafts or outlines, or setting up projects. So it isn't about removing people; it's really about augmenting the roles that people have.


07:58.57

Laura Hakala

That makes sense. And because it hits differently at all of those levels, it sounds like a customized or tailored approach is needed, based on what the individual needs at that stage of their career, where they find themselves.


08:13.56

Lydia Logan

Exactly.


08:15.48

Laura Hakala

Okay, so according to a recent survey of CEOs conducted by IBM's own Institute for Business Value, the CEOs who were surveyed said that roughly a third of the workforce would need reskilling by 2027. And that ties in pretty heavily with what you were just explaining about the AI skills gap. So with that in mind, what timelines should universities and employers realistically plan for when attempting to close AI skills gaps?


08:44.82

Lydia Logan

IBM is a learning company. We are required as IBMers to do a minimum of 40 hours of learning a year, and that's a floor, not a ceiling. The reason is that we believe you can always, always upskill yourself, learn more, do better, think differently, collaborate. Lifelong learning is one of our core corporate values. So I'd say there is no timeline for a start or a finish. It's a now and always. Get started now. Think about the ways you can apply new learning, particularly the application of generative AI in everyday use. And think about the teams where it can be most effective, because people will always ask, what's in it for me? Not just "I'm doing this because I've been told to do it," but how will this make my work better, easier, more valuable, more impactful?


09:43.05

Laura Hakala

I love hearing that. I love that mindset of approaching lifelong learning by embedding it right there within employee requirements. That's such a smart approach. So, as schools and companies start using generative AI more to help close skills gaps, what kinds of guardrails or oversight do you think should be in place before rolling out any AI-driven learning or assessment tools?


10:11.38

Lydia Logan

Absolutely, it's AI ethics, and that's another core value for IBM. We really believe in responsible, ethical, and transparent AI that's trustworthy. I was recently in Geneva at the AI for Good conference, and I've joined a council with the International Telecommunication Union. These are representatives from around the world who are looking not only at using AI to solve social challenges, but at how we can make sure AI is developed responsibly and used ethically. So whether you're a small local nonprofit, a university, a system, whether that's a public workforce system, or an enterprise, start by understanding how you're going to use it, what kinds of AI tools you're going to use, and how they've been developed and trained. Understanding the ethical development and application of those tools is key; start there.


11:12.95

Laura Hakala

Hmm…that makes sense. So let's shift now to talk more specifically about skills-based hiring. So for nearly a decade now, IBM has been a leader in helping the market shift to a skills-based hiring approach. I know that years ago, they made a public commitment to move away from degree requirements in their hiring processes. So not only is IBM providing access to credentials with programs like SkillsBuild, but they're also following through in their hiring practices, which is just so great to see. And while there are tons of benefits to a skills-based hiring approach, it can also be really difficult to implement in practice. And many education providers, as well as corporations and employers, are really struggling with how to map micro-credentials and other non-degree credentials into their hiring and promotion rubrics. So what evidence have you seen that a skills-first approach can widen the talent pools without diluting quality?


12:13.97

Lydia Logan

Well, at IBM, what we have seen is that by removing the degree requirement for many of our job roles, we have a broader and much more inclusive applicant pool. That has meant different kinds of candidates: people who may have different work experience, people who may have been in the workforce and stepped out for a variety of reasons, or who otherwise wouldn't have applied but certainly have the willingness, the ability, and in many cases the applied experience. So we've definitely seen a change not only in the applicant pool but in who we've hired. It absolutely can work. Not every company hires only through its HR department; at IBM, we're large enough that many hiring managers are not sitting in the HR function, they're scattered across the company. So we take training on how to think about who the best candidate is and what skills are required. And then for the credentials that we issue, we make sure the skills data is embedded in them, so any employer can look underneath that credential and see exactly what it stands for. That's through our partnership with Credly and the way those credentials are designed. We want to make it as easy as possible for both the learner and the potential employer to understand what that person knows and is able to do.


13:49.08

Laura Hakala

So that ties in nicely with what I want to ask next. Many education leaders tend to fear what I've heard referred to as credential clutter: the accumulation of old, unused, or unclear digital credentials where it's hard to tell what the value is or exactly what was learned. So how can learners avoid chasing unnecessary credentials and ensure that each badge they earn carries real weight with hiring managers? It sounds like you've got good policies and practices in place that really convey the meaning of your credentials.


14:30.15

Lydia Logan

Yeah, we are very careful about when we issue credentials and what kinds of learning we tie them to. We really have some sort of demonstrated learning tied to almost all of them. We take our brand reputation very seriously, as I'm sure you can imagine, and so we don't believe in issuing credentials that don't matter. We look very closely at job data and skills. We have a whole council here behind our credentialing that makes sure we maintain high quality in what we issue with the IBM brand on it. Unfortunately, there are lots of people out in the marketplace, and there is some clutter. We know that one of the things we have to focus on is maintaining quality, and for us, it's also a brand reputation issue. So people can be sure that if they have a credential with the IBM brand on it, it's high quality. And when all else fails, you can fall back on us. But really, you want to know what's underneath that credential. Is it aligned to the role that you're looking for? What's important to note is that a few years ago, people were getting credentials to try to get a new job. Now so much is shifting in the marketplace that people are getting credentials just to stay qualified for the job they have and to stay in it. We're seeing that shift: a lot of people are learning because they need to upskill themselves. AI may not take your job, but somebody who knows how to use it will, and they don't want that to be them.


16:14.40

Laura Hakala

That's a really interesting point. We tend to think about upskilling in terms of upward mobility, employability, and future focus, but it's really important to remember that all of our jobs are changing and adapting so rapidly that, you're right, it might just be a matter of keeping pace with your current responsibilities.


16:35.97

Lydia Logan

Absolutely.


16:37.63

Laura Hakala

So that pivots us really nicely into what I'd like to look at next, which is efficacy in digital credentials and good learning design principles. Because, as you were already saying, a credential is going to be pretty meaningless unless the learning is well designed and impactful, and you can trust the integrity of whoever issued it. Now, in SkillsBuild, you offer courses of varying lengths, from shorter, easily digestible modules to lengthier learning pathways. What data helped you settle on the durations used in your learning programs?


17:15.63

Lydia Logan

So we looked at a variety of data. We looked at what the standard is in the market. We looked at what's required for people at different stages in their career and their learning journey. We looked at feedback from our learners: what are they interested in, and how much time do they have? And we've designed different learning experiences based on those things. Some people just want a bite-sized piece, something they can do during their lunch hour. Others really want something much more robust. So we have what we call guided learning experiences offered through SkillsBuild; those might be a day, four weeks, or ten weeks. We also have a live learning series, and we have an opportunity for students to do hands-on learning and build AI agents using IBM technology. So we offer a variety of kinds of learning at varied lengths, depending on what a learner is interested in achieving.


18:20.56

Laura Hakala

Interesting. So beyond the length of the courses, and you already talked about how you have a council for credential validation, what validation steps, whether third-party or internal metrics, can help keep a digital badge from becoming just another PDF certificate? Or, another way of asking that: how can we ensure that a credential carries the most value for the learner?


18:50.94

Lydia Logan

Yeah. So I think there are a few things. One is, can they demonstrate some applied learning along with that credential? Not only did they do the learning, but did they do something with it? What they did with it may be part of earning the credential, or it may be something they did after earning it to show that they learned it and know how to do it. So that's one thing. At IBM, we look at market data to keep our credentials fresh, and we also use our internal IBM experts. We sometimes go to our clients and ask them what they're looking for. So we have a variety of ways of validating how we construct our credentials. But I think from a learner perspective, you want to earn it, which is the learning part, but you also want to do something with it, and you need to be able to demonstrate and talk about what that is.


19:46.45

Laura Hakala

Yeah, and application, as we all know, is the best way to drive long-term retention of the learning. So if they're applying it right away, they're more likely to see that credential as valuable because they're using it, and they're going to remember the learning long-term because of that immediate application.


20:04.90

Lydia Logan

That's right. For our longer courses, we have projects that our learners are able to do, which help them create a portfolio they can then share with a potential employer.


20:17.29

Laura Hakala

That's great. So let's talk about university partnerships. How would you describe your current partnership strategy, whether that's internationally or in the US? And where are you seeing the most successes?


20:32.38

Lydia Logan

Sure. Everyone knows that right now, universities are under the gun. AI is really upending the way people look at degrees, particularly undergraduate degrees. The pace of change is mind-boggling, and they're trying to keep up. So alongside everything we offer through SkillsBuild and across IBM, we're asking: how are professors thinking differently about teaching and learning? We offer courses specifically for professors across disciplines, in technical fields like data science or computer science, but also in areas like marketing. How is AI changing the marketing industry, how you use data analytics and market segmentation to create campaigns, image generation, all kinds of things? And how would you teach basic cloud computing or AI fundamentals in a class? So you can be a professor and train with us, and then use that for your own efficacy or to reteach that content in your class, in whole or in part. That's one way we work with universities, with professors. The other thing is that we go directly to students. The opportunity I talked about that's open now, where students can use our technology, build a capstone project over a period of time, and interact with IBM experts and mentors, is another way we work with universities. And then we also work with university leadership, helping them think through how they might start their journey of institutional transformation. We bring in our experts on AI ethics. We talk about policies. We talk about how they might use the technology to find efficiencies. And we talk to professors about their hopes and concerns for using technology in a learning journey. We also have some relationships at IBM on the business side where students at a university informed the creation of a campus navigator, a new app. So there are lots of ways that IBM works with universities, and through SkillsBuild, we've got opportunities for students and professors, and we also work with administrators.


23:10.59

Laura Hakala

Okay, so narrowing in on the professors for a moment, there's often some hesitancy when it comes to relying on AI tools. What has proven most effective in moving faculty past those hesitations and getting them actively teaching with AI tools?


23:31.51

Lydia Logan

So, I think this also gets back to culture change. There are always some professors who are early adopters, and we often hear from them first. They're the ones who want to try new things, understand the technology, and think about how they might start to integrate it in their classrooms. They know that their students are using it, and they don't want to be behind the students. So they're thinking it through: if students are going to use it to do their research, how do you construct discourse in your classroom to make sure students can demonstrate what they actually learned? For some, that means allowing students to use generative AI for research or for creating outlines and things like that, but students still have to do some sort of large-scale project that they couldn't have done without the learning, or professors go back to blue books. There are a variety of ways they're flipping the narrative. They're saying, okay, we get it, you're going to use it whether I tell you you can or not. Use it. Make sure you're thinking critically about the information you get if you're using it to do research. You may use it to create your source list, but then you have to actually go and read all of those materials. You may use it to summarize, but if you've read the material, you understand whether those summaries are accurate and how to tweak them. So it's thinking through what the learning process looks like and making sure you're not throwing out the things that are most important: doing a project start to finish and understanding how it's constructed, so that when you get into the workplace, project management is a skill you have; understanding how you make revisions and improvements; and making sure that you, as the human, are always the last layer of the decision-making process. You're not just taking what you get and assuming it's all accurate, because even though generative AI tools can be really good, they're not perfect. So we want to make sure that students know that and professors know that, and that it's okay to integrate it in the classroom, and that we're here to help them look at different models for doing that.


25:58.10

Laura Hakala

Yeah, that's something I think about all the time, both personally and professionally: the fact that these learners are graduating into a world where AI is just a part of things. It's almost irresponsible to ignore it in the classroom, because a lot of times the better strategy is to teach them how to interact with it responsibly, and how ultimately they are still accountable for the words and ideas they put forth into the world, whether or not they had AI to support them.


26:31.47

Lydia Logan

Absolutely. You know, years ago we were talking about whether or not we were going to allow students to use technology in the classroom, how that was upending the way teaching was happening, and whether it was an augmentation for a teacher or a replacement. And here we are, about 15 years later, having a different version of the same conversation. Of course, where we landed was: sure, you can use a graphing calculator, but you still need to understand what the graph means and what the numbers are saying. You can use a computer to do a search, but you still need to understand which resources are good and appropriate for the project you're doing. So here we are again. Same conversation, slightly different context. How are you going to use it? How are you going to make sure you're using it responsibly and ethically? And how are you going to do it in a way that augments your learning and prepares you for the next step in your journey?


27:29.12

Laura Hakala

And the cycle continues. What will we be talking about next? Well, we are coming to the end of our time together. So to close out, we have a fun lightning round of fast questions. For these, just share the first thing that pops into your mind.


27:46.62

Lydia Logan

All right, ready.


27:46.76

Laura Hakala

All right. So you ready?


27:49.20

Lydia Logan

Here we go.


27:49.97

Laura Hakala

OK, so what is one metric that instantly tells you that a skilling partnership is on track?


27:57.02

Lydia Logan

What tells us it's on track is usually that they're telling us what their learners need, and we do a kind of matching process to see if what we have is what they need. Then we see a lot of learners logging in and moving through the learning pathways, we're getting feedback from the partner organization, whether that's a university, a nonprofit, or an institution, we're seeing completion rates, and we're seeing progress happen. That's where it gets really exciting. I know you asked me for one metric, but that's where the magic happens: when we see learners really taking hold of the learning and running with it.


28:41.12

Laura Hakala

No, please don't abbreviate these answers. This is exactly what I want to hear. So, what's the most under-hyped AI role that universities should be preparing their students for?


28:54.00

Lydia Logan

The most under-hyped role... I think it's less about a role than about really understanding what AI is, what it isn't, and how it's developed and constructed. The shift that's happening now, which universities really need to be aware of, is this: for a long time, engineering schools have taught students how to build the next generation of technology, while in the other disciplines you may or may not be using technology, and these days you may or may not be using generative AI to support your student work. But now, with vibe coding, anyone can be a builder. Users are becoming builders. That's the big shift. If you're not teaching students in disciplines across the university how to be not just a user but also a builder, it's a miss. Students are going to get into the workplace and be expected to use generative AI to create, maybe to create an agent, maybe to create an app that helps them do things more efficiently. It's going to go beyond prompting very quickly. We're not even talking about what's coming, which is hybrid teams of people and AI agents, where agents are doing work and managers are responsible for project management and for determining what is human work and what is work done by agents. So I would say universities need to, A, understand that's coming, and B, make sure everyone gets some AI fundamentals. And by the way, you can get that on SkillsBuild. Make sure that students are starting to just play around, and I say play around because that's less threatening. It needs to be accessible and affordable so that they can start by doing something that matters to them, and then put it into a context where the stakes are higher. Because if you're not facile with the technology by the time you get into the workplace, you're already behind.


31:42.27

Laura Hakala

And what emerging tech are you cautiously optimistic about?


31:47.40

Lydia Logan

Quantum is coming. There are things happening now where it's "harvest now, decrypt later." You hear about these hacks, and people may not understand what's really happening, but there are bad actors out there who are data mining, harvesting data that's encrypted and that they can't access right now, and waiting for the time when they can. When quantum breakthroughs happen, they'll be able to unlock that data. IBM is working diligently on quantum advances. We've got quantum computers, teams of researchers around the world, and collaborations to make sure data stays safe. Our systems protect most of the banks around the world, and a lot of people don't know that about IBM and our Z systems. But there is fascinating work happening with quantum, and we'll be able to solve problems that have been intractable, problems that would have taken hundreds of years to solve and should be solvable in minutes when quantum breakthroughs happen. We've got teams in research working on that now.


32:58.94

Laura Hakala

Wow, that was excellent, Lydia. Thank you for all of those ideas. Now, let's close out by offering some advice to our listeners. So let's say a university dean or an L&D director decides today that they want to shift toward a skills-first AI-ready strategy. What would be the first high-impact pilot that you'd recommend they launch within the next six months?


33:27.67

Lydia Logan

So I think the first pilot is to look at where they can find some efficiencies in their systems and start small, start with teams. IBM is a client zero company. We started transformation through HR: the processes that were standardized across countries, or across the world, and were happening all the time were automated, and that freed up our HR teams to work on what we call complex cases. That's where our HR teams spend their time now. Things that could be automated are, and we have an Ask HR AI agent that I can interact with in plain language. I can ask it to do all kinds of things: it can tell me about my benefits, and if I'm bringing someone new onto my team, it can help me assign them to a workplace and handle things that used to require lots of forms and people, which has saved a tremendous amount of time. It's also meant, like I said, that if I have somebody who needs to go on family and medical leave and has some sort of complex issue, I talk to a person about what it takes to make that happen. So again, we're saving humans for the most important role they have, critical thinking, exceptions, and applications, and we automate what isn't. The first step is to think about where you can automate, where you could save a lot of time, and where you can have humans doing the most important work that's uniquely theirs, and get started there. Start small, be specific, think about automation, think about ethics. We run workshops called AI for Impact, and we invite nonprofits to those. We're also running them for universities, so we're always happy to invite them in to participate in those workshops with us.


35:35.86

Laura Hakala

Those are really excellent strategies, and those are really good starting points to work from. And finally, where can our listeners go to learn more or to take action?


35:48.15

Lydia Logan

They can go to skillsbuild.org. We have lots of opportunities for them there. And like I said, if they're interested in learning more about AI, experimenting with IBM technology, and building agents, they can register on the SkillsBuild page.


36:11.17

Laura Hakala

Well, that is fantastic. Thank you so much for those insights, Lydia, and for your time in chatting with me today. I really enjoyed our conversation.


36:19.71

Lydia Logan

Thank you so much. It was great being with you.


36:22.61

Laura Hakala

That wraps up today's episode of Tech in EdTech. For more conversations that aren't afraid to ask hard questions, hit subscribe and check out our past episodes. And if you've got feedback or just want to challenge something you heard today, feel free to reach out. Until next time, thanks for listening to the Tech in EdTech podcast.