
Tech in EdTech
Tech In EdTech improves the dialogue between education leaders and the innovators shaping edtech. This is your go-to show for actionable ideas and solutions that make digital learning not just possible, but effective, practical, and inclusive.
Slowing Down to Move Forward: AI Done Right in Schools
Nathan Holbert, Associate Professor at Teachers College, Columbia University, talks about the thoughtful integration of AI and other technologies into education. The conversation goes beyond AI’s “wow” factor to discuss how teachers, learners, and the broader school culture interact with new tools. The main thread is a call for intentionality and theoretical grounding when adopting AI. Rather than treating AI as a quick fix or a “faster” solution, Holbert emphasizes that technology should support the social, cultural, and relational aspects of learning rather than overshadow them.
00:03.88
Katie Warmington
Hi, everyone. This is Tech in EdTech. In this podcast, we discuss technology that powers education and improves learning for all. Welcome to today's episode where we are going to talk a little bit about AI in EdTech. I'm your host, Katie Warmington, Assistant Vice President of Cloud Services at Magic EdTech. And our guest today is Nathan Holbert, Associate Professor of Communication, Media, and Learning Technology Design at Teachers College at Columbia University. Welcome, Nathan.
00:30.73
Nathan Holbert
Hey, thanks for having me.
00:33.07
Katie Warmington
Of course, before we dive into some of your work that you're doing in this field, I'd love to get a little bit more of an understanding about your background and how you got into education and then the work you're doing today.
00:45.34
Nathan Holbert
Sure. Yeah. So, many, many years ago, I was a high school chemistry teacher. So I kind of got started in this field, in the classroom. I began my teaching right around the time of the No Child Left Behind Act. And if you're from the US and of a certain age, that may mean something to you. But the basic situation was that schools really shifted at that time to attending closely to assessments. Schools were taking all these standardized assessments; kids were being assessed like crazy. And so suddenly, you know, the excitement I had about the kinds of things we could do in the classroom and the explorations we could go on in chemistry came up against this tension around needing to teach to the test. And so I became a little disillusioned with what I was doing, or could do, in the classroom. At the same time, I started really noticing that my students were having all these really rich, really interesting conversations about learning and problem solving and exploration of things they were doing outside the classroom. So, things they were doing in video games, or things they were doing with their friends after school.
01:54.27
Nathan Holbert
And I guess at that point it kind of dawned on me that learning wasn't a thing that had to happen just in a school, and that the kinds of cool inquiry and problem-solving investigations that I wanted to support and enable in my classroom could happen in lots of other places too. And so that led me on a long journey of too much schooling, where I eventually got my PhD in the Learning Sciences at Northwestern University. And it was there that I started really focusing in on play, and on games, and on thinking about how we could create learning environments that really enable learners to build connections between things they're passionate about, or to encounter really powerful ideas and practices of things like science or the humanities. So that's how I got my start. I'm now an Associate Professor at Teachers College at Columbia University, like you said, and I've been doing this for over a decade now. One of the fun roles that I play here is that I also direct a place called the Snow Day Learning Lab, where I work with my students to think about how we can build new learning technologies, new tools, new toys that can enable young people to build stuff that they're excited about or passionate about, or to take on challenges that might matter to them or to their communities. So, you know, we do all this fun work. We build all these cool technologies. We spend time working with teachers and working with students. And all of those experiences kind of collect together, I think, to form what I now see as my real vision. And that is thinking about education not just as a way of communicating facts, or of checking off a list of practices that you're supposed to encounter because of a curriculum of some sort, but to actually think about education as a way of creating the conditions where young people can really expand their capabilities, or really expand what's possible in their lives.
And so, you know, we think about knowledge, we think about helping them learn new information, but we're also, hopefully as educators, really engaged with helping young people gain a better sense of who they are and what matters to them and what they can do to change the world around them. So my work now is really focused on that: building those technologies, building those environments, working with school districts to support young people as they explore and create things that matter to them.
04:12.98
Katie Warmington
That's great. You're doing a lot of good work. It sounds like you're helping really make learning fun. That's excellent.
04:18.25
Nathan Holbert
Yeah, well, play is a big part of what we do. We spend a lot of time thinking about learning as a playful space. We spend a lot of time thinking about what it means to be creative when you play. I actually have a separate podcast besides this one where I talk with my co-host Haeny Yoon about play and pop culture and how that can contribute to how people think and learn. So yeah, play is an important part about how I do my work.
04:41.60
Katie Warmington
Excellent. I want to dive a little bit into you know the hot topic that everyone is talking about, of course, artificial intelligence in educational spaces. And I understand you've expressed concern that AI is often adopted too quickly and without sufficient theoretical grounding. Can you share some examples where you've seen this happen and what are some of the risks that you've seen?
05:07.09
Nathan Holbert
Yeah, I mean, I think there is a lot of enthusiasm, a lot of excitement around AI. And I think that has led to a lot of people diving in really deep and starting to implement it in all sorts of different ways and forms without really considering what their goals are, or considering the ways in which AI's design might support or conflict with those goals.
05:28.68
Nathan Holbert
So I think really a big issue here is a lack of a coherent theory of cognition with regard to AI in education. Let me dig into that a little bit more, though. Most current examples of AI being deployed in educational settings right now happen in one of two buckets, two different kinds of forms. One is a chatbot tutor, where you have a kid or a learner interacting with a chatbot of some sort. And the other is something like a teacher's assistant, a sort of tool that a teacher might use to build their classroom or curriculum or discussions, whatever it may be. So we'll take each of those on individually. The first, this chatbot, is kind of an interesting thing, right? I mean, there's this vision of, what if we had a machine that could answer any question you had, and could do it in a way that appealed to you as the learner, or that adapted to the questions you're asking? And that feels nice. That sounds really appealing on its surface. But what that is, at its heart, I like to call it just a kind of question-and-answer machine. Right, it's a machine for asking questions and getting answers.
06:38.11
Katie Warmington
Mm hmm.
06:40.96
Nathan Holbert
That's fine. Like, you know, that's essentially what our search engines have been for a number of years. But that's not really what education is. Education isn't just a series of questions and answers.
06:53.94
Nathan Holbert
Right? It's, as we used to say, not transmitted. Knowledge isn't transmitted. It's not a situation where your teacher has all the answers and they tell you all the information you need, and if you're a student, you copy it down and you now have acquired that knowledge, and somehow or another, you know, education has happened. That's a really old and really incorrect understanding of what education actually looks like. And so this kind of question-and-answer machine really harkens back to that old, incorrect model of what education is. Learning is a highly social act. It involves interactions and relationships with people. It's also embedded in the culture and the context that we find ourselves in.
07:40.91
Nathan Holbert
It's not the kind of thing that you can just sort of disembody. You can't take information and peel off all the context and peel off all the politics and peel off all of the kind of current events or the histories or the experiences that people are having around it and then just kind of magically deliver it to somebody.
07:59.61
Nathan Holbert
And this gets back to this other thing I like to pick at with AI, and that's this word efficiency, all right? People often talk about AI as improving the efficiency of a classroom, or about learning technologies, even non-AI computational technologies, as increasing the efficiency of teaching and learning. But this word efficiency isn't just speed, right? We often take efficiency and think it means fast.
08:26.68
Nathan Holbert
But it's two things. It's fast, and also accomplishing something. And so if we're building systems where efficiency is primarily about delivering information fast, but we're not considering how understanding occurs, and whether, if you get this kind of fire hose of information, you can actually understand any of it, we're kind of missing the point. And we're in fact building highly inefficient systems as opposed to efficient ones.
08:51.65
Nathan Holbert
So that's one. And the other thing I'll say here is, I mentioned the other form of this is often a teacher's assistant, right? You can find many, many, many tools online that you can download that suggest the AI will give teachers lesson plans or give teachers prompts for discussions. Oftentimes they'll promise to help you do your grading, you know, giving students feedback on their assignments, all these kinds of things. And, you know, on the one hand, that feels kind of nice. Teaching is a really hard job. It's a thankless job. And so the idea of having some assistance, I think, really is appealing, and it makes a lot of sense in some ways.
09:36.35
Nathan Holbert
But again, teaching isn't just delivering information, right? It's not just assigning scores on things. Teaching is really about helping learners to build relationships with ideas, with practices, with domains of knowledge. And that's a difficult but really important job. Good educators have a whole lot of techniques. They have a whole lot of knowledge. They have this kind of art of practice for how to support that knowledge building and that relationship building. A good teacher knows about their students. They know who their students are and what they're struggling with, what they're excited by, what, you know, possibly even some of their home situations are. A good teacher knows about context. They know about the ways in which certain information or certain facts or certain practices that you're supposed to be learning in the classroom actually intersect with other kinds of ideas and practices and things out in the world. And that's a thing that the machines we use for these types of tools are often missing.
10:38.05
Nathan Holbert
They're often trying to deliver this sort of disembodied information devoid of context. And I think that, that's something we really should be concerned about and why I think I'm very cautious about the form that AI is currently taking in educational practice.
10:54.61
Katie Warmington
And in your view, how do educational companies, whether they're startups or large, very well-established corporations, sometimes misapply AI in the classroom and in classroom materials? And what would you recommend that they do differently to align with proper learning theories?
11:15.16
Nathan Holbert
Yeah, well, you know, so, right. The first part is having some sense of the different theories of learning that are out there. And I should note, there's not one learning theory that's correct while all the others are wrong. We build theories, we build models, to try to understand this thing that's really complex and messy, and so we build multiple theories of learning to try to capture the different features that we know exist and are present in learning situations. So number one, it is hard, but it's important for people building learning technologies to know some of those theories of learning and then to incorporate them into their design.
11:54.71
Nathan Holbert
But to your question about why does this happen so often? Why do companies and startups often kind of misapply AI? Education is a really funny thing. It's probably one of the only things that almost everyone has had an experience with and almost everyone has an opinion about.
12:13.68
Nathan Holbert
We've all been students, we've all gone through years of schooling, right? And so we kind of have this sense of, oh yeah, I know what education is. I know what it was for me, and therefore I know kind of what it is. So that's one: people have this sense that they already know what education is and what good education looks like. Another misstep that often happens, I think, is that there are these really persistent learning myths that we like to grab onto and have a difficult time letting go of. These are things like learning styles or multiple intelligences. These kinds of myths drive design decisions, and they're also really appealing. Everyone loves a good, you know, behavior test or some psychology test; they'd love to take a test and then figure out, ah, yes, this is the category that I fit into. And so, like learning styles, it really feels good to say, I'm this kind of learner. But there's no empirical foundation for these things. These are myths that aren't accurate or true to the way in which people actually learn. And so it's difficult for us to sift through to the true, empirically grounded evidence about how learning unfolds and how to build good learning environments.
13:28.58
Nathan Holbert
And then lastly, I'd say I think another piece of this is ego. And I say this as somebody who has plenty of ego. So, you know, I can throw the stones because I'm one of them. But I think that most of us who end up at certain places in our lives, we look at ourselves and say, well, I'm successful. This path that I went on led me to this place, and I suspect others should also go on the same path that I went on. And that's, again, a very egotistical way of looking at the world. But it's common. And I think it's also kind of extra pernicious in the tech industry, right? It draws those of us who have big beliefs about ourselves and big beliefs about what's possible. But it leads to what I might call a solution-oriented mindset, where we all start to see everything as a problem that has a simple solution, often a software solution, that can address it. So all these things together lead us to the situation where we think we know what education is, we think we know which education was bad and which education was good, and we have these loose myths that we can draw on, that we can identify for ourselves as if they're actual theories or actual evidence, right? And then we just kind of go for it.
14:50.84
Nathan Holbert
And AI, you know, the second half of this question is, why does AI keep getting misapplied? I think it's because AI has this really fun, kind of magical quality to it that also looks exactly like we always imagined the education of the future was going to look, right? If we go back to all our sci-fi stories, or television, or the movies, schools are supposed to be like the Jetsons. Like, I'm supposed to be educated by a robot in the future.
15:22.30
Nathan Holbert
That's what it's going to be. Or you know we watch The Matrix and we watch information being downloaded to the brain. And that sounds like, oh, that's so cool. That's going to be the future. Or the one that always really gets me is there's a lot of like tech CEOs that always cite the book Ender's Game.
15:38.04
Nathan Holbert
You know, which has kids learning and doing this intense computation and problem solving with tablets. What they forget to mention is that the tablets were assisting the children in committing genocide. It's actually a really dark, dark story. But for some reason we just remember the first part and think, oh, yes, that's what school is supposed to be like. So we have all these stories that we tell ourselves. This great thinker and essayist, Audrey Watters, talks about the ways in which our societies are built from the storytelling we do about technologies. And then we build technologies to match those stories. And then the technologies we build invite us to tell new stories in turn.
16:20.26
Nathan Holbert
And even if the technologies we're building don't quite work as well as the storytelling would have us believe, or as well as the advertising says they work, there are people who want them. They look attractive and appealing. And so we find schools that are hungry for these technologies and hungry to help their communities and the learners in those communities. And they like the idea of having the shiniest new advanced technologies.
16:46.59
Nathan Holbert
And so we kind of have this perfect storm. Where we believe we know the solution. We have this magic, seemingly magic computer device, which convinces us it's kind of like the future. And then we have educators and schools that need assistance and that need help. And it all kind of feeds itself. And I think really what we, in this case, what would be useful is kind of to take a moment, you know, for somebody to say, hang on a second.
17:12.71
Nathan Holbert
What do our students really need? What are the goals that our communities have? What are the values that we have together? And then does this technology meet those values and those goals? And if it doesn't, how can we redesign it? How can we change it so that it does address our needs? Just a kind of, you know, some patience to connect those dots, I think would be really helpful here.
17:37.67
Katie Warmington
Great. And now I want to pivot a little bit into your current research projects. You're developing tools to help teachers notice patterns in student learning. Can you describe how these tools can handle the complexity of real classroom environments?
17:53.61
Nathan Holbert
Sure. Yeah. So in my lab, my students and I spend a lot of time building a whole range of different kinds of technologies.
18:03.89
Nathan Holbert
I mean, so first of all, I should say like, I think of the word technology pretty broadly. So, you know, some advanced AI algorithm or some digital projection system or sticky notes, right? Like those are all technologies that we can use.
18:18.25
Katie Warmington
Mm hmm.
18:19.32
Nathan Holbert
So in my lab, we are building a whole host of different kinds of technologies, some that are very high tech and some that are maybe a little lower tech. But all of those are, you know, like you said, sort of with the goal of supporting learners in connecting with ideas that they're passionate about and supporting educators and teachers to kind of create the conditions for that.
18:43.04
Nathan Holbert
One thing I feel like I always need to say is that we actually know what good learning and good teaching look like. I mean, we have decades of research on this. We have many strong theories of not just learning, but also of design. We have all sorts of powerful pedagogical models. You know, we have these rich systems to support learning, but it's really hard to actually implement the things that we know.
19:08.72
Nathan Holbert
Like, it just turns out that classrooms, and not just classrooms, but even informal learning environments, are just really, really, really messy. And they're really complex. Like, people are weird. They have all these needs. They have all these motivations. And when you put a whole lot of them together, you know, the world gets in the way of the perfect learning system that we could potentially design in a lab somewhere. And so the problem isn't, do we know what learning is? The problem is, how do we build experiences and opportunities to support that good learning? And I think AI could be an additional tool for that implementation, for supporting the complex teaching and learning that we know about. Not AI as the machine to do it all, but AI potentially as a tool that can smooth certain types of interaction frictions that we currently have in our educational systems. Or, to what you said in your question and the work that we've been doing, to think about the ways in which digital tools can make thinking, make learning, really visible when it's hard for us to notice.
20:16.92
Nathan Holbert
So to get into some specifics, I mean, AI has this really interesting capability of noticing patterns. And essentially, that's what it is. It's a pattern-matching machine. Right? It looks through loads and loads and loads of data, and it starts to notice that things are often connected: they appear frequently together in the corpus, or they're likely to be encountered at similar times or in similar ways. And so it's just a giant pattern-matching machine.
20:45.12
Nathan Holbert
In fact, the sort of generative AI that we know today was, a decade or two ago, just kind of called machine learning. These were algorithms that we would build for finding patterns. So whenever we think about how a tool like this, this pattern-matching tool, might be useful in a classroom, it's important, first of all, to say: let's stop trying to think of it as a machine to replace a teacher, because a teacher isn't just a pattern-matching machine. Right? But how can a pattern-matching machine help a teacher to see their classroom, to see what's happening in a complex, rich learning environment?
21:20.62
Nathan Holbert
So it might be helping the facilitator or the educator or the teacher notice a particular activity that a student, or a collection of students, is engaged in that might be worth paying attention to, right? I do a lot of work in makerspaces, where kids are constructing these complex and really exciting interactive computational objects. And maybe we notice a couple of them are starting to build circuits that are different from the sort of standard three-piece circuit, right? And so we want to bring attention to that. It could notify the teacher to let them know that this is starting to be explored, and the teacher can show up and be there to assist with questions, or can bring that particular example to the whole class so everybody can encounter it or look at it. So that's one kind of thing it can do. You can also imagine it, you know, notifying the teacher of what kinds of materials they're likely to need tomorrow, because students are working on these kinds of projects. So it can assist the teacher in thinking ahead about how to provide both material support and instructional support. Or it could find that these two students are working on similar kinds of projects, and if you, the teacher, know that they get along fine, maybe they could work together and collaborate on those projects. So it's about helping the teacher see things in their classroom that they might not be able to see when they're engaged with the whole class throughout the day.
And I think this then, to kind of wrap this up, means that the AI could be a tool for doing big-picture noticing that the teacher might miss: the class collectively is moving in this direction, people seem to really be struggling with resistance in a circuit, or seem to be really getting excited about building tools around, you know, climate change, whatever the big picture might be. And it can also help the teacher notice really minute, precise details about the kinds of projects and the work that kids are doing. So again, the teacher can really engage and use their expertise, their knowledge, to assist.
23:25.17
Katie Warmington
So really it's helping teachers automate and enhance, you know, some of those tasks, like you said, big picture and small picture. But then it'll free up their time, right, for more high-value tasks, to apply the human element and make things much smoother for their students. Would you say that's sort of how you're looking at it?
23:50.51
Nathan Holbert
I mean, I would be a little cautious with some of those words that you chose. Certainly, the word automate starts to become a little uncomfortable to me in this case, as does this phrase that I see in every press release: to free up more time for the teacher. I get uncomfortable with that because a lot of times I'll go look at the list of things the AI is doing, and it's like, that's it, it's doing everything. What's left for the teacher to do? And I do think sometimes those tasks that feel really difficult and challenging are actually a rich and core part of teaching. And so this idea of freeing up the teacher, I would be a little cautious about, because sometimes the very work that the teacher wants to do, and is best skilled at doing, is these time-consuming tasks where they're engaging with students. So I would rephrase it: it's less about automating the teacher's work, and less about freeing up the teacher's time, and more about what's an extra tool that a teacher has access to that helps them see their classroom, that helps them notice what people are engaged in, so that they can act, so that they can respond, so that they can participate and engage with the learners. So it's not really freeing up so much as it's an extra pair of eyes to notice things that are hard to notice, especially when you have a classroom full of kids doing lots of interesting, complex things.
25:19.68
Katie Warmington
Yeah, no, that makes sense. That rephrasing resonates. So based on your experience and what you're working on today, what are some key questions you would recommend educators and edtech decision makers ask when they're considering new AI-driven solutions for their classrooms?
25:42.78
Nathan Holbert
Yeah. I mean, I think a question that we should all be asking more often, period, is what are the values here? What are the values that this experience is elevating? Or when we're talking about technology specifically, what are the values that are sort of embedded in the way in which this technology is being used? And then do those values match our values? So let me give you a quick example. When I think back to my days as a student many, many years ago, a technology that was in every classroom that I went to was the overhead projector.
26:18.74
Nathan Holbert
You remember the overhead projector?
26:19.73
Katie Warmington
Uh-huh, yep.
26:20.06
Nathan Holbert
Yeah, the light screen, and everybody has the transparencies. They were either, you know, projecting a worksheet that they'd printed out, or maybe they were projecting and then writing their notes onto the overhead projector with a dry-erase marker.
26:33.99
Nathan Holbert
Right? It's a common technology. Like I said, it showed up in many, many classrooms. And what are the values of that technology? Well, one value is that there should be shared artifacts that we all see collectively in a classroom, right? You're projecting one thing onto the screen, and everyone will see the same thing. Another value of this projector, in the way that it was often used, was that this is a way for the teacher to show you the information, to show you the vocabulary words, or to project an object or a diagram onto the board. And then the other piece of this was that you, the student, were meant to copy it down and put it in your notebook.
27:14.49
Nathan Holbert
And so it kind of has this set of values of shared objects, and of objects that are transmitted to students so that the student can put them in their individual notebooks. It doesn't have to be that way. I could use a projector to invite all the kids in my classroom to shout out answers to some provocative discussion question I've given them, and then I could write those down, and now a value being served by this projector is inviting all of us to see our collective ideas emerging and appearing. But the technology affords certain types of uses, right? And so we have to ask ourselves, well, what are the values that we have as a classroom, as a school, as an informal learning space? And how do the technologies that we're adopting match those values? Or how do they conflict with those values?
28:07.19
Nathan Holbert
I think that set of questions is astonishingly rare for those of us who run classrooms and teach. And again, I include myself in this. It's easy for us to kind of go through the motions and adopt technologies because they're new, or because somebody else said they use them, without thinking about how they fit the values and goals of our learning experiences.
28:30.63
Nathan Holbert
And then I think we also need to ask questions like, does this technology enable student-centered learning, right? And this is another kind of buzzy word that I want to always be cautious about because student-centered learning is good. And that is something we should be striving for in every educational experience we create.
28:50.75
Nathan Holbert
But we can kind of delude ourselves into what is student-centered by accident, right? We say, well, the machine, you know, for example these AI chatbots, the machine lets a kid ask whatever question they want to ask, so that means it's student-centered, right? Maybe, but also most of these machines, as I've used them, as I've encountered them, are constantly trying to nudge the student towards the thing the machine really wants to be asking or teaching.
29:17.39
Nathan Holbert
So we'd want to reflect on to what extent this machine or this device or this technology is really enabling student-centered learning, or to what extent it's really just centering the values of the district, or the values of the test makers, or the values of the tech company that created it. We should also be asking how this tech elevates the strengths of our schools and the strengths of our teachers, rather than how it can replace their work.
29:45.51
Nathan Holbert
And then lastly, I would encourage us to always be asking how this technology treats knowledge. Does it see knowledge as something static and transmitted, passed down from somebody with more knowledge to somebody with less knowledge? Or does it see knowledge as something constantly constructed, alive, and dynamic, knowledge that is about the interaction between people's experiences, their cultures, their histories, their values? How does it treat knowledge?
30:16.74
Katie Warmington
That's great. And then also, I'm curious, as we think about balancing innovation with foundational principles, what do you see as the biggest challenges in integrating new technologies into established educational practices?
30:33.55
Nathan Holbert
Yeah, it's hard to do. I mean, I've been in so many different implementations and research projects where we build something that we think is really great, a really powerful tool to be used in a classroom, or a powerful way for students to learn. And then we find that it just does not catch on. Students don't quite get it, or the teachers don't quite understand how it fits. It's really hard to create innovation in educational spaces.
31:02.11
Nathan Holbert
A word that we talk about a lot in my research and in my classrooms is culture. Culture is often treated as a simple word that just means what country you grew up in or what language you speak, something like that. But culture is a really multifaceted, really rich word that describes shared communities. Shared communities have culture. Those communities could be the size of nations, but oftentimes they're small communities with a shared culture, a shared language, and a shared set of values. So we could talk about the culture of a specific school.
31:41.74
Nathan Holbert
Or we could talk about the culture of mathematicians, what it means to be a mathematician. There's specialized language there, right? There are specialized practices. And that culture would be different than the culture of, say, the theater arts. So starting to treat coming into knowledge as coming into a culture is, I think, first of all, a really important thing for us to do. But then we also have to recognize that because cultures have within them their own specific language and specific practices, there can also be resistance to change. And so if we start looking at schools and classrooms as places with a very rich culture, which they are, then the hardest thing to do is to introduce some new technology or some new set of practices or ideas that doesn't fit the existing culture of that space.
32:31.46
Nathan Holbert
So you sort of say, ah, teachers, you need to start doing X, Y, and Z, and this new technology will help you do that. But if it doesn't fit what the educator understands their classroom to be about and how they understand their classroom to function, then you're going to have a major conflict between the quote-unquote innovation that you're hoping to introduce and the existing culture of the space in which you're introducing it.
32:58.23
Nathan Holbert
A quick example of that: if we go back to the late 1970s and 1980s, a programming language called Logo was invented. It was a programming language designed specifically for children to start working with programming and interacting with computers. It was incredibly innovative for that day and age, and teachers were really into it. They adopted it, picked it up quite quickly. They really liked the kind of creative explorations that kids could engage in. But simultaneously, computers were really expensive, schools could only afford a few of them, and they immediately worried that these computers would get broken or lost, so they would lock them up in classrooms.
33:43.86
Nathan Holbert
And then they'd call those classrooms where they locked all the computers 'The Computer Lab'. And the computer lab, which evokes a science classroom, brings with it a recipe-driven way of thinking about curriculum. And so the next thing you know, 5 to 10 years later, these computers that teachers were initially excited about, because they invited kids to be creative and explore and make new things, now these were computers and computer classes primarily about teaching kids how to use Microsoft Word. I definitely had a class like that when I was in middle school.
34:14.66
Katie Warmington
Okay.
34:19.31
Nathan Holbert
So, you know, the culture of a school, the culture of a classroom, has a way of taking innovation and bending and shaping it into the form that it knows and is comfortable with. And so if we want to build technologies that are meaningful to schools and that can actually improve the kind of learning and teaching that happens in these places, it's important to actually understand the culture of those places. Start with: what is this place? How does it function? What do you need? What are the goals, as I said before, of your community and of your students?
34:55.22
Nathan Holbert
And then build systems that fit within that culture and those expectations, maybe incorporating some new ideas and some new nudges and pushes to expand what's possible in those places, but starting where schools are at, right?
35:13.83
Katie Warmington
Yes, and correct me if I'm wrong, but I feel like something you said at the very beginning, which has been woven throughout the entire discussion today, is this idea of intention, right? Like you said at the very beginning: be patient, take a beat, think about the goals and where you want to go. These are exciting times and everyone wants to experiment, right? But let's slow down and start with the intention.
35:39.92
Nathan Holbert
Yeah
35:40.17
Katie Warmington
And yeah, would you agree with that?
35:43.32
Nathan Holbert
Yeah, I think that's absolutely right. And, you know, there's this idea that the tech is moving fast, and if you want to be part of it, you have to move fast. You've got to be first, you have to be an early adopter. But I would just say flat out that the existing generative AI technologies out there are not at all capable of the promises we're making about them. And so in this case, slowing down isn't somehow missing the train. It's attending to the needs of the people in your community, and those are the students, the teachers, and the larger community around them. It's making sure that the technologies you begin adopting, whether that's AI or other kinds of technologies, are truly meeting those needs. There's nothing useful about throwing AI into a classroom when it isn't actually helping anybody, right? Or spending all this money on some new technology that isn't addressing the particular goals and needs of the community it's for. So taking a minute to find those matches first is, I think, really vital in all cases, but especially right now.
36:54.40
Katie Warmington
Well, awesome. Before we close out today, any other tips or advice for our listeners in this education space, whether they are educators themselves or are developing products? Any closing thoughts you'd like to leave the group with?
37:08.41
Nathan Holbert
Yeah.
37:12.55
Nathan Holbert
You know, hopefully I've shared many of them already, but I think a thing to elevate is the fact that education isn't just retreading or retransmitting old information. It's really about future thinking.
37:27.92
Nathan Holbert
And it's about inviting learners to reflect on where we are today, think about the kind of histories and communities and stories that got us to this present, and then also to begin dreaming and imagining where we can go. And I think that last piece really requires us to be critical of the state of our present so that we can try to create opportunities to build better futures that represent all of us. And I think innovation, educational innovation in particular, should really be at that site of future thinking.
38:03.34
Nathan Holbert
And part of that is also this idea, I mentioned Audrey Watters earlier, of imagining new stories that we can tell. And part of that is also imagining new storytellers. Who else can tell those stories besides people like me, right? And how can our technologies support those particular goals and those particular values?
38:26.81
Katie Warmington
Excellent. And so how can the folks listening to this find you and follow some of the work you're doing? Can you promote all your things here so people can find you?
38:35.71
Nathan Holbert
Absolutely. I said it before, you know, ego, I know all about it. Yeah, we'd love for you to check out our podcast, Pop and Play. You can find it wherever you find podcasts, on Apple or on Spotify. It's a really fun podcast with my co-host, Haeny Yoon, where we talk about play, about pop culture, and how all of those things relate to education. This season, we're actually starting to tackle children's media. So we have some great interviews and conversations with people from Sesame Street, people making musicals, people making television shows. Lots of really fun conversations around that. And then you can also find me online. If you Google my name at Teachers College at Columbia University, you can find my work and the research that my students and I do as part of the Snow Day Learning Lab.
39:28.17
Katie Warmington
Awesome. Well, that's all for today, folks. Thank you so much, Nathan, for sharing your valuable insights with us. And thank you to all the listeners for tuning in.
39:39.89
Nathan Holbert
Yeah, thanks for having me. Bye everybody.