Tech in EdTech
Tech in EdTech improves the dialogue between education leaders and the innovators shaping edtech. This is your go-to show for actionable ideas and solutions that make digital learning not just possible, but effective, practical, and inclusive.
Bridging AI, DEI, and Digital Accessibility for an Inclusive Future
This episode of the Tech in EdTech podcast features an insightful conversation between host Rishi Raj Gera of Magic EdTech and Dr. Sambhavi Chandrasekhar, Global Accessibility Lead at D2L. Dr. Chandrasekhar shares her extensive experience and insights on the evolution of accessibility in edtech. She discusses the role of AI in enhancing accessibility tools, the importance of inclusive ecosystems, and best practices for implementing AI-driven accessibility. The conversation also delves into challenges and opportunities in integrating DEI principles, evolving accessibility standards, and the importance of including diverse users, particularly those with disabilities, at every stage of AI development to ensure accessibility.
00:02.89
Rishi
Welcome, everyone, to the Tech in EdTech podcast. It's a pleasure to welcome you once again to our newest episode. Today, we delve into a thought-provoking discussion, bridging AI, DEI, and digital accessibility for an inclusive future. I'm your host Rishi from Magic EdTech, and our guest for today's podcast is Sambhavi Chandrasekhar. Dr. Sambhavi Chandrasekhar is the Global Accessibility Lead at D2L. Dr. Chandrasekhar, welcome on board. Could you tell us a bit about yourself?
00:35.21
Sambhavi
Hello, Rishi. Thank you for having me. I'll give a bit of background about myself that explains how digital accessibility, DEI, and AI fit into my life. I'm originally from India, where I worked on IT projects with the Reserve Bank of India for over two decades. In 2005, I moved to Toronto, where I completed my PhD and postdoctoral research at the University of Toronto. My research was about how blind and low-vision web users access and use digital information. For the next five years, I taught in an inclusive design master's program at OCAD University. During those 12 years of studying, researching, and teaching, I also worked with a nonprofit called the Inclusive Design Research Centre, on pan-Canadian accessibility projects focused on making emerging technologies accessible to people with disabilities. In 2017, I joined my current company, D2L, previously known as Desire2Learn. My role as D2L's Global Accessibility Lead is to evolve the company's accessibility practice in all domains, from product development to marketing, sales, and support. Our mission at D2L is to transform the way the world learns by reaching every learner, regardless of their age, ability, or location. Accessibility, equity, and inclusion have remained my core values across the three sectors I have worked in: academia, the nonprofit sector, and industry. Therefore, DEI is very close to my heart, and I do a lot of work on inclusion in the workplace. When it comes to AI, I'm fortunate to work with great minds like Dr. Jutta Treviranus on developing Canada's AI standards. I'm the vice chairperson of Canada's committee on accessible and equitable AI standards, and I'm also a member of a 1EdTech task force developing an AI data rubric for edtech companies. I think I'll stop here.
04:29.28
Rishi
Thank you. Thank you, Sambhavi, for sharing your journey with us. It's truly inspiring to hear about your extensive experience and your commitment to disability inclusion. Your work across academia, the nonprofit sector, and now industry, particularly at D2L, highlights the importance of accessibility and inclusive design in creating impactful educational experiences. Given your deep involvement in these areas, how has the understanding and implementation of accessibility in edtech evolved over the past decade? And how do current edtech solutions generally fare in terms of accessibility?
05:05.27
Sambhavi
In my experience, awareness about the need to include everyone in learning has increased over the years. But the reason for doing that varies across regions. In some places, it is the fear of non-compliance with accessibility regulations and the financial repercussions that might have. In other places, it is a genuine passion to enable everyone to learn, and that's where people try to make teaching and learning technologies accessible and usable. I must say there has been an increase in the level of compliance across the industry, though I would limit that statement to North America. This could be because there are more stringent laws, one example being the ADA Title II web and mobile accessibility regulations in the USA, announced in April 2024. And if we extend our glance to Europe, there's the European Accessibility Act, which will come into effect next year. Another reason for increasing levels of compliance could be growing advocacy and awareness about disabilities, people's needs, and how to design for them. These things lead to good intentions, intentions to include everyone, and my sincere hope is that there is more of the good intention than the compliance fear.
06:50.20
Rishi
Interesting. And how do you pair accessibility with AI? Can you help us understand AI in digital accessibility? What is your overview of the current state of AI in accessibility tools and platforms?
07:07.63
Sambhavi
Okay. It's about tools and platforms.
07:10.50
Rishi
Yeah.
07:12.16
Sambhavi
AI has brought about automation and speed. It does a lot of good, but it's always necessary to have a human in the loop, because just as AI can scale up the good effects, it can also scale up harm. In terms of accessibility tools, I can talk through a few functions, such as accessibility audit, accessibility remediation, alternative formats, and administration, which are usually handled by accessibility tools and platforms. In terms of audit tools, there are more automated accessibility audit tools now. However, we know that they don't cover the entire spectrum, and we still need testing with people with disabilities. There are also WCAG compliance checkers built into HTML editors, and they have a way of remediating and checking again so that you're sure those issues and errors have been removed. When it comes to PDFs, there are AI-based tools now that do almost 90% of the tagging automatically, which is a good thing because PDF has always been scary, especially in the education sphere. The automated tagging can also handle complex tables and lists, which is a good thing. Another issue has always been images of text, like documents being scanned and made into PDFs, which is a real scare for people who use screen readers. But now there are tools that can scan these PDFs with optical character recognition, or OCR, and create text that screen reader tools can read. There are other necessities, like converting images into text for screen readers, so AI helps with automated image description. AI also helps convert sounds into text for people who are deaf or hard of hearing through automated captioning, and that has really improved over the years. And then for languages, there's multiple language support, both in captions and in translation. Another very important tool for accessibility is converting content into different formats, because everybody may not be comfortable with the same format, particularly PDFs being converted into Word, HTML, EPUB, or audio, for example. Mathematical formulas also pose a problem, and AI is increasingly capable of parsing through math formulas and creating either audio or text versions of them. And most importantly, when all this accessibility activity happens, from an administrative point of view there needs to be some monitoring, information gathering, and reporting of what has happened and how many accessibility issues have been remediated, and AI tools are really good at doing that. So I would wrap up by saying that in all these functions related to accessibility, like audit, remediation, alternative formats, and administration, AI has been really helping.
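To make the OCR piece of this concrete, here is a minimal sketch of pulling screen-reader-friendly text out of a scanned PDF. It assumes the pdf2image and pytesseract packages with local poppler and Tesseract installs, and the file name is a placeholder; it illustrates the general workflow, not any specific tool mentioned in the episode.

```python
# Minimal sketch: extract real text from a scanned (image-only) PDF so that
# screen readers can read it. Assumes pdf2image + pytesseract, plus local
# poppler and Tesseract installs; "scanned_handout.pdf" is a hypothetical file.
from pdf2image import convert_from_path
import pytesseract

def scanned_pdf_to_text(pdf_path: str) -> str:
    """Rasterize each PDF page and run OCR so the content becomes actual text."""
    pages = convert_from_path(pdf_path, dpi=300)  # one PIL image per page
    extracted = []
    for number, page in enumerate(pages, start=1):
        text = pytesseract.image_to_string(page)  # OCR a single page image
        extracted.append(f"--- Page {number} ---\n{text.strip()}")
    return "\n\n".join(extracted)

if __name__ == "__main__":
    print(scanned_pdf_to_text("scanned_handout.pdf"))
```

The OCR output would still need human review before publishing, which matches the human-in-the-loop point above.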
10:59.94
Rishi
That's good. The functions you described, from OCR to image descriptions to automated captioning, are really good examples. Beyond these notable enhancements and what AI can do, what is the current state of AI enhancing accessibility in learning platforms today?
11:19.04
Sambhavi
So there's a lot that AI can do. In fact, it has already started enhancing accessibility in learning platforms, and this is for learners of different types: learners who experience constraints in perceiving what's on the screen, such as those with vision and hearing challenges; learners who experience constraints in operating the system, such as those with dexterity challenges; and learners who experience constraints in understanding, such as those with cognitive challenges. Focusing more on large language models, they have the capability for modal and language translation, modal translation being image to text, text to image, et cetera; natural language processing, which is another capability; and personalization. Using these three capabilities, a lot could be leveraged to make online teaching and learning on a learning platform more accessible, and I'll mention a few of these functions. Let's take image to text. AI can parse through images and create a text description for those who use screen readers due to vision loss, and a series of image descriptions can also be used to create audio descriptions for videos. When it comes to text-to-speech, AI can convert text to speech, and this involves two things: one is creating audio files and audiobooks, and another capability is text-to-braille conversion, which AI can do and which is useful for those with vision loss using braille readers. When it comes to speech-to-text, this is specifically useful for people with hearing loss, for video captioning, and it also helps learners who cannot speak very well by picking up their unique speech patterns, correcting for their mispronunciations, and normalizing their speech before creating an output. Another important area is language translation. This is huge because of globalization and learners coming from one country to study in another. LLMs, or AI, can provide multilingual support, so learners whose first language is not the language in which they are learning can translate the material into their own mother tongue and learn more easily. Another very important function of AI is the simplification of text. I would venture that very few of us are not using this function in one way or another. Everybody can do with less cognitive load, and summaries of text are another very important thing we can get from AI. These are specifically useful for students with learning disabilities, because the material to learn becomes simpler and more concise, and therefore more easily consumable. One other useful capability in interaction interfaces is natural language processing. Just talk the way you would talk to a friend, and you can get things done from a computer using AI, which is a huge thing. Such conversational UIs are especially useful for learners with cognitive constraints, and they also make it easier for students to obtain support when they need help while using the system. And the last thing I would like to mention, but obviously not the least, is personalization. LLMs are able to understand, through some querying, the exact needs of learners, and they can tailor the content and the activities to suit each learner's needs. This is exactly what is advocated by inclusive design: not one size fits all, but one size fits one. And that is becoming more of a reality using AI.
And I'm very happy personally about that. So to summarize, LLMs, or let me just say AI, can improve the online learning experience on a platform, both for instructors and for learners with disabilities, by providing support for generating alternative formats, improving communication, and enabling adaptive learning.
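As a small illustration of the text simplification and summarization capability described above, here is a minimal sketch using a Hugging Face transformers summarization pipeline. The model choice, length limits, and sample text are assumptions made for illustration, not anything specific to D2L's platform.

```python
# Minimal sketch: reduce cognitive load by summarizing a passage of course text.
# Assumes the Hugging Face transformers library; the model name, length limits,
# and sample text below are illustrative choices only.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

course_text = (
    "Photosynthesis is the process by which green plants and some other organisms "
    "use sunlight to synthesize foods from carbon dioxide and water. It generally "
    "involves the green pigment chlorophyll and generates oxygen as a byproduct, "
    "and it is the primary way that energy from the sun enters most ecosystems."
)

# Produce a shorter, simpler version a learner can scan quickly.
summary = summarizer(course_text, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

A learning platform would typically pair output like this with the full original text, so learners can choose the version that suits them.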
17:25.97
Rishi
That's very nice. So if I could replay what you're saying: the modalities of content, and personalizing them to the level of instruction each user needs, are a few notable areas where AI could have influence and actually make instruction equitable for all. That's very nicely said, Sambhavi.
17:48.90
Sambhavi
Thank you.
17:50.51
Rishi
So I would also like to dive deeper into how you see the role of AI in creating inclusive ecosystems. You gave these examples of different modalities, but to a large extent, when you think of an inclusive ecosystem, not only in education but also in the work environment, where do you see the world evolving?
18:17.22
Sambhavi
Let me first say that when it comes to education, both educators and employers are trying to keep pace with AI, and it is coming at us very fast. It's very important for educators and employers to leverage AI tools in teaching, training, and learning activities. But there is a fear, and there is a possibility, that history will repeat itself and learners with disabilities will be left behind if we are not careful to bridge the gaps of accessibility and equity. I'm saying this because AI is a double-edged sword. It can address inequities by lowering barriers, as I just explained, in ways that can make education and training better for learners with disabilities. At the same time, it can amplify inequities and create discrimination against learners who do not fit the norms the AI system knows how to recognize. Now, there are several things we can do during procurement and during use of AI systems to address the impact AI might have on equity and accessibility in the tools we use for education and training, and I will mention a few of them. While procuring, we can evaluate AI tools critically to ensure that they protect the privacy of learners. I'm saying this because, in the case of learners with disabilities, if privacy is not protected it will be very easy to make out who a person is just from the description. Another thing we can do is allow learners the choice to use or not use the tools. If they feel a tool will be harmful for them, or if they feel it is not accessible, they should be able to opt out, and we should provide alternatives; it could even be a human-run system. And we have to strive to get data about extreme minorities represented in the AI systems that work behind the tools. What I mean is, if we are buying a training tool, that tool obviously draws upon a system, and that system must have been trained using data that may not have included information about minorities. When learners with disabilities use that tool, it may not be ready to respond, which might be harmful to them. So it is important to get data about minorities and feed it into the AI systems, so that the system learns about the types of users who are going to use the tools.
22:39.71
Rishi
I think this is definitely going to be very valuable to our listeners today. One has to be vigilant about what AI is being embraced in the institution and how, and every institution should prepare itself so that users can make the right use of it. If we shift the conversation from here, I want to understand how impactful diversity, equity, and inclusion are in improving the user experience.
23:13.93
Sambhavi
So user experience, which means you're talking about diversity, equity, and inclusion in a company that creates, let me say, edtech, education technology, right?
23:22.81
Rishi
Yeah. Yeah.
23:24.61
Sambhavi
So I would first say that companies creating education technologies must embrace not just DEI, but IDEA. IDEA takes the letters DEI, adds the letter A, and rearranges them: Inclusion, Diversity, Equity, and Accessibility. This shift from DEI to IDEA ensures that users with disabilities are included in the diversity efforts. The A in IDEA can also mean allyship, and that can be useful for creating initiatives in the company through employee interest groups where employees share experiences and ideas, create a sense of community, and create a culture of accessibility. Now, about how IDEA, inclusion, diversity, equity, and accessibility, can directly improve user experience, I would draw upon my years of experience with inclusive design. To create inclusion through the product, design should start from the margins, meaning the users who face the most constraints, and undoubtedly these are users with disabilities. So I would advocate that edtech manufacturers include people with disabilities from the first round of research, design, engineering, and testing, all across the product cycle. They should include people with disabilities, and they should know why they are including them. And it's not just about people with disabilities; it's about the different types of differences across humans: age, gender, language, culture, you name it. How do you know? You have to do research and find out the different types of users who are using your system; if it's online, it could be practically anybody. And why do you do that? Because you have to know the needs of these users. Different people have different needs, and when it comes to user experience, if you give the same thing to everybody, it's not going to work. I always give the example of four different people: a tall man, a medium-sized woman, a small boy, and a person in a wheelchair, all of them given the same bicycle. Only the average-sized woman might be able to ride the bicycle properly. The tall man and the child have difficulty, which means not a good user experience, and the person in a wheelchair will not be able to ride at all. Whereas if you personalize it to the needs of each person, then all of them can ride. The point is that equality alone is not the answer; giving the same thing to everybody is not going to give a good user experience. Equity is the important idea, and equity is about focusing on outcomes such that everybody has equal or comparable outcomes. That is equity, and equity is very important for providing a good user experience. And then there is enhancing the impact through accessibility; that is where the A comes in, which again ties into inclusive design. Always include people with disabilities in your product cycle, so that when we create, say, a video with captions for users who are deaf or hard of hearing, the captions are going to serve so many other people: learners sitting in a quiet library, learners sitting in a loud cafe, learners trying to watch a video when their child or spouse is sleeping. So that's IDEA, and I believe IDEA is a great framework for creating a better user experience.
27:28.90
Rishi
So those are some of the good examples you gave on the edtech side. But let's say we shift our conversation toward the corporate side. Do you see any common challenges that corporations encounter when they are integrating DEI principles? And relating to those challenges, what are your recommendations for overcoming them?
27:51.16
Sambhavi
I would say lack of awareness at the leadership level about DEI would be the primary reason for challenges in implementing DEI. I would have two suggestions, specifically for promoting disability inclusion in DEI, because I don't believe DEI is just about race, gender, and culture. The leadership needs to have an all-encompassing vision of diversity that includes disability. They should know that there is a wide spectrum of human diversity, beyond just race, gender, and sexual orientation. Organizations should be forward-thinking, they should look beyond boundaries, and they should incorporate accessibility into their vision and mission statements and into every area of their business. I'm very proud to say that D2L does that, and our leadership is very aware and supportive of our IDEA measures. Another thing that comes to mind is the technologies that companies procure for their own use, whatever they are; they may not be edtech companies, they could be any company. One warning: however much we say there could be people with disabilities among employees, so you should buy accessible tech, you can never get 100% accessible tech. I say that because technology keeps changing, people's needs keep changing, and things are always in flux. There will always be some part of the technology that is not accessible; what was accessible yesterday may not be accessible today. So what we need to check is not whether everything is okay at the time of procurement. We need to find partners, vendors, who continuously keep accessibility as a primary goal, who provide support, and who will partner with the organization in keeping accessibility alive. That is a very important thing. One is including accessibility in the diversity measures, and the other is choosing technology vendors who are always on top of accessibility, because leadership is a visionary role, and leadership has a crucial part in implementing practices and initiatives that foster an inclusive environment for all employees. And when I say all employees, that includes employees with disabilities.
30:49.12
Rishi
That's a good segue. Given this whole evolution of AI, you might note that Meta has been announcing new models, the same goes for Anthropic's Claude, and GPT itself has evolved from GPT-3 to 3.5 to GPT-4o. Now, as the industry watches the evolution of AI, and given the light you threw on equitable access, how should tech developers, DEI experts, and accessibility advocates collaborate? What strategy would you recommend for smoother collaboration among these teams?

Sambhavi
I would say that they should all have a common goal and a common purpose. Although they do different things, the goal or purpose should speak to each of their professions, and it should also apply their force in the same direction. My point being, they do different things, but they could evolve a common goal and purpose, such as accessibility and equity, and all work towards it so that all the forces add up. Because if they pull in different directions, they could offset the benefits of what each one is doing. And like I said, one way of doing this would be to focus on the human beings who will be using their system and on how each of them, meaning the developers, the DEI experts, and the accessibility advocates, can best serve those people.
32:43.72
Rishi
And are there any best practices that tech companies should adopt to ensure AI-driven accessibility? Are there any guidelines or criteria that inform this? I hear there are the WCAGs of the world, which could serve as standards for AI models. Can you throw any more light on the standards that can be used as vehicles for informing AI development?
33:07.99
Sambhavi
I have done some thinking, and even some sessions, on accessible procurement of AI systems. Drawing from that, I would say that all the criteria necessary for non-AI-driven accessibility apply when you have AI-driven accessibility, and that includes ethics, responsible building, privacy, security, everything. Over and above that, the one thing AI brings is a risk to equity. That risk arises from bias creeping into the system and from discrimination, and these things can lead to harm to individuals. When people are flagged in a different way, when a person is flagged as a thief just by their profile photo and how it looks, that causes harm. And even if the harm is small by any risk measurement, the same harm happening multiple times can add up to a big harm; this is called cumulative harm. Equity underlies all of these problems. There are a lot of ways to ensure that an AI system is equitable, but that process has to be added to the regular accessibility process.
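One elementary way to start checking for the equity risks described here is to compare outcome rates across groups. The sketch below uses entirely made-up admissions data and a simple four-fifths-style screen; it is an illustration of the idea, not a complete fairness audit or any specific standard's required method.

```python
# Minimal sketch: flag large gaps in positive-outcome rates across groups.
# The records below are made up for illustration; a real equity review would
# use far more data, multiple metrics, and human judgment.
from collections import defaultdict

decisions = [
    {"group": "no_disability", "admitted": True},
    {"group": "no_disability", "admitted": True},
    {"group": "no_disability", "admitted": False},
    {"group": "disability", "admitted": True},
    {"group": "disability", "admitted": False},
    {"group": "disability", "admitted": False},
]

totals = defaultdict(int)
positives = defaultdict(int)
for record in decisions:
    totals[record["group"]] += 1
    positives[record["group"]] += int(record["admitted"])

rates = {group: positives[group] / totals[group] for group in totals}
for group, rate in rates.items():
    print(f"{group}: admitted {rate:.0%}")

# Four-fifths-style screen: flag if the lowest rate is under 80% of the highest.
lowest, highest = min(rates.values()), max(rates.values())
if highest > 0 and lowest / highest < 0.8:
    print("Potential disparate impact: investigate and mitigate before deployment.")
```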
34:49.87
Rishi
So as every institution or enterprise adopts or embraces AI, there's a lot of conversation about human-centered AI, or keeping a human in the loop. How do you see, or how do you guide, the varied accessibility needs of a diverse global user base being attended to while AI is being implemented? What's your guidance on that?
35:14.55
Sambhavi
In Canada, we have a saying, "nothing without us," because everything is about us. And when I say that, I don't address only people with disabilities; I address all kinds of diversity, but specifically people with disabilities, because they are the ones facing the most constraints. Going by that saying, I would say that while creating AI systems, while deploying AI systems, and while monitoring AI systems, people with diverse capabilities and constraints should be involved. I'm saying that also because there was a survey conducted by a company called Fable in which only 7% of respondents felt that there is adequate representation of persons with disabilities. So when you talk about meeting varied accessibility needs across a diverse global user base, it's not just about the accessibility of the user interface; it's also about building the system. Across all these phases of AI, people with disabilities and people with other diverse constraints should be involved. That is the only way we can address those needs.
36:36.69
Rishi
This is a very helpful view. Switching gears and moving towards future trends and innovations, what trends do you foresee at the intersection of DEI and AI in accessibility? Are there specific ways organizations should prepare to meet these future demands?
36:57.77
Sambhavi
Trends of DEI and AI in accessibility. I think I have spoken about this earlier. There is a need to include DEI in accessibility, and there is a way of including it. And AI, again, like I said, is a double-edged sword. It has a nice way of scaling up all the benefits, but it has an equally bad way of scaling up all the harm. So we have to be very careful and vigilant about the harms AI might do and make sure those harms are mitigated or taken care of. When it comes to trends at the intersection of DEI and AI in accessibility, I would say there will be more adoption and more use; there is no going away from it, we can't help it. But more awareness about the human, more awareness about the harm that could happen, and more ways in which people have a choice about whether they want to be a part of it or not, these are very important.
38:09.59
Rishi
And how do you inform the product organization specifically, since at D2L you are close to a product enterprise? How do you see a company's culture or product development processes being informed by the state of AI?
38:28.17
Sambhavi
To integrate accessibility or AI, which did you mean?
38:33.55
Rishi
Both AI and accessibility.
38:35.32
Sambhavi
And accessibility. Okay. I will say that accessibility, and accessibility using AI, should not be practiced only at the tactical level, like adding alt text to images, or at the operational level, which would be buying tools and setting up systems and processes; it should be practiced at the strategic level by leaders. I want to repeat that: I want to position accessibility as a strategic initiative at the leadership level. I also have a seven-point framework that I always talk about, which starts with empathy, an empathetic mindset. Then there is the need to make sure that all technologies purchased in the organization are compatible with the assistive technologies used by employees or learners with disabilities, that any content being used is accessible, and that the way learning, training, or teaching is delivered to learners is also accessible. Then they should keep a keen eye on regulatory compliance. Earlier I downplayed compliance a little, but compliance is important because it is the floor: unless a technology conforms to WCAG, people using assistive technologies may face barriers to using it. Only once you conform to WCAG criteria can you go up to the level of user experience. So regulatory compliance is important, and there should be a watch on what's happening in the legal scenario and how compliance can be managed. One very important thing at the leadership level is governance: setting up a system for managing accessibility, through creating policies, plans, and procedures and allocating budgets, so that there is a way of monitoring from the top how accessibility is happening and staying informed about how the organization is maturing in its accessibility practice. And I'll repeat again that accessibility is not a tactical or operational process alone; it is for leadership to take it up as a strategic initiative.
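For the tactical example of alt text, here is a minimal sketch of an automated check that lists images on a page with missing or empty alternative text. It assumes the requests and beautifulsoup4 packages, the URL is a placeholder, and it covers only one small slice of WCAG conformance; the strategic work described above still sits on top of checks like this.

```python
# Minimal sketch: list images on a page that lack alt text (one small slice of
# a WCAG check, not a full audit). Assumes requests and beautifulsoup4; the URL
# is a placeholder. Note: empty alt is valid for purely decorative images, so
# the results need human review.
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url: str) -> list[str]:
    """Return the src of every <img> with a missing or empty alt attribute."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = img.get("alt")
        if alt is None or not alt.strip():
            missing.append(img.get("src", "<no src>"))
    return missing

if __name__ == "__main__":
    for src in images_missing_alt("https://example.com/course-page"):
        print("Missing alt text:", src)
```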
41:13.76
Rishi
Interesting. You mentioned WCAG and standards. Since you see things globally, how do you see these standards evolving beyond WCAG, particularly in Canada? Can you shed some light on that?
41:32.69
Sambhavi
I can speak to the evolving AI standards in Canada, because I'm involved in that work, and I'm hoping that the standards, when they come out, will be good enough for adoption as global standards. In Canada, we have a standards body, a statutory body set up under the Accessible Canada Act of 2019.
The body is called Accessibility Standards Canada; let's call it ASC for short. ASC develops research-based standards for all federally regulated entities on ICT, employment, and lots of other things, and one of those standards is about AI. So in Canada, there are not two standards, one for accessibility and one for everything else; there is only one AI standard, and it is called the Accessible and Equitable Artificial Intelligence Systems standard, and I'm the vice chairperson of that committee. What I know is that the standard we are building will address accessibility for people with disabilities in AI at all levels. It is not just about whether people are able to use the tools, meaning interface accessibility; it is also about people being involved throughout the AI system's life cycle: building it by designing and developing, deploying it by implementing and using it, and then monitoring, evaluating, and improving the system. In all of these stages, the roles should be accessible, such that people with disabilities can be involved in the building, use, and monitoring of the systems as well. So there are two things we are focusing on: interface accessibility, and accessibility of all the roles involved in the AI system's life cycle. That is regarding accessibility. When it comes to equity, there are several things we are trying to do. Everybody should be able to access the system and get the same outcomes; that is equity. The privacy of disability data should not be compromised, and bias should be identified and mitigated. In particular, there is something called statistical discrimination, which arises out of the statistical reasoning that happens in these systems; that should be specifically identified and mitigated. All outcomes from the AI system should be checked for fairness so that they don't harm anybody, and if there are harms, those harms should be identified and the system retrained not to do them. And when automated decisions happen, or come out of black boxes where there are no humans in the loop, there should be mechanisms for challenging those automated decisions. Suppose a student hears that they have not been accepted into a particular program, or an applicant hears that they didn't get a job; it's the system that's saying that, and the system is not God. So there should be ways of finding out what happened inside that black box and why the decision was made. And then, of course, ethical oversight: there should be transparency, there should be explainability. All of those things are standard in AI, but all of this is important when we look at global standards for AI.
45:24.61
Rishi
Okay, this was really great insight. I would definitely like you to recap the overall takeaways from our conversation today, anything you want to shed light on about bridging AI, DEI, and accessibility that will really help.
45:43.69
Sambhavi
Yeah, so if you recollect, Rishi, I started by saying the core principles I believe in are accessibility, equity, and inclusion, so I would like to share one thought about each of them. The first is about ensuring accessibility. Ensuring accessibility is about enabling access, participation, and contribution for all learners, including learners with disabilities. This is done by making sure we design and deliver learning such that all learners have access to everything and, more importantly, that no one is harmed in any way. That is the additional thing that has come about; not that harm was not there in the pre-AI era, but now it is really highlighted. The second point I'd like to make is about equity. Like I already said, and I love repeating it, equity is different from equality. While equality is about giving everyone the same thing, equity is about giving each one what they need so that everybody gets the same or comparable outcomes. In the context of learning, equity is about setting every learner up for success. The third point is about inclusion. Inclusion is about forming a community, a community that includes diversity, so that each one can learn about others who are different from them and understand how to support one another to form a collaborative society or collaborative community. Because let us remember, if there is one thing we all have in common across the world, it's that we are all so different from one another. Our diversity and our differences are our common factor, and we need to celebrate that, recognize it in others, and make sure that everybody thrives. I think that's the message I'd like to leave you with, Rishi.
47:50.05
Rishi
No, thank you. Thank you, Sambhavi. I think your insights have been really wonderful, and I'm sure our valued audience will definitely admire them. I also want to encourage our listeners to stay tuned for future episodes, and for this episode, if you have any questions, feel free to reach out to Sambhavi Chandrasekhar at D2L or Rishi Raj Gera at Magic EdTech. Thank you. Thank you all.
48:11.31
Sambhavi
Thank you for having me, Rishi.