Edtech Insiders
Week in Edtech 1/15/2025: LA School Fires, ChatGPT's To-Do's & Reminders Feature, PowerSchool Data Breach, $175M for Element451, Student Loan Forgiveness, and More! Feat. Emily Lai of Pearson, Dominik Kovacs of Colossyan and Adele Smolansky of AI-Learners
Happy New Year! Welcome to the first Week in EdTech episode of 2025! Join hosts Ben Kornell and guest co-host Matt Tower as they dive into the biggest stories shaping the education technology landscape, from AI innovations to major edtech funding and cybersecurity challenges.
✨ Episode Highlights:
[00:03:16] 🌐 OpenAI introduces a feature enabling ChatGPT to handle reminders and to-dos.
[00:05:04] 🔥 LA schools face closures and damage as wildfires disrupt education.
[00:22:54] 🔐 PowerSchool data breach exposes student and staff data, raising cybersecurity concerns.
[00:29:55] 🎓 $4.2 billion in student loans canceled for 150,000 borrowers under Biden’s plan.
[00:41:44] 💼 Element451 secures $175M for its AI-first CRM and student engagement platform.
Plus, special guests:
[00:44:22] 🎙️ Emily Lai, VP of Learning Impact Measurement at Pearson, discusses how AI tools improve study habits & learning outcomes and the Pearson end-of-year AI report.
[00:56:15] 🎙️ Adele Smolansky, CEO of AI-Learners, explains how AI Learners is enhancing accessibility in education.
[01:06:42] 🎙️ Dominik Mate Kovacs, CEO at Colossyan, explores the future of AI-driven video and personalized learning.
😎 Stay updated with Edtech Insiders!
- Follow our Podcast on:
- Sign up for the Edtech Insiders newsletter.
- Follow Edtech Insiders on LinkedIn!
🎉 Presenting Sponsor:
This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.
[00:00:00] Matt Tower: I think when I look at some of the responses to the fires, you know, I think Airbnb is a good example. They just like did it, right? They just started offering a week or so of free stays. You know, I saw, I know Elon Musk is controversial, but he just like sent a bunch of Cybertrucks along the Pacific Coast Highway with Starlink attachments.
I was like, that's... You know, again, it's a horrible tragedy, but it's nice to see people just doing things to help and trusting that the right move now will pay off in some way, shape or form down the line. That's sort of comforting for me.
[00:00:40] Ben Kornell: Welcome to EdTech Insiders, the top podcast covering the education technology industry. From funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work, you'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter and also our event calendar.
And to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back channel insights from Alex and Ben. Hope you enjoy today's pod. Hello,
EdTech Insiders listeners, happy new year to you all. We are back with Week in EdTech, and we are here with a special co-host, our longtime friend, longtime collaborator, Matt Tower. Welcome, Matt. Hi, Ben. Thanks for having me. And yeah, excited to pitch in while Alex is out. So, not to bury the headline, Alex Sarlin, our beloved co-founder, has a new member of the Sarlin family: Anders Sarlin has joined the team.
So he's got a baby boy who just came this weekend, and so we're going to have some guest hosts for the next couple of weeks. Matt is actually well known as the founder of ETCH, but also works at Whiteboard Advisors, and often is our thought partner. He has been a great prognosticator of ed tech disasters, past, present, and future.
So today we'll get a little bit of Matt's crystal ball, hopefully sunnier skies coming up. Before we jump in, we've got some guests on the pod. We've got Adele Smolansky from AI-Learners, Dominik from Colossyan, and Emily Lai from Pearson. And then on the podcast, we just released an episode with Dr. David Yeager.
We also have our New Year, New Ideas series, which are really like postcards from the Google AI event. So every single day this week, we've been releasing new episodes of that. We also have some Predictions Part Two episodes with ed tech leaders. Yair Shapira is joining us, and Jeff Maggioncalda, the CEO of Coursera, on the 27th.
Meanwhile, our generative AI map is cooking, and we're hoping to have a V4 ready to go in just about a week. So what's going on in the ed tech community in terms of events? We've got our February 12th summit in San Francisco at the Cooley offices. If you're an Edtech Insiders Plus member, you've already gotten your email and your invite.
If you're not, we are going to be opening up the remaining slots, if there are any, you know, two weeks ahead of the event. So with that, Matt, we're going to jump into the news, and I don't think there's anywhere we could start but the LA fires. You know, here at EdTech Insiders, we have so many friends and community members in LA and our hearts go out to them.
Many have been impacted or have friends and family who are impacted. But what is the education or EdTech story here? You know, Matt, as you look at this.
[00:03:50] Matt Tower: Yeah, it's hard to articulate like how hard it is for a ton of families in LA right now. You know, we've seen the numbers of like thousands of teachers displaced, you know, probably hundreds of thousands of students that are either out of school or, if they're in school, they're probably not paying a whole lot of attention, and it's through no fault of their own.
And, you know, it's sort of a reminder that Mother Nature is more powerful than all of us, which, you know, it's easy to forget. And to me, you know, I think that the key education storyline, and ed tech has a role to play here, but I think it is even bigger than that, is school buildings, and the physical plant of where we go to learn and to grow as people, right, as young people.
And I think it's both very literal, in that I think there are probably some buildings that burned down in the fires, and also figurative of, you know, how do we design communities as we start to rebuild in Los Angeles? What learnings do we have, and do we change the physical plant of our school buildings?
So that's sort of what I'm thinking about. But I don't know that there are any, you know, definitive answers today, because the fires are still raging, but that's where my head goes as I start to think about the future.
[00:05:04] Ben Kornell: Yeah, you know, we've lived disasters before that have displaced educators and students. You know, Katrina is probably the most, like, visible example, where you had a mass exodus of students over to Houston or other places, and they took that as an opportunity to reimagine and rebuild the New Orleans education ecosystem with varying degrees of success. You know, I think there's a lot of retrospective view that some of the kind of market-force strategies didn't play out as had been hoped.
And, you know, the diversity of neighborhoods that we're talking about here in the LA fires, some of the school districts are top-performing school districts that had the wildfires come in. So it's not all a situation where it's like, okay, this is our chance to restart and rebuild something better.
I think people are clamoring for the quality that they had. Meanwhile, on the student impact, half a million students were out of school because of the LA fires. We know that LA Unified is back in session this week, but Pasadena, Arcadia, some of the other school districts, you know, there's kind of a ring around LA that has the smaller school districts, and there are varying strategies there. And it does make me think that, given that we're headed towards a higher-frequency, you know, climate-event future, how might we package disaster response tools for education in a way that allows continuity for kids and also, you know, ultimately, like, child care and distraction for parents?
And so, you know, I did see that Outschool was offering free courses, and, you know, Airbnb offered free stays for a week for people. And I think if we as the ed tech community could come together with some sort of, like, pre-made, you know, and potentially pre-funded disaster supply kit for educators and parents and families, that could be really valuable in the place where, you know, in-person school's not going to be, you know, reliable for the next couple of weeks or months.
[00:07:12] Matt Tower: Yeah, and on some level, like, COVID has a little bit of a silver lining in that we did have to do that on a mass scale, you know, just four years ago. So we're still obviously unpacking how ESSER money was spent and, you know, I think that story has a long way to play out. But by and large, it seems like a lot of that spending went to good uses.
You know, sure we overspent in some places, you know, I'm sure there's things we could have done better. But I think the way that companies approached COVID, and especially I remember those first few weeks of, you know, everybody just wanted to like do something to help. They didn't really care about the implications because we had no idea what was going to happen.
It was, you know, a global crisis. So, you know, I think when I look at some of the responses to the fires, you know, I think Airbnb is a good example. They just, like, did it, right? They just started offering a week or so of free stays, you know. I saw, I know Elon Musk is controversial, but he just, like, sent a bunch of Cybertrucks along the Pacific Coast Highway with Starlink attachments.
I was like, that's... You know, again, it's a horrible tragedy, but it's nice to see people just doing things to help and trusting that the right move now will pay off in some way, shape, or form down the line. That's sort of comforting for me.
[00:08:31] Ben Kornell: Yeah, well, you know, one thing that I would just say for the Edtech Insiders community: if you do want to help in LA, you know, we have some community members that are really active and good to reach out to. You know, one that I would point you to is Jacob Cantor,
who, you know, is super, super involved in that space. And, you know, there's also a few things that I've posted on LinkedIn that give, you know, some areas where you could do a GoFundMe or other supports. In terms of the ed tech space, we also had a bunch of news, but some of that's overshadowed by what's going on in AI land.
So let's do our quick around the world in AI. Matt, what stories in AI stood out to you?
[00:09:18] Matt Tower: So I think the most interesting one to me, which is sort of hilarious in its attempt to attack the mundane, is OpenAI released a new model for tasks, what they're calling Tasks, which is sort of like you can pre-program OpenAI, or ChatGPT, to do stuff,
to be as sort of plain about it as possible. And that's as simple as, you know, every morning, tell me the weather at 7am, right? And so you can pre-program ChatGPT to send you the weather at 7am. You know, like I said, attacking the mundane, right? For me, and you know, selfishly, I think about how this stuff applies to my everyday life.
You know, one of the tools that I've been using just in the past couple of days is Granola, which is literally just like a note-taker, but it takes really good notes. And, you know, so these sorts of tools to attack the mundane, I think it's the outgrowth of the investment we put in in 2023 and 2024 starting to come together into products that feel a little bit more accessible to non-technical users.
So I think, you know, we've spent the past couple of years having the tech folks really hype up AI. I think this is a real manifestation of that. And it's something that's accessible to somebody like me, who is not particularly technical, you know, can hang in an Excel spreadsheet. But, you know, I can start to dig in on some of this AI stuff in my day-to-day life, not just for, you know, learning and other stuff like that.
[00:10:45] Ben Kornell: Yeah, we've moved into this phase where kind of era one was: it's a thought partner. It's like generative of ideas and lists and so on, but not incredibly practical. And then, you know, phase two, I think the programming community and also the kind of writing community put AI into their production process, not necessarily, you know, end to end, but like, okay, at what point will we use AI to kind of fill in the gaps?
So from a writing process, you might go from an outline to something more fleshed out and then you edit it back, or, you know, in terms of code, you might start with your initial code base and then say, okay, here's the code I want to build, and then you refine. Now we're actually getting to the agentic era, where a very narrowly defined set of activities for an AI
is actually really helpful at automating very particular discrete tasks. And the folks that are heading in the more advanced direction are chaining those things together to create increasingly complex things. This is why I think superintelligence will be hard for us to recognize when it occurs, because it will be more like an ant colony where it's a bunch of different agents acting in concert
rather than any centralized, you know, sentient being, and the group intelligence, you know, it's also hard to predict what the collective impact of that might be, given that each one kind of is operating on its own programming instructions. You know, on the OpenAI front, one that stood out to me is the New York Times
lawsuit, which has been proceeding against OpenAI around copyright infringement. And, you know, first, like, the orientation of most of the AI models has been to crawl the internet, kind of take that and, you know, essentially say that's public information, public data. And the New York Times obviously is behind a paywall.
You know, they've got strong IP claims. And then the question is around derivative product, like, how much of ChatGPT's output is actually based on the New York Times versus just a derivative product. But I think for education, there's some really interesting questions, things like, you know, math problem sets or things like curriculum textbooks.
The question is like, are those in the public domain? Are those not? And pretty clearly, from a publisher standpoint, they're not. And yet we've seen many of these things published by students, you know, like they might copy a page out of the textbook or they might share a problem set or something like that.
So then, you know, the platforms have this interesting claim of saying, well, it may be your copyrighted material, but we accessed it through a public forum, which essentially means that you've got to go after that individual that posted it. And so I think this is going to be a very, very important, you know, legal decision in the future of AI.
And it's hard for me to imagine the government finding a way to protect the intellectual property holders. It does seem like the horse is out of the barn and it's going to be very hard to put it back in. I mean, what are your thoughts, Matt?
[00:13:56] Matt Tower: Yeah. So I sort of split this into two discrete buckets, right?
And so for the New York Times specifically, I think there's this interesting question of what are you paying for, and there's a writer I really like, Ben Thompson, who talks a little bit about this, and it's:
most of us, when we sign up for the New York Times, are thinking about their archive, right? But really we're subscribing as a prepayment for future content, right? And so, like, an article that the New York Times produces today, I'm probably not ever going to read again, right? So when I pay, I think I'm subscribing for their archive. But, like, the reality is, I'm paying them so that I have access to their content in the future, and so that they can produce content in the future.
So I think this IP question is like, the New York Times, in my opinion, should squeeze some value out of their archive from somebody, and, like, good on them for squeezing OpenAI. I don't think they'll get much, but, sure, they deserve something. I think folks like you and I are still going to subscribe, regardless of what OpenAI has, because we're anticipating their future content production, which OpenAI is still not capable of doing today, right?
They don't have a reporting function in the way that the New York Times does. So that's like one bucket for me. I think on the publisher side, in the education space, there's this even more interesting question of like, who owns calculus? Does Pearson own Calculus? Does Cengage own Calculus? Does McGraw Hill own Calculus?
Absolutely not. And so it's sort of like, when I'm paying for a calculus textbook, what am I paying for? I think they have much less of a right to their content archive than even the New York Times. And for them, it's going to be an interesting question of how do they transition to paying for the anticipation of future content, or for structures on top of the content, right?
And that's where we get into this courseware world that we've been migrating towards for 20 years, and its value on top of what is actually public domain IP, right? So I think the legal battle in the education publisher world looks a little different than in the sort of mass media world.
[00:16:06] Ben Kornell: Yeah, I hear you.
And yet, you know, what is calculus, right? I mean, it is a domain, but if you're a publisher, you're giving your view of that, and, by the way, there are also explicitly public domain properties in each of these disciplines that are ready and available for you. So why do third-party AI firms need to use
copyrighted materials? You know, I also think there's an issue of misattribution in the public space, often around learning content. And there's this sense that it's all open source or under Creative Commons or something like that. And that's not always the case. Related, the other kind of AI story that I am, like, paying attention to, somewhat adjacent to that, is Meta, as part of a trend, reducing fact-checking.
And while this is not explicitly an AI story, it is an AI story, because one, there's a political shift in the winds of, like, who's governing tech and who gets to say what and how things are edited. There's also, like, an acceleration of AI content moderation tools that could be bringing in, like, a golden age of content moderation, where it is really, really efficient to do content moderation.
And yet this kind of move by Meta, I think, signals a broader tech move towards essentially saying, we are not going to regulate speech, we are not going to regulate content whatsoever, precisely at the time of AI-enabled misinformation. And, you know, when we're talking misinformation, these could be videos,
these could be sounds. Basically they're throwing up their hands at the moment when we've, like, unleashed the ability to fake. And so I think it's really concerning, especially if you bring the lens of kids and families, and particularly for those in education spaces, where if you're an education platform, or maybe you're even a gaming platform or social platform that has kid-specific spaces, this is going to be really, really hard now, because you're not getting any help from big tech, and it's hard enough for adults to figure out what's real and what's not. And so, on one hand, we're taking real content, turning it into AI, and then the AI is turning it into fake content.
And, you know, that's really the wash that's gonna come over from this next wave of unfettered AI misinformation. Where do you think that's playing out?
[00:18:41] Matt Tower: Yeah, like, I've seen a couple of perspectives. You know, one is that Mark Zuckerberg's sort of playing the politics, like, you know, under Biden, he got harder-core on moderation; under Trump, he's going to get less, you know, on moderation, maybe. Like, I've never met the guy, you know, I don't know.
I think there's also an element of exactly what you're saying of, you know, as long as they are active moderators, they can't win. It's a losing situation. So why not just, like, absolve yourself of the whole thing? I think, like, to try and find some silver lining in it, I would think about their interest in the Fediverse, which is sort of a funny term to use, but, you know, essentially their Threads product is built on top of this Fediverse concept, where it's protocol-level access to the social media information, and different moderators can build different implementations of it.
So you can have an implementation of Threads, eventually, you can't do this today, but the thesis is where, you know, you and I can decide what level of moderation we want to provide, and then we can actually sell that service to other folks. We can have the Ben and Matt theory of, you know, content. That, you know, we could build one for kids, we could build one for adults, and, you know, that cuts both ways.
You could say, like, we only show content that skews to our political beliefs, and that's a real risk, but, you know, at the very least, it gives us more control, right? So, you know, my hope is we get some level of that. We've actually seen a couple examples of that for children's web portals specifically, funded just in the past year.
Common Sense might have even been involved in one of those deals. I don't remember specifically, but one of the apps is Heyo. There's one other, too, whose name is escaping me, I'll look it up, that is trying to build this for kids, right? And so I think we'll see more examples of that, of, like, putting a perspective on content moderation, now that we backed
[00:20:42] Ben Kornell: one called Cove.
Okay.
[00:20:44] Matt Tower: Yeah.
[00:20:44] Ben Kornell: And yeah, I think the reality that I saw through the lens of Cove was, it's a nice-to-have, not a need-to-have, today, you know. Until content moderation is regulated, or requires some sort of regulation, it is hard to get platforms to agree to do anything other than the bare minimum. And, you know, even things like hate speech and so on are now kind of up for grabs.
So I think it's a really challenging time to be a teen already, and on top of this, the other big news right now, and, you know, by the time you listen to this, maybe you'll know, TikTok is also on the verge of, like, shutting everything down. So literally you're like, okay, what are we doing with content? How are we stopping misinformation?
And, like, how's the government going to play a role in, like, addictive short-form media? It's a jump ball right now across all of those spaces. And, you know, many of my friends are like, well, how is this related to ed tech? You don't realize how many kids are on TikTok all the time, and many educators are on TikTok.
There's a lot of learning experiences that are live on TikTok. And so I think, you know, we talked about this in some of our year predictions. You know, we are in an era of accelerated change. And it feels like the first two weeks of 2025 have definitely previewed the accelerated change to come.
[00:22:16] Matt Tower: Yeah, I just want to double down on your point of like, it must be so hard to be a teen today.
I have so much sympathy for that. Like, you know, the social status you have today could be gone tomorrow because the platform that you're on just disappears. You know, fortunately I grew up in a time that was, like, internet-enabled, but you could still make mistakes. That to me is the hardest part: it's so hard to make mistakes in today's world. And, like, I needed that. I didn't do anything particularly bad, but I needed to make mistakes, and it's just so hard to do that as a young person, right?
[00:22:54] Ben Kornell: So with that, let's dive really deep into the ed tech side of the pool: PowerSchool. So PowerSchool has been in the headlines quite a bit over the last year because of their acquisition by Bain, but just this week they shared a data breach, which likely happened in late November or early December, that exposed student and staff data in a massive, massive way.
Tell us a little bit about what's going on, Matt. And what's your take on the situation?
[00:23:24] Matt Tower: Yeah. So this was like the hacker got in through a, from what I've read, relatively simplistic backdoor that sort of says like, why exactly was that a
[00:23:38] Ben Kornell: customer support representative at PowerSchool? So they had, like, this
godlike status, which, by the way, what are you doing giving godlike status to a single customer support account? My God.
[00:23:52] Matt Tower: Yeah. And you know, I think it speaks to sort of the meme of there's like the tower of blocks and the one like keystone block at the very bottom of the tower is a Microsoft Excel logo.
And it's like all the companies in the world are held up by Microsoft Excel. And if you pull that out, like, everything goes to heck. You can say hell on this podcast, I'm just saying. But, you know, I think it's sort of funny. We have this sort of vision of every company in the world running, you know, perfectly smoothly.
The ones we don't work for, you know, everything is perfect, but the ones we do, we're like, oh, that's so messed up. And I think this is an example of, like, I'm sure they set up the customer support backend in, like, the 1990s, when PowerSchool was, you know, probably not even named PowerSchool at the time, right?
And this is, you know, you cobble together all these systems and you hack it together until something breaks, and then you reassess. I'm sure they'll redesign their entire backend now. So the short version is, it's really bad. It's really ugly. It had nothing to do with Bain. You know, I saw a couple of comments of, like, oh, this is a result of private equity.
And it's like, well, no, because Bain wouldn't have even gotten in the door until the acquisition closed, which was in late fall. So,
[00:25:11] Ben Kornell: you know, to be fair, it was owned by private equity prior to Bain as well.
[00:25:16] Matt Tower: Well, it was publicly traded. Thoma Bravo owned 80 percent of it. But, like, I would venture this vulnerability has existed for years and years and years, which doesn't make it okay.
[00:25:26] Ben Kornell: But this is the state of ed tech infrastructure: one of our largest companies doesn't have the money to, you know, pay down the tech debt that's required to modernize systems. And, you know, I will caveat that neither Matt nor I has any insider information on what the actual breach was. And they're being very careful about not sharing that, because you don't want to create copycats.
But, you know, if you basically are a privacy advocate, all of our schools are, you know, essentially building their data stacks with these partners that are using, like, late-nineties, early-2000s technology to protect said data. And meanwhile, the cybercriminal market has, you know, vastly accelerated, really
[00:26:17] Matt Tower: sophisticated,
[00:26:18] Ben Kornell: by the way, did you read about how they paid a ransom to get it deleted?
[00:26:23] Matt Tower: So I had not read about it, uh, about PowerSchool, but my understanding, which I think it's sort of a faux pas to put on the internet, at least in words, is that you're actually supposed to now. It's because the hackers treat it like a business, so it's less of a political challenge, the way we have sort of conceptualized hacking for the past few decades.
Now it's just a business, and they're like, cool, you know, you paid us, you know, now we're gonna go hack somebody else. So, again, like, I think you're, like, not supposed to, like, you know, the FBI will never put that on the internet, but my understanding is that's actually sort of the party line now: you're just supposed to pay it and move on.
[00:27:05] Ben Kornell: Yeah. So this is reporting from Cybersecurity Dive, and I think it appeared on Axios, which is that supposedly they paid a certain amount, not disclosed, but likely a very large amount. Then they had a video from the hackers showing the data files being deleted, which is, like, so mind-blowing to me.
Can you
[00:27:29] Matt Tower: imagine being on that zoom?
[00:27:33] Ben Kornell: And then they delete it. But, like, did they delete it? Did they not delete it? Like, really hard to know. And, like, why would they, if it's a criminal enterprise? So, you know, I think there's an element here, Matt, that is actually going to continue the exodus from public schools.
There are a wide variety of reasons why people are homeschooling or going to private schools or doing these other things, but there's a decay of trust in public institutions, and schools are definitely in that zone. And when you basically say, I don't want my child to be on Facebook or on the internet, and I don't want their picture to be out there, and then my school backdoors all of their personal information, down to Social Security numbers, like, that's a betrayal of trust. You know, in our kind of Slack groups and our WhatsApp channels, we're talking about DPAs and how onerous and challenging and restrictive they are.
And now you kind of know why, because the school districts are trying their best. It's just, you know, from a school district infrastructure, from an ed tech infrastructure, there's not enough capital to actually stay ahead in the cyber security game. And from a DPA standpoint, the more you get this, like, you know, every single district having their own DPAs, the cost of maintaining said systems goes up.
So we're really in this, like, profound dilemma, and, um, you know, this kind of event really makes Europeans look at us and say, what the hell are you doing? Why wouldn't you have one unified data infrastructure system across your entire country that you can focus all your cybersecurity efforts on and create, like, that national clearinghouse or database?
And, you know, you have to think they've got a point. Even short of that, if you took Florida, Texas, and California, at that state level, they could afford to do so. Not that I trust them to build it correctly, but, you know, at that level, there should be a sense of unification for the protection of student data. So more to come on this one.
If I'm Bain, I'm feeling like, oh crap, now I'm really learning a lot about education.
[00:29:55] Matt Tower: I mean, Bain is not a stranger to the education space. They have at least one other active education company under management, Meteor Education, which actually deals more with school buildings than ed tech.
[00:30:07] Ben Kornell: But is that Bain Capital?
Is that Bain Impact Fund? Is that
[00:30:10] Matt Tower: Bain Ventures? Like, it's the leveraged buyout group. Okay. Specifically. And then, you know, those other groups also have, like, you know, they've been in this ecosystem across their various funds for a while. So they're not strangers. I think, like, it certainly doesn't help their underwriting.
I think this is super morbid, but, like, Illuminate had a huge hack two, three years ago, right after they got bought out, and, like, it didn't really make a difference to the financial side, which, to be clear, is part of the problem. Yeah, it's an ugly thing to say, but it is true. And, like, you're right, it is part of the problem.
And it's something that we need to design for in the future.
[00:30:47] Ben Kornell: Yeah. Well, topic number four, related: government data. Tell us a little bit about the National Clearinghouse. This is more for higher ed, but tell us a little bit about that.
[00:30:56] Matt Tower: Yeah. I mean, speaking of, like, designing national infrastructure for, uh, student data, to give you the headline before we get into the impact:
So the National Student Clearinghouse collects information on higher ed enrollments across the U.S. from every university in the U.S. every year. And every fall, they give us a preliminary report of enrollments. And especially the past couple years, the focus has been on first-year enrollments, freshman enrollments, because we're hitting the demographic cliff and COVID seemed to affect particularly first-year, first-time enrollments, etc.,
etc. And this fall, they predicted a 5 percent decline in first year enrollments. And the whole industry sort of rallied around this and said, man, this must be the impact of COVID, the impact of devaluing of college, and all these themes that we talk about consistently seem to be playing out. And we thought, we were like, all right, cool.
We have, like, a national data set that confirms these trends that we've been writing about, honestly, for multiple years, right? And, you know, the Chronicle ran a story, Inside Higher Ed ran a story. You know, it was a big deal. And I want to call out Gates Bryant at Tyton specifically, who wrote to Phil Hill to say, hey, just so we're clear, this is a preliminary report.
Let's be careful about how much we read into this. And I think Gates deserves credit for being sort of a canary in the coal mine. I'll remember this forever of him saying, you know, they did caveat it. This often changes. We don't know. Don't overdo it here. And it came out this week that they, they had sort of vastly missed.
And in fact, first-year enrollments had grown year over year, which is a pretty dramatic about-face. You know, if enrollments had dropped less than expected, that would have been one thing, if we went from like 5 percent to 3 percent, but the fact that they actually grew is a really important sort of counter-narrative to, like, everything we've been talking about in higher ed, which has been doom and gloom for 18 months. It's a massive about-face.
Yes, the National Student Clearinghouse caveated it and said, like, be careful about reading too much into this data. But also they admitted that they made a major error this week. So, you know, I think both things are true. And it speaks to how even things that we think are super quantified, you know, have
human error involved.
[00:33:30] Ben Kornell: Yeah. So how are we in the higher education space going to trust that the data that's coming from the source in the future is actionable and valid? And, you know, we had the FAFSA debacle, we've had, like, the financial downturn of many of these colleges and universities, and then the data we're relying on to make
strategic decisions is fallible. This really kind of creates this perfect storm of confusion at a very, very critical time for higher ed and higher ed decision makers, but also policy makers. So, you know, first, like, what would be your remedy for the actual data systems? And then second, from an outcome standpoint:
Given this report, how do you think school leaders or system leaders, you know, university leaders should be thinking about the go forward practically?
[00:34:31] Matt Tower: Yeah. I mean, I will say like between FAFSA and this, the Department of Education could not have picked a worse time to have two terrible data issues happen, like, you know, right as the Department of Government Efficiency gets stood up.
And like, I don't want to get into that, but like, The timing is bad. Like, I think we can all agree on that.
[00:34:50] Ben Kornell: I mean, would you say that this department of education has been one of the worst departments of education ever? Gosh, that would be
[00:34:58] Matt Tower: a big statement. I'm just going to punt. I don't, you know, I've been following this market for a while, but I don't feel like I have a good enough perspective on that.
That'd be a fun poll, though. I would love to run that poll. Yeah, maybe we put that out. Maybe we should team up on that.
[00:35:12] Ben Kornell: There's much to love about what the Department of Education has done, but in terms of the gaffes, they've probably had some of the biggest gaffes, like, in memory. And I would also include the kind of letter around higher ed third-party OPMs, where basically they crushed the OPM market and then walked it all back.
And that really, you know, had pretty devastating results for a bunch, including 2U.
[00:35:39] Matt Tower: Yeah, I think all three, the DCL, the FAFSA debacle, and this data problem, make the department look really bad. And, like, you know, I'm not somebody who thinks, like, it either should or will get abolished anytime soon. I think it's too complicated to do that.
And I don't think that the political win, I don't know what it accomplishes politically other than, like, being a talking point. I certainly think it calls for some careful thought about what the function of the department is. You know, Michael Crow had a good quote about it, saying, like, look, this department is important, but, like, it's been cobbled together over 40 years.
It's like a system of bank feeds that has been sort of thrown together. Like, maybe we should rethink this.
[00:36:26] Ben Kornell: Well, and the idea of combining it with the Department of Labor, I'm actually, like, it's not that the Department of Labor is running like a smooth machine either. But if you imagine that our entire talent pipeline from cradle to grave
could actually be tied together, and, you know, there's a way in which that would actually be more reflective of that learning experience of human beings in modern America. It's not like, uh, you're done with your education now, now go work. It's a lifelong thing. So, you know, there's ideas here that are worth exploring and building.
And every time the department has, like, you know, snafus like this, it lends credence to the, let's, like, rethink it,
[00:37:10] Matt Tower: You know, I think people forget, like, the stuff isn't set in stone. Like, it's okay to change and evolve. The Department of Education has only been around since 1979; I think it was Jimmy Carter who,
[00:37:21] Ben Kornell: yeah, I think,
[00:37:22] Matt Tower: yeah, like it's okay for things to change and evolve and adapt to new environments,
[00:37:27] Ben Kornell: By the way, this is the EdTech Insiders podcast.
[00:37:32] Matt Tower: So I just, like, you know, I think this stuff gets super divisive, and it's like, I'm not saying it needs to go away, but, like, yeah, we should be thinking about cradle-to-grave education. I think that's an awesome concept that, like, we should spend time on rather than just fighting, like, you know, will it, won't it?
Right. So I just want to hit on quickly, you brought up like, how should higher ed administrators be thinking about this? And I think this is a really important point. And I think it gets at a concept that I think about a lot, which is, I think the most important thing in higher ed today is having some form of legitimate differentiation.
And, like, differentiation is not your, like, Google SEO strategy, okay, or, like, your arbitrage on some, like, internet platform to get email addresses of high schoolers, or adult learners for that matter. I think differentiation is having a very specific, you know, it could be a, like, vocational focus, it could be a religious focus,
it could be, you know, there are some schools who are trying a sports focus, which has its own sort of complexities involved. But, you know, I think, like, rather than just playing on a rising tide, which is what higher ed has been for the past 30, 40 years, it's really important to think about, like, what is the point of your school?
And, like, do you have, you know, differentiation relative to the school down the road? So that's what I'm most interested in for every school that's not one of the, you know, five biggest online mega players who are going to continue aggregating students.
[00:39:04] Ben Kornell: Meanwhile, on the policy front, you know, the final days of the Biden administration saw loan forgiveness for 150,000 new beneficiaries.
Before you get all worked up about the numbers, which I think total somewhere around 180-some-odd billion in loan forgiveness overall: 80,000 of the borrowers were cheated or defrauded by their schools, 60,000 borrowers have total and permanent disabilities, and 6,000 are public service workers. You know, the details really matter here, and so does the crushing weight of student debt.
You know, there's an economic argument here that by freeing up this debt, you're actually going to create more taxable income going forward in the future and more prosperity, but this pretty much, I think, represents the end of the, you know, loan forgiveness era; I don't know, it's almost like a pardoning. So yeah, it's 183 billion total,
and this round is 4.2 billion in student loan forgiveness. Kind of crazy to think of 150,000 people having 4.2 billion. And then also, you know, we've seen that Trump has signaled that's not going to continue going forward. On the business side, there was also a really big round for a company called Element451.
I wasn't familiar with them before, but a 175 million dollar round, my goodness, from PSG. And basically Element451 is a student engagement platform. Think of it as an LMS, a learning management system, but one that is essentially AI-first, AI-managed. It also is a CRM, and the CRM element is really around recruiting and enrollment and getting students in.
So it's like, how do I recruit them? How do I retain them? It doesn't seem to me, as I dove a little bit deeper, that it's necessarily competitive with Canvas. Like, you could have both at the same time. But there is this growing thinking in the investor community that the LMSes that have kind of defined K-12 and higher ed may be under threat from AI-native, kind of do-everything LMS platforms that, you know, allow for creation of content, adaptation of content, more dynamic assessment, connecting enrollment and retention to learning,
and so on. And so, you know, what do you make of this huge round? And then, this company is based in Raleigh, North Carolina, so, I mean, like, out of nowhere from my perspective. Element451, like, where do you think this space is headed?
[00:41:44] Matt Tower: I'll give you one more detail that'll make you turn your head. The company is only like four years old.
I want to triple-check that, maybe six years old, but still, you know, it has grown really, really fast in a small amount of time. Just a couple of different thoughts here. One, on the round: PSG is a growth capital buyout firm, so you shouldn't think about it as a traditional VC round where 100 percent of the capital goes to the company.
There's probably some debt and there's probably some secondaries; the founders probably took some money out, and/or other early investors did. So it's not 175 million straight to the balance sheet. It's some amount to the balance sheet, not given, like, they didn't disclose, some amount probably to, you know, exit founders and early investors, at least to a certain extent,
and some amount of debt. You know, it's a more complicated financial transaction than, you know, what we think of in traditional VC rounds. And also, I think this is really reflective of what is valued in higher ed today. I think they spend most of their time as a business focused on getting students in the door.
So I think they are probably working backwards towards the classroom and keeping students around through retention strategies, but really, where they are, and, you know, I think this is an example of where the money is, is in enrollment. So I see your thesis of, you know, I think this is potentially a good time for a reset and having more AI-native companies start to do things that traditionally you would have thought of from the LMS.
Owl just funded a company called JotIt that is AI-native and actually much more classroom-focused, which I'm interested in. But this one is right where the money is, which is enrollments.
[00:43:28] Ben Kornell: Well, if that does happen, this is a good point to wrap up: you're going to hear about it on Edtech Insiders. I do want to thank all of our listeners for all of your support heading into 2025.
And thank you, Matt, for joining us today for this awesome water-cooler conversation about what's going on in edtech. Now on to our special interviews. Thanks so much for joining, and have a great week. It was a treat. Thanks, Ben. Emily Lai is the Vice President of Learning Impact Measurement at Pearson. She earned her PhD in Educational Measurement and Statistics from the University of Iowa and is a key member of Pearson's efficacy and learning team, where she leads a group of researchers dedicated to studying the use of Pearson products in teaching and learning.
The team's work focuses on understanding the impact these tools have on learning engagement, study habits, academic achievement, and student progression. Thanks for being here, Emily Lai. Welcome to EdTech Insiders.
[00:44:22] Emily Lai: Thank you, Alex. It's great to be here.
[00:44:24] Ben Kornell: Yeah. Nice to have you here. So one of the things you found through your data is that students who are using AI study tools are four times as likely to adopt active study habits, like self testing, which is so important, or note taking than students who don't.
What do you think drives this shift? And how do you think this is going to impact the learning outcomes for those students?
[00:44:45] Emily Lai: Yeah, thanks. So we were really excited to be able to talk about this research. And so I think, you know, it all starts with our approach at Pearson, where we try to be really thoughtful about how learning science informs the design of our products.
And so that was very much the case with our AI study tools in our higher education courseware. And there are a couple of learning science principles in particular that we really kind of designed these features around. So one of those is sort of desirable difficulty and scaffolding. And really, this is just the idea that, like, a little bit of challenge is actually good for learning.
So, you can kind of apply a Goldilocks principle to learning. There's kind of a sweet spot for task difficulty where it's not too easy, but it's also not too hard. And this helps to kind of maximize a person's motivation and effort. So, when tasks are kind of on the challenging side, then you kind of need to provide a little bit of scaffolding, which is just support that you give to the student, and that can be delivered by the instructor, or in this case, it can be delivered by digital tools and resources.
And so in practical terms for the AI study tools, what this meant is that we didn't want our feature to just be giving students the answers. We actually wanted the tools to provide scaffolding to students, which might just be kind of step-by-step instructions on how to solve a problem, or, like, breaking complex ideas down into more manageable, bite-sized pieces.
So that was one principle that we really kind of leaned heavily into. Another was feedback for learning. So, you know, research tells us that effective feedback is one of the most important factors in learning. And in order to be effective, it needs to be timely, it needs to be actionable, meaning it kind of has to spell out specific ways for a learner to improve, and it needs to be understandable.
And so we took this principle to imply that our AI features needed to provide feedback that's immediate and elaborated, meaning it provides kind of a clear and detailed explanation and uses a supportive tone. And we wanted to provide that kind of feedback not only when students have made an error, but also even when their thinking is correct and we're trying to reinforce that good reasoning.
And then the third principle that we really leaned into is this principle of active learning and retrieval practice. So, active learning is really any approach that gets students out of that kind of passive mode of simply receiving information and gets them to cognitively engage with the material by doing things like asking or answering questions, trying to connect what they're learning to something they already know, or explaining a concept to themselves or to someone else.
And so, when we applied this to the AI study tools, we looked for multiple ways for students to be able to interact with and manipulate and even interrogate their learning content. So we needed to provide ways for students to be able to ask questions, to practice retrieving that information from memory, and to be prompted to connect what they already know about a topic to what they're trying to learn.
And so those were kind of the learning science principles that we really leaned into when we were designing the tools. And then, because we intentionally designed the tools to kind of encourage that deeper cognitive engagement with the course materials, it was really encouraging, as you say, when we saw that there seemed to be kind of ripple effects on students'
study habits more generally in the e-text. So, in the research that you referenced, we saw that students who use the AI study tools tended to go into their e-text more frequently over the semester, and they were also spending more time during those sessions, and they were using more of those active study behaviors, as you mentioned, like note-taking and self-testing.
And so in terms of, like, why is that a good thing, why is that encouraging: we would expect better learning outcomes for those students, because those types of active learning behaviors, like spacing out your study sessions instead of cramming, taking notes while reading, and then engaging in retrieval practice, we know from learning science research that they support better retention of content.
So we would expect them to do better.
[00:49:21] Ben Kornell: You know, one of the things that people have been worried about with AI tutors or with AI support tools is that it would start to outsource some of the critical thinking that students are doing to the AI and have the students be more passive than ever. And what is really exciting about this work is not only are you seeing, no, no, they're not more passive.
They're actually more active in very specific ways. But by weaving really decades' worth of learning science principles into your tool, you're actively encouraging that type of activity, you know, active reading, active learning, feedback, that immediate feedback. These are all principles that we've known about, you know, for a long time, but it's very hard to actually embed them into the experience for students in a really smooth way.
So it's really exciting. So speaking of the sort of queries, the idea of retrieval practice, of elaboration, of students actually asking questions, or, as they call it, elaborative interrogation, right? You found that nearly 40 percent of the queries, the questions students were asking in these Pearson Plus e-textbooks, involved
higher-order skills like analyzing or evaluating. That's exciting for instructional designers. What does that tell us about how students may be leveraging AI for deeper learning?
[00:50:33] Emily Lai: Yeah, this is one of the findings that we were the most excited about and really interested to learn more.
So we first started by trying to break down usage of the tool to see, like, what specific features students were using. And when we did that, we could see that the vast majority of the interactions with the AI study tool are with the explain feature, which is the feature that allows students to kind of ask a question and get a detailed explanation.
And so in spring 2024, for example, between, like, 70 and 80 percent of all of the interactions were with the explain feature, which indicates that students are using it to ask questions or to request explanations and clarifications. And because they spent so much time with that feature, we kind of wanted to understand a little bit more about the kinds of questions they were asking, because there's research that suggests that the types of questions students ask are kind of reflective of the way that they're thinking about a subject.
And so to do this, we looked at the first input of every conversation with the explain feature from students who had used our Campbell Biology textbook during the spring '24 semester. This is around 50,000 inputs that we looked at, and we analyzed those inputs in terms of several different linguistic measures.
So we looked at, you know, what verbs they were using, we classified their syntactic and lexical complexity, and then we also clustered the inputs. So we coded their questions as falling along two dimensions. One was the level of cognitive complexity, and the second dimension was the type of knowledge implied by the question.
So we did this very thorough analysis of the questions. First, I think it's worth mentioning that we have a way right now of sort of detecting when students simply copy and paste Pearson homework questions into the bot. It's not 100 percent perfect yet, but it's kind of improving all the time.
And when we look at how often this seems to be happening, we found that never more than 6 percent of all inputs were just examples of students copying and pasting their Pearson homework questions into the bot. And it was generated by a tiny minority of students. So I think that is noteworthy for people who follow, you know, this kind of external narrative that AI is, you know, dumbing students down or it's not
supporting their ability to think critically. And we went further: when we looked at the questions that they were asking, we found that the vast majority of the questions are, you know, at a base level, on topic, relevant to the course material, and kind of at a lexical complexity that you would expect from a college-level course, which is good. That's not surprising, but, you know, good to see.
For the most part, students questions, they're asking questions in order to kind of know and understand scientific facts and concepts, which is very consistent with this intro level biology textbook. But we were really excited to see that around 38 percent of the questions seemed to indicate a kind of higher level of cognitive engagement.
So those questions were kind of getting more into the application, analysis, evaluation levels on Bloom's Taxonomy. And this is really exciting because it could potentially signal that students are using the tools to engage in some of that higher order questioning. And kind of just engaging more deeply with their course materials.
[00:54:14] Ben Kornell: And one of the things that I find really amazing about retrieval practice and this sort of questioning and active reading as a procedure, the idea of, you know, stopping while you're reading, going back, trying to digest the information, bringing it back up from memory, you know, sort of piecing it through your working memory and discussing it and then asking really good questions of it,
is that it really, you know, doesn't work that well in the context of just a straight read-through, right? And I think this has always been a mismatch in education. It's like you give somebody a chapter to read, and then, if you're following learning science principles, you say, you know, every four paragraphs, stop, close the book, see if you can remember what you actually just read, ask questions of it, make sense of it.
But very few people read like that. It's such a strange way to read. It's not such a strange way to read if you have an AI companion working inside your digital textbook that can actively stop you, that can, you know, ask you questions, that can receive your questions. So I'd love to hear, you know, how you're thinking about that kind of give and take.
It's basically a way to make reading interactive.
[00:55:13] Emily Lai: Yeah, I think that's a really good point, and we know from speaking to students and instructors that students are asking us for, you know, good study skills. They're asking us to help them understand how they can be, you know, better, closer readers of their text. And we know that, you know, several things kind of get in the way of students having those good study skills and being able to apply those active techniques.
A lot of them have just never been taught how to study effectively. And so they tend to use kind of ineffective practices like reading and re reading the same parts of the textbook.
[00:55:47] Ben Kornell: Highlighting, they say, doesn't really work.
[00:55:50] Emily Lai: The false sense of security, yeah. It gives them the sense that they know the material more deeply than they actually do. And we also know that students are really pressed for time. You know, they have busy lives, many of them are working at the same time as they're in school, and they're really looking to make the most of their limited study time. And so I think the good news is that studying effectively doesn't necessarily need to take more time.
It's more about the quality of the practices, what you're doing with the time that you have. So, you know, even getting students to break up some of their marathon study sessions into smaller chunks and space them out over time instead of cramming right before an exam, or, instead of rereading portions of the textbook they've already read, like you suggested, getting students to spend that time quizzing themselves.
[00:56:47] Ben Kornell: Exactly.
[00:56:47] Emily Lai: It calls for a little bit more effort, but like in the longer term it's going to be better for learning. So, I feel like we're, you know, what we're talking about here is kind of small changes, but they, they can make a big difference.
[00:56:59] Ben Kornell: Huge difference. As you mentioned, these are learning science principles that have been out there for a while, but it's just hard to integrate them into software, and it's very hard to change behavior.
And in many cases, as you said, students just haven't even heard about a lot of these techniques. Interleaving, right? You can do that naturally with this. This is all sorts of great stuff. So let's talk about faculty. You know, you're working in the higher ed space. Pearson Plus e-textbooks are, you know, in the higher education space for the most part.
Pearson sort of split off its K-12 business a while ago; that has a very different kind of model. When you're working with faculty, faculty in higher ed are another group that wants, I think, at heart, to improve their teaching, improve the efficacy of the learning, you know, make their students get as much as possible out of a class.
But sometimes they themselves have never taken a class in teaching, and sometimes they don't have the time or the tools to bring active learning into their own procedures. Do you feel like the technology can support faculty in up-leveling their own teaching?
[00:58:01] Emily Lai: Yeah. So, I mean, I think generative AI has enormous potential to reduce the workload for teachers, especially around more administrative tasks like preparing lessons or lectures, creating content, creating and scoring assessments, and so forth. Reducing the administrative burden on teachers frees up their time to focus their attention on the things that humans do best, like building relationships with students and understanding the individual contexts of their students, their motivations and emotional needs. All of these things are really crucial for establishing a positive learning environment for students.
So generative AI is like, we like to think of it as a tool that can help scale teaching excellence to a greater number of students. It's not a replacement and it's also not a silver bullet. It's not going to solve all of the problems that we see in formal educational systems. In the end, like whether it's beneficial or harmful for learning has a lot to do with exactly how people are using it.
But so we've seen, you know, a majority of higher education faculty that we've talked to definitely see the positive potential for Gen AI to enhance their instruction. One, in offloading some of the more administrative tasks I mentioned. Also in, you know, stretching them a bit into more kind of innovative areas.
So, you know, the disruption that generative AI is causing is inviting instructors to rethink how they design assignments and assessments for their course. As they look for ways to assess their students that are less susceptible to cheating through ChatGPT, they're moving away from some of the more traditional approaches towards richer, more authentic, more performance-based assessments and assignments, things like projects, presentations, case studies.
And so generative AI can kind of also help with design and grading of these more innovative approaches. So it's providing a lot of opportunities, I think, for instructors to kind of rethink traditional ways of teaching.
[01:00:19] Ben Kornell: As I hear you answering that, two things jump to mind. One is that the professorial class, one of the things that they do is really curate fantastic curricula, fantastic readings, and I think there's something very complementary about the textbooks being able to then increase some of the engagement and activity for students.
That's really exciting. And then one other piece, in terms of the different types of assessment: you mentioned that the questions students are asking the textbooks are, basically, asking it to evaluate and analyze. Do you foresee a future where part of the analysis of whether students are learning, or part of their assessment, is actually their conversations with the textbooks, and the engagement and interaction becomes part of the entire learning process?
[01:01:03] Emily Lai: Definitely, I think generative AI is making that type of ubiquitous assessment more possible, right? And I hear a lot of people talking about assessment moving in that direction in the future. People talk about it in terms of ongoing assessment or continuous assessment behind the scenes.
So I definitely think that that is something we should be looking out for in the future. There's a lot of potential there.
[01:01:34] Ben Kornell: Yeah, and it can work both on sort of the integrity side. You mentioned, hey, we know the 6 percent of students who are copying and pasting. Well, we can identify that kind of behavior. But on the flip side, we can also identify students who are deeply engaged in raising the level of conversation and engaging in deeper order conversations.
It almost creates an oral exam type environment where you're sort of talking back and forth to the material itself. And there's something very interesting about that as a piece of data, or a learning object, or an assessment, as well as all the ones you mentioned, presentations, persuasive essays, you know, things that take the traditional higher education assessment and sort of take it to the next level of what you're actually asking students to do.
I'm very bullish, as I think you all are as well, about how AI can actually improve teaching and learning, and despite some of the fears, we should take a beat and see if we can, as an EdTech community, embed some of these learning principles into our software. So what do you think is next for Pearson when it comes to these e-textbooks?
You're thinking about active learning. You're thinking about feedback. You're thinking about retrieval practice. Are there any other learning science principles that you're really excited to sort of start weaving into the software and hardware itself?
[01:02:47] Emily Lai: That's a great question. I know that we are looking at, and I think we've recently maybe released even a teaching assistant sort of feature where the AI can kind of help instructors walk through the process of setting up, um, assignments and sort of having some guidance in that process.
In terms of learners, I think continuing to refine around the principles that we've already leaned into. You know, feedback can be made even more granular and helpful, and there's the ability to make recommendations alongside feedback as well. So not just evaluating the quality of a student's response, but also pointing them in the direction of additional learning materials that might be helpful for them, those types of things.
So I think potentially just refining and really honing how we're embodying those learning science principles that I mentioned, because we've just started. We just released these tools a year ago, so I feel like we can always improve.
[01:03:58] Ben Kornell: Yeah, well, seeing results like, you know, students being four times more likely to adopt active study habits than when using traditional textbooks is really exciting, and very much going in the right direction.
I have one more very rapid question for you, and then I will let you go. But it's something that is on a lot of people's minds this week when it comes to AI: we saw OpenAI put out its Sora video tool and Google put out its new video model. The tools keep getting better, and there's an increasing amount of video generation tooling available, and that can mean video of anything.
It can be talking head video that you can actually engage with. It can be video explaining scientific principles or showing medical procedures or anything like that. I'm curious, what would be sort of your dream for incorporating video?
[01:04:44] Emily Lai: It's not one I've spent a great deal of time thinking about, so these ideas may be just kind of shooting from the hip here. I think, you know, there are a lot of disciplines in higher education where having visual aids, and especially dynamic visualizations, is really important. So I'm thinking of disciplines like anatomy and physiology, where you could say, hey, let's have a 3D tour of the valves of the heart or something like that.
I think it would be really cool if those types of things could be generated on the fly in response to students' real-time questions and dialogue. That would be incredible.
[01:05:34] Ben Kornell: I think it's possible. I mean, I think it's, it's a natural extension of the work that you're doing, right? You read a paragraph about that, you read three pages about the valves of the heart and then stop and you say, okay, wait, I really want to understand this better.
How does the valve open and close to make sure the blood doesn't flow backward? And it says, let me show you, and pops up a video inside the heart, with the valve and a voiceover, and shows you exactly the thing you're asking about. I love it. You know, I would have called that science fiction three to five years ago. Now it feels like it's just around the corner.
I'm really excited to see what Pearson does with all of the amazing AI technologies that we're already starting to see. Emily Lai, thank you so much. Emily Lai is the vice president of learning impact measurement at Pearson, doing really, really amazing work in bringing learning science principles into learners' and faculty's actual experiences in the classroom and in higher education.
Thanks for being here with us on EdTech Insiders.
[01:06:23] Emily Lai: Thank you, Alex. It's been a pleasure.
[01:06:25] Ben Kornell: For our deep dive this week on Week in EdTech, we're talking to Dominik Kovacs. He is the CEO and founder of Colossyan, which is doing AI-driven video solutions for a variety of different industries, including education and L&D.
Welcome to the podcast.
[01:06:42] Dominik Mate Kovacs: Thank you so much, Alex, for inviting me. It's such a pleasure to be here. I'm a huge fan and advocate of basically sharing our learnings with the industry and the space. And yeah, looking forward to discussing some of the topics we talked about before.
[01:06:55] Ben Kornell: Yep. So let's jump right in. Colossyan is doing AI-driven video solutions and basically allows people to create videos with either a real person being simulated or a completely fictional, AI-generated person that can teach.
What does that look like? What brought you to that idea? You have 22 million in Series A funding. That's really exciting. What brought you to this idea of AI video and how did you get into it?
[01:07:20] Dominik Mate Kovacs: Basically in 2018, I was already working with the algorithms that we today call generative AI. I have a background in computer science and artificial intelligence, and it was so obvious that this was the immediate future of content creation.
No one really believed in this back then because it was so early, but education changed my life, and I had the opportunity to study in Hong Kong and Copenhagen as an engineer. And, you know, I was doing all of this market validation around how the technology could be utilized, and there was a really strong pull from the learning and development teams.
In the corporate sector, and also from universities. So by 2020, that led us to found Colossyan with a mission to make knowledge transfer easy, and since we incorporated, the company has grown to close to a hundred people as of today, located across two continents and serving more than 2,000 companies.
[01:08:13] Ben Kornell: That's really quite incredible.
And you're serving some very big companies: Ericsson, Novartis, you know, UPS, Paramount. So, I mean, you said you immediately recognized that this was going to change the future of content creation, with your engineering background and understanding of how this stuff worked. Tell us what that realization was like, because I think people are only now getting the realization that the cost of creating content that used to be expensive, like actual studio-shot video, has suddenly plummeted, but you recognized that right away.
Tell us why that's true. What can AI do to make incredible amounts of high quality content at a much lower cost?
[01:08:49] Dominik Mate Kovacs: I think, you know, I was working on, for example, lecture-style videos at university, and I hated it. It's just so much struggle, all this nuance, and you cannot easily edit it again. And I was a technical person, but all this software was still hard for me.
And, you know, when you work with these algorithms and you can create such results really easily, it was just an obvious alternative. And I think it's the same feedback that we get from our customers and from the market: we are solving a very hard problem, because the UX, the user experience, also has to be really easy.
We are not trying to create another complex video editor coupled with this very complex technology. It's just a very hard challenge to solve, but it's something we are doing, for sure.
[01:09:33] Ben Kornell: Yeah. I've done a lot of different things in education technology, but two of them were very video heavy. I was at Coursera and I was at Skillshare, both really video platforms.
And one of the comments we always got, which is absolutely true, is that watching video can be a very passive experience. It can be lecture style, as you're saying, and it can sometimes just go right by a learner. They might not actually pick much up from it. They might not be able to transfer it to their environment.
Video can be tricky. You have been working on changing that from passive consumption of video to much more active learning. What does that look like in your context?
[01:10:07] Dominik Mate Kovacs: So basically, we realized that video already gets better engagement rates and better end-to-end learning effects, primarily for the younger audiences, like the TikTok generation, who are entering the workforce.
And overall, we wanted to make it even higher quality, because everyone's talking about how AI can make things faster and easier, but we also want to ensure that we can focus on the quality. And what I mean by quality is the fact that we turn the video experience into a whole content experience, more like active content consumption.
This means that you're able to interact with the video. You can click on buttons, you can click on questions, and have a choose-your-own-path, choose-your-own-adventure type of experience. And we talked to several of our customers, like scientists, and they mentioned that it used to cost something like 100,000 euros to create a single piece of content because of the exponential branching effects.
So if you click on this button, then this thing should happen, et cetera, et cetera. And we realized that Colossyan is an asset creator, and we wanted to integrate these interactivity components to make it not just a passive but also an active content product at the end of the day.
[01:11:22] Ben Kornell: Yeah. And I mean, I think it's so interesting, sort of the relationship between, as you say, the cost of creation of these assets in the past really limited people. And some of the more innovative and thoughtful content creators were trying to do the kind of, you know, branching scenarios or moments when you can sort of stop and do something in relationship to the video, which actually generates a different video or has that effect.
But the cost of video was so high that that was just not even doable. Now the cost of video is so much lower that suddenly whole sets of different kinds of interactions with video start being possible. And one thing that starts to unlock is what people are often calling, you know, personalized learning or precision learning.
Like you can do video in response to individual specific interests or their specific skill sets. What does that look like? And what are you doing now? And where do you hope to go in terms of personalization of video?
[01:12:14] Dominik Mate Kovacs: I would say AI coaching is definitely the future, and the way our company Colossyan is tackling that is by moving towards more of a real-time video product line, which would mean that you're able to feed in your knowledge base as PowerPoints, PDFs, et cetera.
And you're able to have a conversation with a video where there is a person talking to you. It's, as one of our customers phrased it, a more 21st-century pedagogical experience, because you have the visual interface. It's not just a chatbot talking to you; it's close to a real-time lecture or a simulation-like experience.
So this is how we are trying to tackle that, to be honest, from our angle.
[01:12:55] Ben Kornell: Yeah, that's incredible. You know, you're mentioning coaching, AI coaching in the ed tech space, you know, different parts of the education ecosystem are thinking about different possible use cases for that kind of personalized, especially real time interaction.
AI tutoring is an area that's gotten a lot of attention in K-12 because it's been very hard to scale but is known to be very effective. Do you see a future in AI tutoring? And a secondary question, you know, I'm sure you do, but what does it look like? How do you see the relationship between a student's relationship with a human tutor and maybe their relationship with an AI simulacrum of that tutor or somebody else?
Like, do you see a hybrid sort of model coming where it's, you combine human expertise with grounded AI expertise?
[01:13:44] Dominik Mate Kovacs: The trick there is that, for example, all the coaching budget at enterprise companies has so far gone to upper management. The individual contributors, in the case of a large corporation that has a hundred thousand employees, the bottom 99 percent, the individual contributor layer,
never received any part of this budget. So they have no alternatives right now. They don't have an individual coaching experience. So this technology would allow the scalability of that, in multiple languages and everything. That's why most enterprises are really excited about it.
And that's, I think, what made me understand the impact of this, because there was no alternative before. We are talking about use cases where there simply wasn't an offering, or they didn't have the budget for such experiences. So, yeah.
[01:14:35] Ben Kornell: I mean, you're bringing up such an interesting point.
And I think this is part of the push. But, you know, I've talked to a number of different tutoring providers or different people on this podcast. And I think one of the things that people really wrestle with, as you said, you know, if you're an employee at a mid level or entry level employee at an organization, you have absolutely zero individual coaching.
It just does not happen. There's nobody around. You have zero. So having, you know, a virtual Marshall Goldsmith or some amazing virtualized career coach is a huge, huge jump forward. Sometimes people look at these situations from the other lens and they say, well, you know, is it as good as a human tutor or human coach?
You know, wouldn't you want everybody to have a coach? And I think there's an interesting wrestle here. I tend to agree with your scaling perspective. But I think especially people already in the coaching or tutoring industry are saying, well, hold on a second. Isn't there something being lost if you have access to a virtual school companion, a virtual tutor, a virtual coach, but not a human?
I'm curious how you address that kind of concern.
[01:15:36] Dominik Mate Kovacs: I mean, just to mention a few use cases where this is critical, think of sales enablement or manager training. You wouldn't really want to go to a person, even a real person, and practice sharing difficult feedback.
Think of me as a new manager, right? I want to have a real-life simulation, a training experience where there's a virtual coach, or a video that I could talk to, and it could react to my context, show emotions, and I could practice how I should communicate with my direct reports.
How to announce a promotion, how to share difficult feedback, right? And this is so much in demand, based on what we see, because it just helps a lot with ramping up, for example, leadership training.
[01:16:21] Ben Kornell: It's true. That's a really good point. I mean, I think there are advantages to different types of coaching and tutoring.
And, you know, I feel like I'm channeling an opinion that is not entirely my own, but I really appreciate where you're coming from with this. So there are a lot of educators who listen to this podcast as well: higher ed professors, K-12 teachers, people who do corporate training and L&D and tutoring.
And I think one question I have for you is: if you're an individual educator right now, you now have the ability to replicate yourself, potentially, through something like Colossyan, and to create huge amounts of content, put it out in the world, and even make it active and interactive.
That's an incredible opportunity. What would you do if you woke up tomorrow and you were, you know, an eighth grade teacher at this moment in the history of the world? How would you take advantage of these technologies?
[01:17:13] Dominik Mate Kovacs: I think this personalized coaching is definitely great. So for example, if you are a teacher, you could upload your lecture materials to the knowledge base and send out a virtual teaching assistant to your students.
It's also great for corporate trainers where, for example, some of the workforce is not like IT educated, or they're just not even that much familiar with the computer, but they would prefer to talk verbally. It's a much better user experience for them. In addition to that, with uh, like schools and universities, we see like a huge demand for localizing the content, localizing past lectures one has, because all universities have these localization efforts, internationalization efforts for the new students coming in that are from different countries.
And with AI video, you can basically like create a single content in 20 different languages, something that some of our customers are doing pretty frequently. I think it's great to see how it aligns with the objectives of these institutions, for example.
[01:18:10] Ben Kornell: Yeah, double click on that. Just, I'm honestly curious, but I also think you've thought about this in a really systematic way.
So if you were an educator or a corporate trainer who did pursue this, who wanted to be AI-forward and said, you know what, I'm going to upload a knowledge base. It could be my existing curriculum, it could be a passion project, something that I teach and love to teach, and I'm going to make it available, you know, to the world.
Two questions. Where do you host that? Would that be hosted on a platform like Colossyan, or do you put it on YouTube, or on Skillshare, or on Udemy? What would you suggest for getting that type of content out to the world, no matter what kind of educator you are?
[01:18:48] Dominik Mate Kovacs: It's up to preference. I think everyone has so many export options in mind, so from our end, we try to cater to all of them. Whether it's SCORM, HTML embedding, sharing, or downloading as an MP4, all of these are possible, and it's up to the exact use case you want to strive for.
[01:19:05] Ben Kornell: The sky's the limit. You can basically make content like that and share it through any channels because you have all these different options and with any language.
To your point, the language can be changed and localized to any different market or group of students. That is incredible. So, you know, if you're a teacher in a bilingual school, for example, you can create an English version of yourself and a Spanish-speaking version of yourself, or one that is multilingual.
It can code switch or switch languages and just make yourself available 24/7 to the students. I mean, it's pretty amazing. Last question: where do you see this type of video AI technology going? I mean, we've seen so much. We know that Sora is about to come out. We've seen, you know, players in the space like Synthesia and HeyGen, and there's been so much progress in such a short amount of time.
And you were right at the forefront of that. In a few years, what's next beyond real time video? Where could we go from here?
[01:19:57] Dominik Mate Kovacs: I would say towards more interactivity with the video as well. Our vision is to embed the interactive modules we built into the real-time video, so you can click on a button while the video changes based on the content you're giving, based on the speech that you're giving. That's another evolution that we see, a type of content that doesn't really exist today, I think. And it's going to be exciting to see how it's going to play out.
[01:20:23] Ben Kornell: I love that. I mean, how exciting. Just the future of active learning through video and real-time interaction, you know, it's a type of learning we just haven't even gotten our heads around, but clearly you've seen it coming, and congratulations for all your success so far.
You're in the Forbes 30 Under 30, you have $22 million in Series A funding, and a hundred people already working for Colossyan. We will be keeping a close eye on your amazing progress. Thanks for being here. This is Dominik Kovacs, CEO and founder of Colossyan. Thanks for being here with us on EdTech Insiders.
[01:20:52] Dominik Mate Kovacs: Thank you so much. I appreciate this.
Thank you.
[01:20:54] Ben Kornell: For our deep dive this week, we're talking to the CEO and founder of AI-Learners. Adele Smolansky has created a really interesting and very new platform that combines AI and accessibility. Welcome to the show.
[01:21:09] Adele Smolansky: Thank you for having me.
[01:21:10] Ben Kornell: So, you know, we met recently at events in Northern California and you are doing something that I just haven't heard anybody talk about in the AI space.
So can you give us a little bit of a background of what brought you into EdTech and what you're doing with AI learners?
[01:21:25] Adele Smolansky: Yeah, so AI Learners is an educational platform that helps students with all abilities learn early math, literacy, and social skills. We're specifically focused on students with disabilities, students with mild to more severe disabilities.
And I got interested in the educational space really from my family. They're immigrants from the former Soviet Union, and they taught me that value of education. And I've always been really passionate about making education more accessible for people all across the country. And then specifically, I got into accessibility and special education because of my younger sister, her name is Laura, and she has a disability called Rett syndrome.
It's more of a severe disability, and watching her struggle with online education, especially during the COVID pandemic, really showed me that there are so many challenges in special education, and we have a lot of opportunities with technology, especially nowadays. It's really an area that I'm passionate about,
and I think there's a lot of need in that space.
[01:22:22] Ben Kornell: Yeah, you know, we're at this moment where AI is starting to be incorporated into so many tools. There's lots of AI startups, and there's more interfaces to interact with AI than there have been in the past. Beyond text, we're getting to voice and video and all of these different things.
But you've recognized that screen readers, as well as other assistive devices, are not yet sort of caught up with some of the other interfaces. Tell us about, you know, how accessibility and AI don't fit together quite yet.
[01:22:50] Adele Smolansky: Yeah, so for a little bit of context, so there's something called assistive technology, and this is something that students with disabilities may use to help assist them when using technology.
So a screen reader is what people typically think of when we think of assistive technology. A screen reader is for people with vision impairments, and if they can't see, then we can essentially tab through elements on a screen, and then there would be audio for all of these parts. So now if we think about an educational website, we have to essentially translate everything that's on the screen to a student through audio and enable a student to slowly iterate through all of the elements on a screen.
That gets really complex once we talk about very dynamic webpages for education. And most tech companies have not tackled this, for one reason because it's very difficult, and for another because special education students who use screen readers make up a very, very small population. There also are other assistive technology devices, and they all have other complexities related to them.
And in the past, there haven't been legal reasons for EdTech companies to tackle this challenge, but now we are starting to see that change. With the legal requirements, and just in general, this has been something that AI-Learners has been looking at for a while. We actually started off initially trying to understand how to make EdTech accessible with assistive technology devices.
My younger sister uses an assistive technology eye gaze device. So that was one of the reasons that I got into this space. And I personally have seen the challenges of creating edtech that is compatible with it. It does require a lot of extra work and especially it's important to consider accessibility from the start.
Because it really does require building the entire software platform to be accessible from the beginning, and it's much more challenging to add the accessibility on top. But that is something that we are seeing more companies work on.
[01:24:43] Ben Kornell: Yeah, those of us who have worked in EdTech for a while have all had experiences at times sort of glancing off the WCAG, the Web Content Accessibility Guidelines, which is sort of the gold standard of accessibility.
They have these double-A and triple-A levels. And it's exactly like you said: one of the things that makes it so tricky is that everybody wants their product to be accessible, but it is expensive. It takes code changes. It takes auditing. Sometimes you have to bring in an external vendor just to audit your site.
And as you say, retrofitting can be difficult if you haven't built it in from the get-go. So even though, you know, ed tech is not intending to shut out any learners, it is surprisingly rare that ed tech companies are truly, fully accessible. So how do you make sure that what you're doing with AI-Learners truly is?
[01:25:32] Adele Smolansky: So it's exactly what you said, it really does require that accessibility be built in from the very beginning. So before I started building AI-Learners, I was asking the question: how do we make an educational website that is accessible for people with disabilities? And to be honest, I actually failed twice
at the beginning when I was trying to do this, and we completely scrapped the code base and rebuilt it twice. And it was finally that third time, after we had made so many mistakes in the process, that we learned how to actually create something that's accessible. And I like to break it down into two main areas.
So there's something called technical accessibility and usable accessibility. When we talk about technical accessibility, it's about meeting the WCAG requirements. So it's: can we tab through all the elements on a web page and have audio for all parts of it? Do all of the links have readable text? There are a lot of these requirements.
There are, I think, about 160 guidelines in the WCAG. And so it's really about ensuring that all of those things are done correctly. That is the technical accessibility: making sure there are ARIA labels for everything and doing more of that groundwork accessibility. But when we talk about usable accessibility, that's where we're not just building something that can get through a checklist, but something that is actually usable and beneficial for students with disabilities.
And there's no easy process for this. It's really about doing that usability testing with students and with educators and getting feedback on how they're using the product. And of course, over time as I've done that, I've learned more and more about what teachers and students are looking for in a product.
A lot of it comes down to customization options, making things readily available for students, and not overwhelming students in any way. So now I've made an internal checklist about that usable accessibility, but it's customized directly to my product, and it's really specific to understanding all of the different aspects of what we're providing for students.
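As a concrete, hedged illustration of the "technical accessibility" groundwork Adele describes (alt text, readable link text, labels on controls), here is a minimal Python sketch using BeautifulSoup. It covers only a handful of WCAG-style checks and is nowhere near a full audit; the specific checks and sample HTML are assumptions for the example, not AI-Learners' tooling:

```python
# Minimal sketch of a few "technical accessibility" checks: images need an
# alt attribute, links need readable text, unlabeled buttons need aria-labels.
# A real WCAG audit covers far more than this.
from bs4 import BeautifulSoup

def audit_page(html: str) -> list[str]:
    """Return human-readable findings for a handful of basic WCAG-style checks."""
    soup = BeautifulSoup(html, "html.parser")
    findings = []

    # Images need an alt attribute (even an explicitly empty one for decorative images).
    for img in soup.find_all("img"):
        if img.get("alt") is None:
            findings.append(f"Image missing alt attribute: {img.get('src', '<no src>')}")

    # Links should have readable text or an aria-label, not an empty anchor.
    for a in soup.find_all("a"):
        if not a.get_text(strip=True) and not a.get("aria-label"):
            findings.append(f"Link with no readable text: {a.get('href', '<no href>')}")

    # Buttons with no visible text need an aria-label for screen readers.
    for btn in soup.find_all("button"):
        if not btn.get_text(strip=True) and not btn.get("aria-label"):
            findings.append("Button with no text and no aria-label")

    return findings

sample_html = """
<img src="heart-valve.png">
<a href="/lesson/2"></a>
<button aria-label="Play audio">&#9654;</button>
"""
for finding in audit_page(sample_html):
    print(finding)
```

Checks like these map to Adele's "technical" layer; her "usable accessibility" layer, testing with real students and educators, is exactly the part a script like this cannot cover.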
[01:27:33] Ben Kornell: And, you know, you mentioned sort of in passing that in the past, the legal ramifications for this tended to be relatively minor. There were some moments in the past where companies sort of got dinged or even got sort of really temporarily shut down because they didn't have certain standards like subtitles on their videos and various things like that.
But it's notable to everyone here that these laws are actually changing and they're changing exactly in the direction that you've been pursuing. Can you tell us about the new accessibility standards that are starting to enter into the ed tech world?
[01:28:03] Adele Smolansky: Yes, the DOJ passed a new rule that's going to take effect for most public institutions in 2026.
So essentially what it means is that all public institutions will have to be using accessible products. And how that's applicable to ed tech is that all public schools will have to be using educational products that are accessible for their students. And this accessibility means that they're compliant with WCAG 2.1 AA standards.
The problem with this is that most schools probably won't know about this, or if they do, they will have a challenge actually meeting the requirements. And unfortunately, as we've seen in a lot of areas, not just in ed tech and accessibility, even if we have a new law, it doesn't actually get widespread adoption until there are some lawsuits.
So we'll see exactly what direction this goes. I'm very hopeful that schools will recognize this and not just wait until there are lawsuits, but we'll see how ed tech companies respond to it. I have a unique position because I can go to schools and inform them about this new law and say, hey, we meet these requirements.
But we'll see how other schools and other tech companies respond to this as well.
[01:29:16] Ben Kornell: Yeah, that's a great point. And, you know, as you say, having the regulations in place doesn't necessarily mean they're enforced right off the bat. But these laws do create a system where both schools and ed tech companies are incentivized, or at least disincentivized from ignoring accessibility standards and treating them as just one item in a list of compliance things.
Instead, accessibility is rising in the ranks to something that could actually be a true compliance issue from a technical standpoint, as well as, of course, the moral issue it has always been: making something accessible to all learners. I think that's a big deal. So let me ask you something else about this. As this law comes into place in 2026, and in general as ed tech continues to spread, most ed tech founders I've talked to
want to meet the standards; it just becomes difficult. You have to do the semantic tagging, and there are all these different things you have to do to make sure you work with different assistive devices and color contrast. It's not crazy, but it takes work. So I want to ask you two things about it. One: do you see a role for yourself and for AI-Learners in being a sort of consultative support to other ed tech companies that do want to be WCAG AA compliant but may not know how, or may not realize the value of it?
[01:30:36] Adele Smolansky: Yeah, absolutely. I mean, I've learned so much from my research experience at both Cornell and Stanford in the education accessibility space and through my work building AI-Learners and actually deploying it with schools. So I definitely have this wealth of knowledge when it comes to accessibility and education, and I would love to support other companies with that, so I'm definitely happy to see where that leads us. And what I really think is the best way to actually help ed tech companies become more accessible is twofold.
One is this evaluation, right? It's looking at what you currently have and explaining where the pitfalls are. But then we also want to prepare for the future. We really want to explain and teach the designers and the developers who are going to continue working on the product what it means to make something accessible.
Because getting a checklist that just says where your problems are and potential ways to fix them won't actually help solve problems years and years down the line. When we're creating new features, we need to be thinking about accessibility, and that's really the best way to do it. And when I was a student at Cornell, I was doing research at two different research labs on this topic, and I had, essentially, a great class that I had created at Cornell where different students all talked about accessibility and technology.
And it was fantastic to really be able to see how designers and software engineers who were taking other classes at Cornell were also learning in this class that I had created. And we were together learning about these new ways that we can have technology be more accessible.
[01:32:04] Ben Kornell: That's really admirable work.
I mean, I remember there was a learner at Coursera when I was there who had taken some absurd number of classes, I think like 30 to 50 classes, even though he was fully paralyzed and could only move his eyes. He was using an eye gaze type technology to navigate the platform and take classes. And when we learned about it and showcased his work internally, it made that accessibility
feel a whole lot realer than it had in the past. It feels very different when you're thinking about it as, like you say, a checklist and a compliance document, versus actually understanding the value for end users. So my other question for you: you have an engineering background, and you obviously know AI and accessibility.
Do you foresee there being AI tools that themselves can evaluate or even improve code to make it more accessible, given that AI copilots for coding have become, you know, an extremely rich vein for the engineering space?
[01:33:01] Adele Smolansky: Absolutely. I mean, there already is some movement in how AI can help with accessibility.
We've had a lot of research in the space where, you know, you can use AI to analyze the code and then it can create some suggestions. And I'm sure there are some tools now that are more widespread and being used. But we are still seeing a lot of pitfalls with these tools that are just in their early stages.
And I think the biggest reason for this is because there's so much manual work that does have to go into accessibility evaluations, especially because developers should actually be testing the tools themselves when trying to see what the experience would look like, especially when it comes to dynamic experiences.
But that being said, there are ways for AI to still enhance and help with this process. So, for example, you could automate the way that a user would iterate through a website, and then play that back for a developer to help them understand what that looks like and identify where there are challenges.
And then, of course, we'll always have more of that simple analysis where AI could analyze the DOM, which is all of the text and code behind a web page, and look at it. But we still do need some additional intricacies when it comes to that usability aspect and more of those dynamic ways that are so critical for accessibility.
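Building on Adele's idea of automating how a user iterates through a site and playing it back for a developer, here is a rough sketch using Playwright to simulate pressing Tab and record which elements receive focus. The URL, tab limit, and output format are illustrative assumptions; this demonstrates focus-order review only, not a full assistive-technology simulation:

```python
# Rough sketch: simulate keyboard-only navigation with Playwright by pressing
# Tab repeatedly and recording which elements receive focus, so a developer
# can review the focus order. Illustrative only; real assistive-technology
# evaluation involves far more than tab order.
from playwright.sync_api import sync_playwright

def record_tab_order(url: str, max_tabs: int = 20) -> list[str]:
    """Return a description of each element focused as a user tabs through the page."""
    focused = []
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url)
        for _ in range(max_tabs):
            page.keyboard.press("Tab")
            # Ask the page which element currently holds focus and how it is labeled.
            description = page.evaluate(
                """() => {
                    const el = document.activeElement;
                    if (!el || el === document.body) return null;
                    const label = el.getAttribute('aria-label') || el.innerText || '';
                    return `${el.tagName.toLowerCase()}: ${label.trim().slice(0, 40)}`;
                }"""
            )
            if description:
                focused.append(description)
        browser.close()
    return focused

if __name__ == "__main__":
    for step, element in enumerate(record_tab_order("https://example.com"), start=1):
        print(f"{step}. {element}")
```

A transcript like this could be "played back" to a developer, as Adele suggests, to show where keyboard users land in an unexpected order or on unlabeled controls.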
[01:34:18] Ben Kornell: All fantastic points. I love the way you're mentioning that AI could potentially sort of supplement the usability research, because one thing you mentioned that I think is a big part of this space is that it's often hard for ed tech companies to find, you know, the time and resources and structures to do significant user testing, even with their people right in the middle of their target audience, let alone, you know, users who have learning differences or disabilities in any way.
So the idea of being able to say, I want to see an AI navigate this website with a screen reader, with an eye gaze, with an assistive device, and see where it gets tripped up, could be a way to expand the ability to test beyond having to find users with each of the different disabilities WCAG tries to address, which is actually a lot.
I asked a lot of guests this recently about, you know, could AI simulate the student experience in various ways just to enhance what we know about ed tech. I'm curious if you see there being any potential there, even if it's not going to do all the work.
[01:35:16] Adele Smolansky: Yeah. I mean, I definitely think there could be. It would be interesting to see if an AI could also take on different personas.
Like, let's take the persona of a three-year-old student, of a five-year-old student, of a digital native or not a digital native. And this could definitely be applied to ed tech and a lot of other areas as well. It's actually been an idea that's been on my mind for a couple of years. I've just been a little bit busy with AI-Learners, but if the right opportunity comes about or a co-founder comes through, I would definitely love to pursue this idea.
And I think it's really needed, especially as more and more people around the world start recognizing accessibility and as we get more digital applications.
[01:35:52] Ben Kornell: I really think there's a many, many multi-million dollar idea in a company that could basically provide, you know, an infinite number of simulated students with, as you say, different personas, different ages, different previous backgrounds, different standards alignments, different languages, different
learning differences, different disabilities. I think that could elevate the entire field in a really interesting way. So I love that idea. Any co-founders out there who are working on that or thinking about it? This is a clarion call to you. So, you know, last question for you. We've talked a lot about the accessibility aspect of AI-Learners, but there are a couple of other things you do in AI-Learners that I think are worth
talking about. You mentioned that it's about numeracy, literacy, and sort of early reading and learning across subjects. And you take a very game-based approach. So tell us how you feel gaming complements ed tech, and what gaming principles you try to inject into your AI-Learners experience.
[01:36:47] Adele Smolansky: Absolutely. So, the core experience for students on AI-Learners is a series of games, essentially skill-based practice. We've taken some drill-and-practice games that you'll see on other platforms, and we've gamified them a bit by adding additional breaks and reinforcements throughout.
We really do want to ensure simplicity when we're working with students with disabilities, because if something is too gamey, like a video game where there are so many moving parts, it becomes too overwhelming for a student. So our students think of them as games, but really it is this gamified, essentially worksheet-like experience that gives students immediate feedback and has additional supports, hints, and explanations.
So if a student answers incorrectly, we'll show an example, or at the start of the game we'll give students an example of how to solve the problem to help teach them, and then they get to practice.
And this essentially gives really great data, because it's almost like an assessment in a more gamified manner. And for students with disabilities, who really struggle to take assessments, this is amazing data that administrators and other educators can use to show students' learning progress. I also want to highlight one other feature that we recently added to the platform. So, as we know, across all of education, social-emotional learning has definitely become more of a conversation, especially after the COVID pandemic.
For students with disabilities, SEL has always been a topic of conversation, but now we're seeing this across all learners. And after hearing about so many of the behavioral challenges that educators are facing, we started to go more into the social-emotional learning space. So we started by adding some social stories.
We created a couple dozen of our own social stories, and now we have a generator for teachers to create them themselves. But we didn't just, you know, use the OpenAI GPT API to let teachers create one with a very simple form. We're really trying to make the experience easy for teachers to identify what they want their students to learn,
what all the learning goals are, what the age is, what the reading levels of the students are, and different parts of the story themselves. And then they can create activities after the social story. Because where a lot of social stories fail is that they're not actually supporting students after the story, and we need that follow up.
So we've brought that all together, and then we let teachers really carefully review the story and activities that we've created. And then revise it and ultimately confirm that it's appropriate for their students. And we've seen this feature really take off. We're interested to see how it can impact our students with disabilities that we're supporting and then other students around them as well.
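Since Adele mentions building the social story generator on the OpenAI GPT API behind a structured teacher form, here is a minimal sketch of that general pattern: collect structured inputs, assemble a constrained prompt, and return a draft for the teacher to review. The field names, model choice, and prompt wording are hypothetical, not AI-Learners' implementation:

```python
# Minimal sketch of the general pattern Adele describes: a structured teacher
# form feeding the OpenAI API, with the draft returned for teacher review.
# Field names, model, and prompt wording are hypothetical, not AI-Learners' code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_social_story(goal: str, age: int, reading_level: str, setting: str) -> str:
    """Assemble a constrained prompt from teacher inputs and return a draft story."""
    prompt = (
        f"Write a short social story for a {age}-year-old student "
        f"at a {reading_level} reading level.\n"
        f"Learning goal: {goal}\n"
        f"Setting: {setting}\n"
        "Use simple, literal sentences, first person, and a positive tone. "
        "End with three short comprehension questions."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You write social stories for special education."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

# The teacher reviews and revises the draft before it ever reaches students.
draft = draft_social_story(
    goal="asking a classmate to share materials",
    age=7,
    reading_level="early first grade",
    setting="art class",
)
print(draft)
```

The key design point Adele emphasizes, teacher review and revision before anything reaches students, sits outside the code: the function only produces a draft.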
[01:39:31] Ben Kornell: That's fantastic. And I think that sort of holistic, emotional well-being approach to learning is so needed for all populations, but especially for special needs populations post-pandemic. I mean, there were all these articles during the pandemic about how, basically, a lot of special education services just couldn't be delivered; providers didn't know how to deliver them virtually.
So a lot of them just shut down. It sort of created a whole different type of lost generation, a COVID gap for special needs students. So I'm sure your school partners are really excited to hear that you're thinking about social, emotional, and mental health and well-being for this particular population.
It's really exciting work. People can find AI-Learners at ai-learners.com. This is Adele Smolansky, CEO and founder of AI-Learners. Thanks for being with us here on Week in EdTech from EdTech Insiders.
[01:40:22] Adele Smolansky: Thank you.
[01:40:24] Ben Kornell: Thanks for listening to this episode of EdTech Insiders. If you liked the podcast, remember to rate it and share it with others in the EdTech community.
For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.