Edtech Insiders

Week in Edtech 8/6/25: Google's Guided Learning vs OpenAI's Study Mode, GSV Summit x BETT Merger, VR/AI Breakthroughs, Cambiar’s $100K Thrive Grants, and More! Feat. Derwin Sisnett of Cambiar Education, Dr. Alan Bekker of eSelf & Andrea Pasinetti of Kira

Alex Sarlin Season 10

Send us a text

Join hosts Alex Sarlin and guest host Claire Zau of GSV Ventures as they unpack a huge week for AI in education, major edtech mergers, and new funding opportunities.

✨ Episode Highlights:

[00:00:00] Claire Zau on AI tutor memory, personalization, and data contamination risks
[00:01:58] Google Guided Learning Mode vs OpenAI Study Mode – Socratic AI tools compared
[00:14:07] Google Classroom integration and the personalization edge from LMS data
[00:15:16]
Instructure and OpenAI partnership as a counter to Google’s advantages
[00:25:20] GSV Summit and BETT merge to create the world’s largest edtech events network
[00:27:35] Acquisition roundup: Curriculum Associates, Top Hat, Alpha School, Torch
[00:31:44] VR and AI content creation with Praxis Labs, Torch, and Google Genie 3

Plus, special guests:
[00:34:04]
Derwin Sisnett, Entrepreneur-in-Residence at Cambiar Education and CEO of Adaptive Commons on Cambiar Education’s Thrive Big Ideas Challenge
[00:54:39] Dr. Alan Bekker, Co-Founder and CEO of eSelf AI on scaling AI English tutors in Israel and building guardrails
[01:15:22] Andrea Pasinetti, Co-Founder and CEO of Kira on Kira Learning’s approach to AI-powered STEM education

😎 Stay updated with Edtech Insiders! 

🎉 Presenting Sponsor/s:

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

Tuck Advisors is the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

[00:00:00] Claire Zau: Do I really want everything that I'm saying to my AI system as a therapist and as a friend, contaminating the data in my school learning setting or vice versa. There are interesting ways to think about how memory as a technical concept even exists, and can you create these like guardrails or, or walled gardens around different parts of your memory because your school self, it might be very different from your personal self, very different from your family self.

And as these AI systems become embedded in all of those aspects, I think there is a question of data can contamination and. Do you have a choice in what data flows into what, and even though it might improve your experience and improve that personalization? Do you want that level of personalization is another question.

[00:00:43] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry from funding rounds to impact to AI developments across early childhood K 12 higher ed and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod. Check out our newsletter and also our event calendar.

And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoyed today's pod.

Welcome to this week in EdTech. We have an incredible guest host for you, Claire Zau from GSV. She is the AI lead and partner, and we're just gonna jump right in because it was a very big week for AI in education. First off, this is fresh off the process. This came out, I believe 10 minutes ago. As of the recording of this, Google just announced its guided learning mode.

That is their response to open AI's study and learning mode that they have released. And Claire, you've been at right at the forefront of both of these. You've played with both of them. So tell us about OpenAI and Google and what are they doing for. 

[00:01:58] Claire Zau: Yeah, so we have back to back releases from two major AI labs launching their versions of Socratic versions of their default chat offerings.

And so open AI is one came out last week that was study mode, and happy to go deep into my testing around that. And then Gemini launched guided learning, which is their version of it. 

[00:02:17] Alex Sarlin: Yep. 

[00:02:18] Claire Zau: Essentially, both of these have, instead of offering an immediate single solution, uh, one shot answer is how they typically describe it.

It instead breaks it down into a scaffolded learning experience where it'll ask the learner open-ended questions. It will guide them through step-by-step reasoning. It's supposedly able to adapt the explanations to the learner's level. In study mode's case, it'll ask you what age you are, what you're studying, what you already know.

They both say that they'll be able to show visuals, videos, diagrams, and quizzes. So specifically built for the modality of learning. But it is interesting to see both of them launch back to back so closely in timelines to each other. And philanthropic actually launched their version way back in April, right before A-S-U-G-S-V.

So now all three big AI labs have some version of an education mode. 

[00:03:09] Alex Sarlin: Exactly. You know, I, I wrote an editorial last October about the learning layer, just when Google was, had its learn LM model out, but they hadn't yet incorporated into anything else. But the idea was, look, these are general, you know, the LLMs are incredibly powerful, but their general purpose and all of these ed tech companies have built learning specific implementations of LLM usage using the APIs to pull from LLMs.

But there is this sort of interesting in-between layer where you can take these really powerful frontier models, which are continually getting better and better, and you can add a sort of learning orientation to them. And now here we are less than a year later, and all three of the really major frontier models, we haven't seen perplexity yet.

We haven't seen Alibaba yet, I don't think do this. Deep seek, but like you, they're starting to say, yep, we can do that. And we're doing it and we're launching it and we're creating a version of our frontier model that is specifically for the learning and study specifically. Often study use case. The Google press release that came out about this specifically called out that AI homework help is like reaching incredibly high search results.

People are looking all the time for AI homework help. You can make of that what you will. Depending on your, how you think about that. But part of their response of creating a learning specific guided learning version of Gemini was that there is clear need for students. Students are looking for AI support and they're looking to learn with ai.

So open AI and Google and philanthropic are all creating versions that are trying to not just be homework help giving you the answer. If you upload an assignment or take a picture, but actually walk you through the problem, go through bit by bit. The Google guided learning demo has a question about a trajectory, like a physics question, and it does a, an image in real time of showing, Hey, based on the question you just asked, this is what it would look like for the shuttle to launch and then to hit its apex and then to slow down and the potential, you know, potential energy turns into kinetic energy.

It's very Socratic. It's trying to be very back and forth, and as you mentioned, it's trying not to just sort of give you the answer and be sycophantic and butlery and, and sort of, you know, do what you ask it to do, but instead sort of engage in learning. I am thrilled about this moment. Whether or not they're fully there yet, they're probably not fully there yet, but the fact that they're even trying to do this and that they're putting serious resources into it and it's, I don't know.

I, so far, I played with both of these and they seem pretty okay to me, but I think you're a little more critical. What were you finding when you played with GPT study mode? You were not quite sure. 

[00:05:35] Claire Zau: Yeah, I mean, I totally agree with you. I love, love that they are investing resources into an education mode.

I think it's needed. I think it benefits them too. I mean, if you look at how people use Claude or open AI in their releases, so much of usage is learning and education oriented. Even if you're an adult learner outside of school, there are so many queries that. Straddle both. You know, you have a lot of queries that are productivity oriented.

Get me this thing done. I want it as quickly as possible. But for similar to Google, a lot of queries are out of curiosity, out of learning, and there is a good intention behind them. There just isn't a scaffolded way to get to that learning process. So I'm excited. I mean, I think it's a step in the right direction.

That being said, I think, as I've mentioned a couple of times on this podcast, I do think the vision of a perfect AI tutor and, and the vision that people have been teasing around this ambient AI tutor that's able to help you 24 7 and is able to help you get to any learning goal. I think we have the tools to get there.

I think actually what we are missing is a lot of, well, one from an architectural standpoint, memory, but also contextually a lot of learning context on learner. So I'll get into that, but. Where I still see all of these AI tutors struggling with is probably just one I identified with study mode. Just offering a little bit too much structure early on.

And I think why this matters is because while it's great and I actually think it does a good job of breaking down what I don't know, and I think part of the problem with first iteration AI tutor bots was that they just gave students a blank text box and was like. Figure out for yourself, tell me what to teach you.

And I think so much of the learning process is, I don't know what I don't know. Yes. And so I appreciate that they've embedded the scaffolding, but I actually think great teachers know how to bring in structure at the right time. And there's a right balance between stepping in and stepping back. And when I tried it with essay writing, it almost didn't let me do as much playing and, and the messy parts of learning, of having bad ideas.

And so I think on that front, I wish there was a little bit more of the right timing around building instructured learning and that aha moment experience. And then I think, as I mentioned, just on a architectural standpoint, I think this is something that I'm sure Gemini will struggle with. Claude struggles with open AI struggles with, and this is something that exists outside of study mode, is just largely with all these AI systems, there is still a big gap around memory.

All of these learning experiences, even if you do something in study mode, I believe it'll remember that you had previously queried a topic within physics, but it won't remember exactly what you struggled with. It also won't be able to allow you to jump back in into a previous conversation. And without memory, that just means you don't have space repetition.

There's not really a great way to study longitudinal outcomes of how a person is learning. So I think that's probably just from an architectural standpoint, memory as a whole. Even OpenAI generally for their chat bots, they're trying to figure this out, and once they can embed that, I actually think that will really amplify the learning experience.

And then the other point you brought up on Vinci, I just, I still think that hasn't been solved yet. And that again, is also a technical problem that they're pouring a ton of research into. But ultimately, if these models are still. Stay within the realm of being a little too eager to agree and affirm. I still think that is a little bit problematic, but from, you know, what we mentioned last week around companionship, but also in learning experiences because you don't necessarily want a tutor that wants to please you or wants to flatter you all the time.

So those are my high level thoughts, but I can go endlessly into what I think the prime AI tutor should look like. 

[00:09:26] Alex Sarlin: I think it's a really important conversation, and I, I agree completely. I think the personalization piece, the, the, the memory piece is absolutely vital to any kind of meaningful precision or personalized learning.

Uh, if you're trying to have a persistent relationship with an AI tutor, or even if you just want the tutor to remember what classes you're taking from session to session, because you're, you're coming back every week asking to do the next unit in your environmental science class, the fact that you have to remind it, Hey, I'm an environmental science major.

I, by the way, and I'm studying this and this is what I wanna be, you, obviously that's, that's just silly. But I do think that's the problem that is on its way to being solved because that memory piece, it's important for therapy, it's important for companionship, it's important for dating, it's important for almost any kind of, you know, relationship building that you would have with any kind of AI for any reason.

So I would be very surprised if we don't see major announcements and advancements from the frontier models about, and to be clear, there's different kinds of memory, right? I mean. To your exactly your point, what it's remembering, remembering that you asked something about physics is different than remembering that you're a physics major and you love physics, or you're doing it because your dad was a physicist, or you're doing it even though you all really wanna be a, a welder.

Like, those are totally different things. So it is very important to think about how memory is gonna work. The ency is, is, is, is really interesting. One thing that I found interesting about playing with some of these study modes is that right off the bat they start to ask you why you wanna learn these things.

Which I think is, it seems very obvious, but that is a level of structure that you don't often see from other conversations. If you say, I'm gonna upload a document, do this with it. And they're not like, Hey, what are you trying to accomplish here? Is this for work? Is this for fun? Like, what do you need to remember this in a, in a year?

But it does feel like the learning modes are starting to get into that right off the bat, which I think is a, is a progress. That said, you can't always remember it, so that's a problem. But it, it's always one step forward, two steps back here or vice versa. But I've been pretty excited. I mean, one of the things that I found really interesting about playing with the Google mode, you mentioned that it can pull in images, it can also pull in YouTube videos.

And I think that speaks to something that is a major advantage that I think you and I agree on, but I'd love to hear you talk about it, which is that we've talked about how Google has Chromebooks and Google Classroom and has a lot of presence in education, but Google just has a lot of mind share in our normal lives, right?

I mean, almost everybody uses Google products on a daily basis, whether it's maps or Gmail or sheets or docs or all sorts of things. Or, or YouTube. And YouTube is, is used like constantly by young people. So the fact that they're starting to pull YouTube. Into their learning mode and say, oh, if you wanna know this, we have the entire YouTube library and we can make our own videos too, with incredibly interesting tooling like vo.

And that multimodal teaching is very exciting to me. And that surface area question is really exciting to me. When I think about Google's advantages here, what do you make of that? 

[00:12:14] Claire Zau: Yeah, I think you're exactly right. I actually think on the visual aids front, I do think that Google probably has more of an advantage.

I think one of the problems with most of these AI tutors is they actually don't know when to surface visual aids or external content. What I think is interesting is Google has actually a lot of surface area of content. I think if you think about Notebook lm, you could see a world where you're. Working in Gemini guided learning mode, and it suddenly suggests, Hey, we could generate a podcast around this, or we could generate a mental map.

Let me pull this in for you. So having that massive surface area is actually really helpful because I think it just means there's so many more resources that you can pull into the learning experience versus I think if you do the same thing in Attach PT or Claude, it might be generating something, but it doesn't necessarily have full context and it doesn't have the versatility of modalities right now.

I think another I, I mean this is my own thesis in the space, but I actually think another massive advantage that Google has in the space is actually their integration with Google Classroom. So I actually think to your point around personalization, one of the key issues is getting any sort of contextual data around what the student is studying, where they're at in their coursework, their grades, all of that is information and a gold mine of data that sits within Google Classroom.

And so I actually think in the announcement they also said that teachers can build their own versions of guided learning, embed them within Google Classroom, they can assign them to certain students on an individual level at a group level to the whole class. And I think what's powerful about that is.

If Gemini wants to expand into that personalization and memory layer, they don't need to do the prompting of being like, Hey, what class are you in? Or What grade are you in? They just know that because they have all this data on Claire did badly on this concept. Exactly. Or Claire is excelling at this 

[00:14:07] Alex Sarlin: or has a test coming up next week or, 

[00:14:09] Claire Zau: yeah, exactly.

They have the full syllabus and so that's kind of, I think when you think about. Advantages that compound in the AI era. I do think that data layer and that context layer is so, so valuable, and maybe that's where something like Google Classrooms has an advantage. Maybe if you think about Magic school, brisk school, ai, all these AI operating systems that have so much data around the students and teachers, that's where they could build a better AI tutor because they can plug into that.

That being said, I also think you have someone like an open ai. I think they just announced a partnership with instructor. 

[00:14:41] Alex Sarlin: That's what I was about to say. That's the counterbalancing trend, right? Yeah, 

[00:14:44] Claire Zau: yeah. So either own it. I think in the case of Google, you own Google Classroom, you tap into that, or you build the right partnerships such that you can plug into that contextual layer.

But I mean, again, I, I know there are all these data privacy. Hurdles to also, maybe we don't wanna be sharing all this data into the guided learning experience, so it'll be interesting to see how they evolve. But as I think about, if I were just purely from a product standpoint of how do I build the best learning product possible, I actually think Google has a really strong data mine that they can tap into.

[00:15:16] Alex Sarlin: I, I totally agree. And I think the news, and I, I don't think we've covered it on the podcast 'cause it, it just happened a week ago, but that the announcement, the big announcement. Instructure Con, which is just about a week and a half ago, that where they're basically, you are working directly with OpenAI to get OpenAI and Instructure to sort of have this combined product suite, I think is to your exactly your point.

It's exactly their attempt to balance the built-in advantage that Google already has by having classroom and already being, you know, basically a third of the LMS market, especially for K 12 instructure's more even bigger, especially if you combine the higher ed and the K 12 impact of Instructure, which I think, you know, for, for open ai, higher ed, I still think mm-hmm.

Is, is more their bailiwick. So even though they, they wanna move into K 12 as well, but I think that was a really interesting announcement for exactly all the reasons you said because if you can combine all of the information about a student that happens in an LMS, when we talked to Steve Daley from instructor on the podcast up.

Maybe six months ago. And that was exactly what he was saying at the time. It's like, well, we know so much about what a student is taking, what they've looked at, how well they've done, and every, everything you just said, that becomes an incredible source of information and data. If the privacy is there, you know, you, you obvious, you can imagine a world in which you go into Canvas and it says, Hey, we've made this new partnership with OpenAI, and if you want, you know, click here to connect to OpenAI.

By the way, small print there is, they're gonna get access to everything you've ever done in Canvas. But I mean, that's a trade off that might really be worth it for many, many students and for professors, frankly, and teachers, because you're putting together this very personal database about you and all the classes you've taken and your schedule and the syllabi and your major and your everything with the power of these LLMs to then work with you in a consistent basis.

[00:17:03] Claire Zau: Just one little thing that you got me thinking about is, I am curious maybe there is actually a, a world where this backfires at, at least on the higher ed level, where you actually see a lot of students wanting, they only sign up for tragedy, GBT on personal accounts, not their student accounts. Even though they might get better pro features with the student account and their EDU account because there's a fear that.

Even though it's a supposed to be a judgment free learning zone, that some of that information or their cheating conversations may get reported back to the institution. So I, I wonder if there's a world where people actually are less inclined to use maybe something that is connected to their institution, even though it is probably going to be more personalized, it's probably gonna be more convenient.

You might get access to better models even. But I actually think there's also that human behavioral element that I, I, I wonder how that'll play out. 

[00:17:48] Alex Sarlin: It's a great question and it depends on what the value prop is, I think, right? Mm-hmm. I can imagine a student saying, well, I definitely don't wanna be asking for answers through my college account, but maybe it would be really useful if it's gonna recommend classes for me, or if it's going to put all my schedules together and create study guides for me.

Like there are things that could do proactively, even if it knows so much about what you're doing academically, that might be of value to students. But your point is well taken, which is that there are certain things that students are doing in chat GBT that they would absolutely not want, you know, administrator eyes on.

And I think that goes to the heart of some of the tension in the whole EdTech space right now between, especially between the sort of B2C models where so many of the ed tech AI tools are for students and saying, Hey, we'll get you exactly what you need, whatever that is, you know, wink, wink versus the tools that are for universities or for schools or for districts that are saying, Hey, let's do this, the, the educational way and not allow workarounds.

It's a really interesting tension and I think you're, you're making a great point 'cause it's gonna hit a real crescendo here as these tools start to come together. 

[00:18:53] Claire Zau: Yeah, and I also, I mean, I think it also, on the flip side, there's actually even a architectural question around this just from the technical standpoint, is how do you actually manage memory at scale?

So say we actually do unlock full memory, full personalization, but do I really want everything that I'm saying to my AI system as a therapist and as a friend, contaminating the data in my school learning setting, or vice versa? There are interesting ways to think about how memory as a technical concept even exists.

And can you create these like guardrails or, or walled gardens around different parts of your memory because you know your school self, it might be very different from your personal self, very different from your family self. And as these AI systems become embedded in all of those aspects, I think there is a question of data contamination and do you have a choice in what data flows into what?

And even though it might improve your experience and improve that personalization, do you want that level of personalization is another question. 

[00:19:49] Alex Sarlin: It speaks to an issue that is so important. And actually I give a little pitch here. We are doing a webinar at the end of this month about neurodiversity and learning differences in ai.

And I think that that, because when I hear you talk about, hey, where are the lines between my academic self and my therapy self and my, and my, you know, well that's, that is something that schools, especially K 12 schools have been dealing with a long time. That's basically why IEPs are like legal documents that are contracts that are super private.

But they also influence academic choices, right? You, you say, oh, this student is gonna get extra time on an exam because they have an official diagnosis of X, Y, Z. Suddenly the lines are starting to blur between your academic self and other parts of your personality. And I think the same kind of stuff is gonna come up when it comes to ai.

And there will be areas where you'll want them to overlap like that. And there'll be areas where you absolutely don't want them to overlap, and where it's incredibly complicated. It's incredibly interesting. I love this stuff, but I totally agree with you. I think there's gonna be these a lot once this memory piece, this idea of what does an AI actually know about you Becomes, you sort of mainstreamed, I don't think it is yet.

There's going to be so much discussion about, like you said, about if you have your therapist, your therapist, your ai, you know, relationship. I shouldn't, I shouldn't keep going back to that, but I know that's gonna happen, right? And your teacher and your tutor and your workforce trainer all in the same app.

Oh my gosh. You know, what does it mean? 

[00:21:14] Claire Zau: I mean, I could see a really kind of almost scary scenario where it is, you know, these ai, the tutor experience wants to surface as much personalization as possible. They're going to tap into that memory. Like, and so maybe there's a scenario where hypothetically, you're in high school and your parents are going through a divorce and you're talking to your AI system about how to navigate it.

And then randomly in a Google, or, sorry, not even Google, but any classroom setting, if it is plugged into an AI system that taps into this memory layer, is it randomly, if you're studying a humanity subject, it's gonna be like, here are a couple prompts based off of your personal life, right? Why don't we talk about how divorce impacts x?

And it it just, exactly. There are a lot of, I think, scary scenarios where there are, I don't know. And then is it up to the AI system to decide what information flows? Are you supposed to create different kind of firewalls between them? I don't know, but I actually think, yeah, as you mentioned, it is something to be solved later on.

For now, there is not that level of. Memory sophistication, but something definitely to think about in the future. 

[00:22:15] Alex Sarlin: I can imagine a world in which these things are sort of personified in a way where it's like you, you're talking to a tutor, let's say an AI tutor, and it says, I'm just getting a note from your AI therapist that there is something happening in your personal life that might be relevant to this, but I don't know what it is.

But they seem to know, is there something you would want to, you know, would you wanna make that connection and you'll get the choice and autonomy as a learner? I, I don't know. It's gonna be crazy. 

[00:22:40] Claire Zau: Yeah. Yeah. We'll see. Another thing I'm a little bit concerned about as we just think about long-term AI and learning systems is if it does have a built-in profile around what kind of learner you are, who you are, what you like.

I do think there is somewhat of a risk of reinforcing behaviors and that they go into a learning conversation and they're like, Claire is this type of learner. She likes this. She does not like this, and that's great for me and my current moment, but right. Does that build in any room for growth? And people change, we evolve as humans.

And my eight years old self is not going to like the same things that my 9-year-old self liked, God forbid. And so how do you factor those things in into the learning conversation when these AI systems are going to have these prebuilt. Parameters around who you are, and then they might reinforce certain behaviors and, and who you are.

And, and maybe we don't have as much autonomy as we think. Yeah. 

[00:23:34] Alex Sarlin: Yeah. I don't know. No, it's, it, I mean, I totally with you, and I'm actually a little optimistic about that in that I think these systems are so sophisticated that it could, you know, let's say last week, everything you said to the AI was really depressed.

It was, you were down, you said, who cares? None of this even matters anyway, blah, blah, blah, blah. And the AI goes, okay. I'm gleaning a little bit of a negative attitude towards this subject. The next week, the same person comes in and says, okay, let's do it. I can imagine the AI going, you know, last week you seemed a little negative about this, but you seem happier today.

Like, should we do it in a different way? Like unlike some of the old rule-based systems where it was like, okay, I'm setting a parameter that says Claire likes this and not this, and that parameter can only be changed if she updates her settings. AI is actually pretty, you know, unlike most other technologies, it's pretty capable of sort of dynamic interpretation.

I mean, you can do it in almost any conversation. Say, you know, nevermind, I know I just asked about this, but I don't care about it anymore. Let's go this way and it'll just follow you wherever it goes. And part of it's sick of fancy, but part of it is also that it's using the new information and sort of, it's using it to continually adapt what it's doing.

And I think that there's a world in which some of that, what we used to consider personalization, which is, you know, these role-based parameters, this person likes soccer but not baseball. And so all the things should be soccer based. I could imagine, you know, it being a little more sophisticated than that if we get it right.

But that said, it is definitely a risk, especially when it comes to things like, oh, this learner prefers video, so I'm always gonna give them video. Always for everything, you know. But there is one other major piece of ed tech news that you are very close to that I'd love to talk about. We hasn't been covered on this podcast, the BET News, GSV and bet.

Can you tell us a little bit about the world's largest EdTech events joining forces? 

[00:25:20] Claire Zau: Yeah. Yeah. So for those of you who are less familiar with bet, it is an event that is hosted across the globe and four, three different continents. They have one event in London, one in Malaysia, and one in Brazil. And they are mostly focused on serving on ground teachers.

I would say it's pretty similar to what we've been trying to do with the AI show and reaching actual educators who are on the ground using learning technology, using ai. And then we, you know, for those of you who haven't been to a SU gc, we hope you get to go one day. It is an event that we've been hosting for the last 15, 16 years.

Now, that event started out as small in-person, 300 people in a sweaty conference room, and that's now grown into about $7 in people. It's hit capacity. If you've been to the Manchester Grand Hyatt during A-S-U-G-S-C, it is a festival and it's been called everything from Davos of Education to Woodstock of Education.

I don't know. But the idea there is that you are getting the most important people in the senior level education innovation around education innovation, where you can connect people in K 12 to people in higher ed to workforce. There are about a thousand plus ed tech CEOs who are present. So CEOs of small prese companies that are still iterating to CEOs of major companies like PowerSchool, Instructure, Duolingo, all in attendance.

And when we bring that founder ecosystem with the actual buyer ecosystem, a lot of magic happened. So all of this is to say there was a very magical moment and partnership. Opportunity to bring all of these events together and actually massively improve our scale. We think that learning and education is going to be even more important in this day and age when education is truly lifelong, especially with ai.

And so we're excited to see what we can do in partnership with and see how we can expand our thought leadership in the education innovation ecosystem, but also reach more educators who are doing the actual important work of educating the next generation. Um, hoping to also expand more in higher ed and workforce learning and lifelong learning as well.

So all around very exciting and stay tuned for more updates in this future partnership. 

[00:27:35] Alex Sarlin: Yeah, I was totally surprised and very excited by that news because these are two, this is basically, you know, GSV. The A-S-U-G-S-V conference is an absolutely incredible conference. It's like just, it feels like the Davos or whatever the World Series of, of EdTech conferences.

And then BED is this incredible global set of conferences, as you say, Malaysia, Brazil. Putting it all together, I think is going to literally bring the world of EdTech together, which is something we always love doing at EdTech Insiders. We love interviewing founders from all over the world, and especially in, in this modern age, the technology is so good.

People can create incredibly powerful and meaningful EdTech products from anywhere. So I'm so excited that the world is sort of getting more visible, more transparent, more connected through this partnership. It's really quite amazing. Claire, there were a few acquisitions this week that I wanted to just zip through because I think they're really interesting, but I'd love to get your take on any of them.

So there there were four that I wanted to highlight. One, curriculum Associates acquired Style Education, that's a science provider Australia. This is a way for curriculum associates to expand into their, really into science and expand from what they've been doing with i-Ready. That is really interesting.

Top Hat, the Canadian company works mostly in higher ed. Acquired Open Class, which is an AI assessment creation company, so that was an acquisition to enhance their AI portfolio. Alpha School, which we've interviewed McKenzie Price on this podcast they're doing, they do this really interesting model based on the sort of two hours of academic work per day.

They acquired most of the higher ground education assets, higher ground education as a Montessori provider all around the country. And then Torch acquired Praxis Labs. And Praxis Labs is a fun company. We had talked to the founders way back, but they do these AR basically diversity and equity training through AR and where people can sort of put on headsets and see what it's like to be autistic in the workplace or do all these really interesting simulations.

Really cool company and really cool founders, and they were acquired as well. So any of those could be interesting news in various ways. I'm curious if any of them stand out to you as worth a chatting about. 

[00:29:37] Claire Zau: Yeah, obviously always exciting to see great partnerships in in the space, whether that's Juicy Summit and BET or all of the four that you mentioned.

I think generally just at a super high level around how the market is evolving, I do think we're seeing a lot of consolidation in the ecosystem, and I think that's good for the ecosystem. It means people would like to build things that eventually exit to other great companies and ones where there are clear synergies.

I think for example, for curriculum associates and style education, it makes a ton of sense to have that partnership to expand another vertical in science. I think in the case of Alpha School and Higher Ground, my understanding with that one was that it was mostly an acquisition of physical campuses and really with the plan to accelerate their campuses expansion.

I think right now they're only in Texas and Florida, but it sounds like they're hoping to expand to other cities and the higher ground education would support this expansion since I believe they already have assets and campuses nationwide in New York City and Scottsdale and whatnot. So I think exciting to see that.

I think that is driven a lot by. Excitement around AI and being able to learn things in two hours, but also right. School choice and, and alternative school models I think are seeing a lot of momentum. And then on Praxis Labs, an interesting point here is we've also been been tracking Praxis for a while.

Great company and, and really interesting in the XR space. But I do think that XR has historically been quite expensive to scale, just given the content friction of having to take time to build content, to hire 3D designers to build individual modules. So I'm excited that Torch is leaning more heavily into the VR and XR modality.

I think it is a great shift and I'm actually excited to see, especially within the AI era, how they can push out more content just given unlocks like Genie three, but also even I think, yeah, I, I just think generally the VR space is going to be accelerated by ai. 

[00:31:44] Alex Sarlin: Yes. Fantastic rundown and yeah, I didn't wanna step on you there, but yes, genie three, there was some, anybody who hasn't seen the demos of what Genie Three.

This is a Google's model for creating 3D worlds. Really, very, very exciting. Basically, the ability to almost dynamically and from prompts create immersive 3D worlds that. Could be a, a game changer for ar vr in terms of the cost of content creation. It also could be a game changer for what the future of educational, of, of even tutoring looks like.

If an AI tutor like Gemini or open AI study mode can say, oh, you wanna know what it's like to X, Y, Z, or, this is what you're trying to learn. You want, you're learning chemistry. Let me drop you into a, an atom right here. And you can look around and I mean, we are very close to that possibility. That is something I think a lot of us in ed tech have dreamt of for a long time.

But that's a great, you're, you're, I mean, I'm, I'm tempted to make a sort of moment out of this idea of the Montessori school campuses turning into Alpha schools, which are based on this sort of like accelerated AI learning model. It feels like you could do a whole thought piece about how that's like a shift in, in educational philosophy, but I won't because I'm not sure if that's true, but that was a great rundown and I, I think it's a, it's really interesting to see everything coming through this space.

It's, I mean, I. Philanthropic launched a new model just yesterday at 4.1. We're seeing open ai, open sourced it's model. I mean there's so much happening. We, I don't even have time to talk about it, at least it's open weight reasoning models. But we'll have to get into some of this stuff at Future podcast 'cause we have some amazing guests lined up for this podcast, including Brittany Miller from the outcomes based contracting world.

We have Evan Harris who does deep fake consulting for schools. Amazing folks. Anyway, thank you so much, Claire Zau is always a incredible pleasure. I just get like a high off of talking to you about this amazing space and I just think EdTech is entering a completely new era and I love the BET GSV combination.

I'm just so thrilled to see that coming together. Any final words? Anything we didn't cover? 

[00:33:43] Claire Zau: No, this has been so fun. As always, I'm excited to see more developments like I'm about to go play around with guided learning from Gemini that just dropped an hour ago. And yeah, excited to see how all this evolves and hopefully I'll see many of you listening.

I'll see you Alex Ben at the next A-S-U-G-S-V bet. Events that are coming soon. Oh 

[00:34:04] Alex Sarlin: yeah, should be a blast. I can't wait. Thanks Claire. And here are our guests, tha thanks. Thanks for being here with us on EdTech Insiders, and thank you all for listening. If you hear it's gonna happen in EdTech, you'll hear about it here on EdTech Insiders.

For our deep dive this week. On this week in EdTech, we're here with Derwin Sisnett. He is the entrepreneur in residence at Cambiar Education and CEO of Adaptive Commons, focusing on transforming civic spaces for social good, especially for educators. He co-founded Maslow Development, a real estate and community development firm, and Gestalt Community Schools, a highly performing charter network in Memphis, honored by the White House.

He holds degrees from Emory and the University of Memphis, a PhD from the University of Memphis, and is alumnus of Harvard's Graduate School of Design. He's also a great guy, generally, and a great coworker. We've been working together for quite a while now. He's amazing. He's also an Echoing Green Fellow and a Bahara Fellow of the Aspen Global Leadership Network, and he's here today with a big announcement about Cambiar Thrive Grant, the Cambiar Thrive Grant Derwin, welcome to EdTech Insiders.

Oh, thanks for having me, Alex. I'm so excited to have you here today. So first off, you know you have this incredible background, but you're here today because you have an announcement and it's about the Thrive Big Ideas challenge from Cambiar Education. Tell us about what is the Thrive Big Ideas challenge, and why should it be something on everybody's radar who's listening to this podcast?

[00:35:28] Derwin Sisnett: Yeah, you, you mentioned that I'm an entrepreneur and resident at Cambiar and Cambiar has supported the work that I'm doing and this is a great opportunity for me to pay it forward and lean into support other entrepreneurs, particularly in the access to student data space. And so we had our Thrive cohort from past years and we are now revamping that to really introduce not just giving access to student data for parents so that they could actually inform the work that's being done, but also what does it mean to give students access to data in ways that give them agency and support their wellbeing and, and, and for them to thrive.

And so for. Cambiar is a, a venture studio that really invests in entrepreneurs to support education broadly. But as we go more specifically into the Thrive work, the core of our work for Cambiar Thrive is to address the significant gap that we face in education today, which is the information gap. So we know there's all the buzz and talk about AI and, and how the world is evolving faster than we could even catch up to.

And if you could imagine, students and families are often the ones left behind in that. And so we think this is a great opportunity to really invest in entrepreneurs who have really big ideas to support access to student data that not only support parents in getting that access to what, what they could do with their students, but also what it means to support students with what they can do with that data.

[00:36:43] Alex Sarlin: So tell us a little, when you're talking about student data and access of student data, for parents, this is really about solutions within the home and school environment that allow data to sort of bring both the home environment with the parents and school with teachers and students closer together through visibility and access to student data.

Give us an example of somebody who's doing a good job of this so far, who may be in the Thrive community already. 

[00:37:09] Derwin Sisnett: Yeah, and just to underscore, we're we're not talking about getting parents report cards and that tells their families or, you know, caregivers that, okay, the student's an A student, B student, C student, we all know that those metrics really change depending on the environment.

And sometimes it's skewed. Oftentimes it's skewed. And so we're talking about folks who are really diving deeply, and I, I can use a few examples. I mean, one that comes to mind is Paloma, right? Alejandro, an amazing CEO, we're, we're excited to be supporting his work. When you think about what it means to caregivers, how much time they spend with their, their students, right?

They are the first teacher. I tell people all the time, my mom, my dad, they were my first teachers. And so with Alejandro, with Paloma, they're doing, they're actually putting reading data in parents' hands. Helping parents teach their kids how to read in their own language, in their own way, in their own cadence.

And so as parents who typically don't feel like they have that agency or that access, getting that power in their hands and being able to provide that support to their students means that students are actually achieving at higher levels because they're not only getting that dosage of reading at school, but they're also getting it at home.

It becomes the, the night routine, if you will, for kids. And so it makes them more excited to get into reading in ways that wouldn't happen if you only relied on what happens in the classroom. 

[00:38:24] Alex Sarlin: Absolutely. And yeah, I mean, we had Alejandro on the podcast a while back, and his model is so interesting. It basically asks parents to do 15 minutes a day at home of reading practice and support.

But it gives them, as you said, all of the data, exactly what the student is struggling with, what they're studying, what would be useful, and even creates automated stories for them to work with that actually build the decodables into their practice. So parents have this 15 minutes of very guided instruction, which has incredible results.

So the Thrive Big Ideas challenge that's open right now. This is a funding opportunity for EdTech companies. Right. What are we looking at if EdTech companies wanted to apply right now, who should be applying and what might they expect the process to look like? 

[00:39:06] Derwin Sisnett: Yeah, I mean, we're looking for, I mean, you just mentioned, we just talked about Paloma.

We're looking for big ideas, right? We're not looking for the thing that's just gonna work around the margins. We're looking for organizations that are working to provide students and caregivers with better information. These can be nonprofits, startups, even existing organizations that wanna pilot a new project.

The main thing is that they must be a legally operating organization in the US and have a fresh, practical, and scalable approach. Nothing that's already been on the shelf, that isn't a big, big, big idea. We're looking for leaders with the credibility, insight, and, and early traction, frankly, to make their idea a reality.

[00:39:39] Alex Sarlin: Yeah. And so, you know, you mentioned this, but I think it's worth double clicking on. This is for both nonprofits and for-profit companies either can apply to this. It's not a pure grant that's only nonprofit ready and it's not safe note. That would only be for a for-profit company. Any ed tech organization that is doing this type of interesting, meaningful, big idea work is eligible for a Thrive grant.

[00:40:03] Derwin Sisnett: Is that right? That's exactly correct. And what's unique about this opportunity versus the first opportunity is that we've actually split into two distinct tracks. So this time we not only have a caregiver track, but we have a student track as well. So for the student track, this is for ideas that give students direct access to their own data, helping them to understand it and then make better decisions about their learning.

And then for the caregiver track, this is for ideas that provide parents and caregivers with innovative tools to better understand and support their children's learning journey. 

[00:40:30] Alex Sarlin: Yeah, that's really smart. And I think that giving students access to their own data is a area of ed tech that is so fascinating.

And you know, that feedback loop, there's so many different approaches and ways to take on that challenge of giving students access to their own data in a way that's actually gonna be beneficial for their future knowledge beyond, as you say, you know, report cards and grades and the kind of things that can be very divisive.

So tell us a little bit more about the background of this Thrive Grant. It's about, you know, using data and information to connect the different parts of the ecosystem that are not always connected and improve outcomes. What gaps in education or community development, because this is beyond the school, are you seeing right now in the ecosystem that Thrive?

Big Ideas Challenge is really designed to address. 

[00:41:14] Derwin Sisnett: I mean, we're seeing a wide range, and I can, you know, go back to some of our past grantees where, you know, one platform they, you know, really looked at how to connect teachers and families via text to see, uh, how they would think about the decrease in absenteeism rates, right?

When teachers use their tools. And so the gains that they saw were even higher for Spanish speaking students, for instance, who averaged, you know, six additional days of attendance during the school year. And so this organization, when we talk about big ideas, they've been able to reach millions of families and educators.

In order to do their work. And so that's, that's one example. But you know, going onto the other side, we have another grantee who focused on indigenous education and demonstrated success with a, you know, increased graduation rate of 88.6%, which is 20 percentage points higher than average. And so this was done by involving families in a community-led assessment process.

A parent teacher planning tool. And so we see a wide range of what these kinds of ventures could be, and so we aren't prescribed to one, but we really believe that these ideas have to be really big ideas to make changes that really look into impact, scale and sustainability. 

[00:42:17] Alex Sarlin: Yeah, that's a great point, right, the absenteeism.

There have just been some recent reports about incredible decreases in public school enrollment over the last few years. Absenteeism crisis continues, literacy crisis continues. You mentioned Paloma, there's, there's disengagement at school, there are language gaps, there are special populations that need additional support to get to graduation.

Lots and lots of problems being addressed. So one question, you know, as we, you know, the EdTech companies and nonprofits that are listening to this. Maybe hearing this and and saying, I wanna look at this, thrive Big ideas and see how it corresponds to what I do. If a company already has a really interesting, scalable, big idea solution, but it's not necessarily about caregivers, should they still be considering the Thrive Grant is, or are they sort of out of the running?

[00:43:03] Derwin Sisnett: I would say if their work is focused on access to student data, whether that's access to student data for parents or for students, they should certainly apply If it, it falls outside of that range. That's not to say there won't be opportunities in the future. Just in this moment, our main focus on those two specific tracks.

[00:43:21] Alex Sarlin: Gotcha. So anything that involves making student data more accessible and usable for either the students themselves or for their home environments and their families and caregivers, which is a wide range, which is as it should be, should be considering the Thrive Big Ideas challenge. Exactly. Exactly.

Fantastic. You know, I mean you have such an interesting story. Before we we go even deeper into the Thrive Challenge, I think we have a couple more questions about it, but tell us a little bit about what Adaptive Commons is. 'cause it's such an incredibly interesting organization and I'd love you to describe it and talk about how what you do also sort of helps bridge the gap between the educational and home environments for families and educators.

[00:44:00] Derwin Sisnett: I'm a big believer that there's not a, a single solution or a silver bullet for any of the work that we're doing, right? We really have to think about this as an entire ecosystem. So when I think about Cambiar art thrive, what's really incredible about that is that we're addressing the problems and opportunities from a programmatic standpoint.

And then there's another side of this, right? There's the built environment, the wraparound infrastructure. How do we address the problem from that vantage point as well? And I think about my own upbringing. You know, my parents essentially. When I think about Thrive, my parents were Thrive parents, they were the caregivers that we envision as we think about Thrive, where they were empowered.

They felt agency to take the data that they had about me to then, you know, better inform what was happening in the classroom, both from myself and for me and from our brother. When I think about the work at Adaptive Commons, we realize that wait, teachers typically can't afford to live near where they work.

So we're having a hard time recruiting talent. We're having a hard time retaining talent at the same time, school buildings are being closed, left and right, and these are assets in the community. And so what might it mean to reimagine those assets and reenvision them for housing that's affordable for not only teachers, but other.

Workforce professionals, right? And so imagine these, you know, informing the ecosystem and building out the built environment at the same time, having programs like the ventures that are supported by Cambiar R Thrive within or around that ecosystem. And so I think about this work really holistically as Cambiar R does, which is why it was great to team up as an entrepreneur in residence.

And as I think about the work more broadly, there's so much impact that we could have if we think about the work, not just what happens inside the walls of a school, but also outside the walls of a school. 

[00:45:38] Alex Sarlin: Just to make it concrete, tell us a little bit about some of these school buildings that you have helped convert into housing for educators and other professionals.

[00:45:46] Derwin Sisnett: Yeah, and I'll also say the beauty of this is that converting schools to housing isn't new. It just hasn't been done at the scale to which it should be done. And it hasn't been done by and large in ways that are affordable. You typically see them become high-end lofts or hotels, but never for teachers or essential workers.

And so in my past work, particularly at Maslow, and even at Gestalt, we really looked at transforming whole neighborhoods, right? And whether it's a dilapidated department complex in reimagining that for a school of performing arts and affordable housing, or a vacant shopping plaza directly across the street from what I just described, and re-imagining that for, uh, health and wellness and high school and elementary school.

And so adaptive reuse. As I think about that terminology, really, you know, for the US we're a relatively new country. It isn't as common for us as it is for places in Europe and, and other continents where the building stock is older. It's just, it's commonplace to reimagine these spaces. And so while I've done that in my past work for Adaptive Commons, our focus is squarely now on taking these civic spaces, whether it's school building to other municipal loan spaces and reimagining them for civic good, primarily housing for teachers and other essential workers.

[00:46:57] Alex Sarlin: That is the perfect example of a big idea, and I think you, it's really quite incredible what you've done with that. So let's just get back a couple more questions about this really amazing opportunity with the the Thrive Grant and the Thrive Big Ideas challenge. You mentioned AI in passing here, and one of the things for me, maybe this is my, you know, my bias here, but whenever I hear anything about student data, two things flash into my mind.

One is. Security and privacy and making sure that student data is handled really carefully. And that is obviously top of mind for every ed tech company that handles student data. But the other is AI student data can be used as the fuel for personalized learning, or I like to call precision learning. So you know, you mentioned Paloma, they literally develop decodable texts for each student every week that these parents can use at home.

And it's done using ai, but also using the underlying data about what the student is learning. I'm curious, as you see these applications coming in, how you are thinking about the relationship between student data and the AI that could be built on top of that data. 

[00:47:59] Derwin Sisnett: Yeah, so we're seeing some come in now and I don't wanna give anything away.

While, while the application is live and, and I might have failed to mention it closes August 31st. So if anyone listening hasn't applied yet and is interested in applying, there's still time. So I don't wanna give away specifically what I'm seeing. However, I will say that it's more than likely that AI will be a big part of most applications.

As we think about scale potential, as we think about impact potential. And so data privacy is huge for us, right? There needs to be some really important part of the work that is undergirded by data privacy because that work is, or that data is so sensitive, particularly as we think about students. And so that will definitely be a big part of, as we screen the applications a big part of what we think about.

[00:48:45] Alex Sarlin: Yeah, it makes a lot of sense. And then you mentioned this already, but I do wanna dive into it again because I think it's so relevant to potential applicants, the Thrive Big Ideas challenge. It's about big ideas, but it's not necessarily about organizations that already have reached massive scale. You're very open to early stage companies, to companies that are relatively new on the new entrance to the market if they have really transformational ideas.

Tell us a little bit more, so if you're listening to this as an entrepreneur, tell us a little bit more about what you, you, you mentioned, I think they just have to be sort of a setup organization in the US based and things like that. But what type of scale? Is there any minimum amount, amount of scale that is required here?

Or can companies, you know, I see a lot of incoming companies made by students that are doing really interesting work with ai. If a startup has just started and is just starting to get off the ground, should they be considering the Thrive Grant or should they wait until they're a little bigger? 

[00:49:40] Derwin Sisnett: If you are asking me, as someone who wasn't a serial entrepreneur, I might hesitate and say, well, no, give it some time.

But as a serial entrepreneur, I know what it means to get some early traction and then be able to run. And so, so long as an organization has early traction, they don't have to be in existence for five, 10 plus years or have 10,000, 20,000 users. They have a pathway to that. That's what we're, we're looking to see.

Right. It's a big idea. It isn't big in existence. Right. It is an idea that is really big that we really believe in, that we wanna support, and we're giving a hundred thousand dollars grant to these ideas, right? And an opportunity to double down on select grantees down the road. And so for particularly if you are a startup or an existing organization that is spinning out a new entity that has this big idea, we're looking to support that.

And we believe that our investment could go a really long way. And I, I can say that from personal experience. 

[00:50:33] Alex Sarlin: Yep. Yep. Very well put. If you have traction and a a trajectory towards being able to really reach a lot of people, then you should be thinking about applying and I think that's a really exciting opportunity.

Durwin, this has been so interesting. You know, Durwin, you and I have worked together over the last few years in various ways because I am also worked with Cambi in all sorts of ways, and it's an incredible organization, always focused on equity, always focused on educational opportunity. As you think about this next few weeks as people are applying for these grants, and I know that they're gonna be a good amount of applications, what are you most excited about as you see these ideas come in, all of these different ways to use student data for caregivers, get you student data for, for students themselves.

When you sort of get up in the morning and think about all the different applications sort of coming into the inbox, what gets you most excited about the Thrive Grant and all the potential that it could provide in the world? 

[00:51:24] Derwin Sisnett: There are a couple of things. One, I'm super excited about what I don't know, there will be applications that come in with really big ideas that touch on parts of student data that we hadn't even thought of.

And I'm excited about seeing that for the first time and I'm excited about supporting those entrepreneurs and maybe even being one of their first supporters through our Cambiar Thrive Grant. So really excited about that. I'm also excited about, you mentioned equity, like we've opened up a space for entrepreneurs, whether you mentioned students, caregivers, like this isn't just about full fledged companies that already, you know, they've been around, they, they have a name and a brand like that's great.

And we we're excited for, for those companies to apply as well. We're also excited for the new entrepreneurs that are more proximate even to the problem that we could support. And so I feel like we're gonna get applications and, and we're already seeing previews of that, that really go across the spectrum and I'm really excited to see the diversity of applications there.

[00:52:19] Alex Sarlin: It is really amazing to see all these different ideas come from different parts of the educational ecosystem, from Teacherpreneur, you know, who are coming outta the classroom, from nonprofits who are looking to expand into data and ai. You know, I've worked with some of the, the Thrive grantees and the, the previous cohort and there's just the diversity of ideas is incredible.

You have individual schools doing really interesting things with data and with portrait of a graduate type work within their environment. You have tech startups, you have nonprofits that are trying to get the word out about how to use data at home. You have language, you know, one's working on special populations or with language translation.

Mental health data is a really interesting, for me, I think is a really interesting new set of data that's starting to come out in schools or social emotional learning or wellness data. It is really, really a really interesting space. Thanks so much for being here with us on EdTech Insiders, and just to remind everybody, August 31st is the deadline for this Thrive Big Ideas Challenge.

So if you have an idea and a big idea that you're working on, and I know so many of you do, now is the time to apply. Thanks so much Derwin for being here with us on EdTech Insiders. Thanks so much, Alex. Hello Tech Insiders. Today we have a very special guest, Dr. Alan Bekker, co-founder and CEO of eSelf, an AI company redefining human machine communication through realtime face-to-face video assistance.

A lifelong student of both human nature and science. Alan holds a PhD in machine learning and AI with published research spanning voice, NLP, and Computer Vision. He previously co-founded Voca ai, a voice-based AI agent for call centers that was acquired by SNAP in 2020 at Snap. He led conversational AI productization during the early wave of large language models.

Alan is also an angel investor in early stage AI startups and a graduate of Harvard Business School's OPM program. And today he's building AI assistance trusted by leaders in finance, real estate education, and beyond. Welcome to the pod, Alan. Hey Ben. Thanks for having me. It's a pleasure to be here today.

So today we're really talking about the EdTech use cases for ai, and you recently launched a national rollout of AI tutors for students in Israel. What were the biggest technical challenges, but also ethical challenges that you faced? In deploying this at such scale? And how did you address 

[00:54:39] Dr. Alan Bekker: them? That's a great question.

So maybe we can start first by explaining what we did in Israel. So, in Israel, in the 12th grade, students need to take an oral exam in English. It means that the, usually they have a Zoom conversation between themselves and, uh, human teacher. That the human teacher ask questions in English. Right? And in Israel, the people don't speak English natively.

They speak Hebrew. In, in this exam, the teacher asks questions in English and the user, the student needs to reply. And depending on some metrics, the teacher provide a score based on fluency, vocabulary, and so on and so forth. So this exam usually takes 20 to 30 minutes overall. So what, what we did together with CET ct, they're the largest educational publisher in Israel.

We created a simulation tool that simulates the teacher by the AI avatar. So the same way in the final exam, they have a Zoom conversation with the human teacher. So they had a practice tool, which is like basically a Zoom conversation with an AI avatar. And the AI avatar simulated the same questions and that the real teacher does, meaning that the user, the students, they got the chance to practice as many times as they want.

AI avatar through the Zoom conversation of how to respond to questions and how to practice, you know, the verbal skills in English. We rolled it out to the full 12th grade of all students in Israel. That was an amazing pilot that we also, we have shown that students that practicing with EO, they got 4% more on average on the exam compared to the students that didn't do that.

So that, that was the story, what we created. So back to your question, I think the, one of the biggest challenges, first of all was to convince the market that an AI avatar can simulate a teacher. 

[00:56:53] Alex Sarlin: And were you distributing through the schools, through the Ministry of Education or direct to students 

[00:56:58] Dr. Alan Bekker: themselves?

We partner with ct, which is the largest educational publisher in Israel, and they distributed it through the schools. This is the how it works. So we didn't did by ourselves the B2C thing, but CET did that for us. 

[00:57:13] Alex Sarlin: Great. And as you were scaling it, what were the main objections? What 

[00:57:17] Dr. Alan Bekker: was the skepticism? I think the skepticism mainly came from at the beginning, from the schools because you know, obviously schools, they have some conflict in some sense because they were introduced with an A avatar that in some way they felt at the beginning that might compete with a teacher, with the human teacher, right?

Because imagine that we are simulating the conversation of the exam by an AI avatar. So maybe, you know, they had concerns that maybe the avatar eventually could replace the teachers. That's a fair concern. So we had like many issues around it and we need to explain that we're not replacing the teachers.

The teachers and, and the reason why people go to schools, it's way more than just get the education people go to school to socialize. Right? So we had to explain that the AI avatars, they are like a private teacher, but with easel based on ai. So the teachers at school, they're not in any risk, but the students that they can't afford themselves to pay to the private teachers.

Now they have the access to a tool that does that even better than private tutor because it's available 24 7. It doesn't judge you. You as a student, you know that you're speaking with an ai, so you have like more openness to speak with it. You have more openness to have mistakes, right? When you speak with a person, there are many people that they have.

Uh, one of the reasons why they can't speak well is because they're being afraid of making mistakes. But with an ai. 

[00:58:55] Alex Sarlin: We see the same dynamic in early literacy where younger kids don't wanna read out loud 'cause they're worried about being judged. Where when they read to an AI tutor, they know it's ai, so the risk is lower.

Can you tell me a little bit about the technical challenges? So you're rolling this out, of course. Different people have different pitch in intonation, they have different devices. How did you think about the technical specs, especially given your background in, in voice and voice ai? 

[00:59:26] Dr. Alan Bekker: So, there were many, many challenges and many of them were tackled before this specific project that we did in, uh, in Israel.

So, yeah, so you, first of all, you need to support many devices and many browsers. So we went with a Pareto. So meaning that we'll support like. 95% of the devices we want support, like all generation devices obviously. And if you use like a browser that it's not common, we don't support that because we want to support mainly the, you know, 95% of the users, like every company does in a more, like, I would say AI technical issues that we had.

So the tool that we created, which is an AI avatar, a zoom conversation between a human and a machine. Usually there are a great fit for specific use cases in which the user is in full. I would say focus of the conversation, right? Like right now I'm in a quiet environment, right? If someone will I close my door?

My kids, obviously they're not running around me because I mean, I'm taking it quite seriously and know I want to, to provide the best audio possible. Mm-hmm. And to have know how don't having like difficulties around the conversation. So we hear the tool that is AI based. So if a user speaks and there's a lot of noise background is there's an issue around it, right?

Because the AI will hear the noise as well. We might have like force alarms and many stuff that we don't want to get. So specifically for this specific project, because we understood that the students might be using this tool from noise environments, we created for them a specific feature, which is a, you know, was a, it's a very simple feature actually.

It's like kind of a downgrade of our avatars. It's a tap to talk button, which means that. If the student is like driving a car or just, you know, riding a train and there's not a lot of noise background when they speak, they can just start to talk and then it would reduce the noise background. So we kind of, we did some compromising on the product itself, kind of even downgraded with the, the conversation because obviously we prefer to have a end-to-end conversation completely fluent without the tap to talk, uh, button to overcome this difficulty that we found with, uh, real life students, that they take those simulations from non quiet environments.

[01:01:48] Alex Sarlin: And given that your use case was being a language tutor, if they wanted to interact with the AI and talk about other things off topic, were they basically able to have a generalized AI tutor? They could talk about anything, or was it specifically language? What if they switched into Hebrew or another language?

How did you kind of put the guardrails on it? Because in a way, like part of the feature of AI is it can really talk about or do anything. 

[01:02:16] Dr. Alan Bekker: So it, it was another, I would say, restriction that we gave to our tool in order to provide the right guardrails. I'll give you just an example. So there was a chat bot in Israel a few months ago by the Ministry of Education that it was released by them and they have some issue that one student asked a question about.

Some political stuff around, you know, Hamas and Israel, whatever, and the AI gave an answer that, that was not politically correct, I would say. Right. The AI didn't condemn like the aist, whatever, so they took it off because of that, because they, they created many political issues, like a lot of bad PR and so on.

So it was very important in these projects that we create the right guard rails because we knew that students, specifically in the 12th grade, they will try for sure to throw all the AI right? And they will try as hard as possible to make this AI say things that, that are not appropriate in many aspects.

So we, we invest a lot of time in testing. The guard race that we created by prompting the AI avatar to stay in the, the limits, the borders that we created. So we invest a lot of time in testing to be sure that, you know, nothing bad happens because we knew that even one student having one conversation and they will record it because the trollers, they will record the screen while they speak with AI avatar and they will see the logos of, and CT and the hbar that also took place in this pilot.

And they, they will record, you know, the error, the AI avatar saying like, bad stuff in real time. We knew that we completely ruined the project. So we invest a lot of time in testing and be sure that the guard rails that we created actually work and they can't control the system. So luckily we, we completed the pilot not only successfully in terms of the final exam, but also without any bad stories.

So it was a quite a release afterwards. That's 

[01:04:21] Alex Sarlin: great. Given the success of this, where do you think this is heading? You've had super deep expertise in building multimodal, especially voice ai. Your company was acquired by snap, which is at tremendous scale. In this case you use the distribution partner who was a curricular partner.

Do you view that the future is companies developing targeted AI use cases and then partnering with distribution channels? Do you think that eventually all of this will just be done in chat GPT or in Gemini or in a in cloud, in a global browser? You know, just this week OpenAI announced a study mode.

Like where do you think the space is heading and what is the business opportunity 

[01:05:06] Dr. Alan Bekker: if your company is a fully B2C company? So probably having distributors is not the right move because you can probably build something on top of like a wrapper on top of JGPT or Germany and wherever. But then if you're a B2C company, you need to either have like amazing product, best product as possible, needs to be viral or invest a lot of money in marketing and, uh, user acquisition.

Having a viral product is not straightforward, right? Even if, even if you have like a great product, doesn't mean it will be viral, okay? So that's something that you can't really rely on and creating, you know, um, nice wrapper on top of those LMS is not really defensible. So even if you invest a lot of money in marketing, the churn that you have might be that big, that you will lose money eventually.

But if you are building something that it's not only B2C but it somehow relates to something that is more institutional, for example, articulation exams or SAT exams, something that is established, something that there are standard around it, then you still need to do the B2C, but you need to make sure the content you provide, the, the, the wrappers of your AI agents or avatar, they're perfectly aligned with the rules of the system.

Right? So we couldn't even if we wish as ease of to create, uh, simulation of the oral exam in English by ourselves because we don't know how to operate in this educational world in Israel because it's a regulated. Think we can do whatever we want, but CT for us was a, the perfect partner because they're fluent in this world and by partnering with us, they help us to create the right environment, which the AI can fit into the system.

So the partnership wasn't only about distribution, which is obviously was great for us because we didn't invest in marketing. Right. But was also great in terms of creating and uh, and wrapping the product in the right way. 

[01:07:09] Alex Sarlin: Do you feel like you had an advantage, because Israel is such a tech hub now, that maybe the project that you did, I think in the United States would get so much pushback, but given that people are more tech forward or that kind of, the startup per capita is higher, do you think that it was accelerated?

Or do you think it's the opposite? People have more of a traditional view of education and it was harder to distribute. And how, how does the tech infrastructure in Israel play into all of this? 

[01:07:41] Dr. Alan Bekker: I thought about about it yesterday, I had a, a call with some big players in America, okay. Not only North America, but also like Latin America.

And I even shared with them, you know, the results of the pilot that we had in Israel. And they were so suspicious around it. They were like, I was showing them what we did, which is, I mean, when I show it to Israelis, they say, oh my God, that's amazing. But the only thing that they had to say about it was, you know, what?

Uh, but what happens if the teachers won't accept it? And, uh, what hap I mean, the, the only thought about the worst case scenarios. And in order to innovate, you need people that you know are technology open and open-minded to innovation. So my answer to your question is. Yes, yes, for sure. Yes. I not only I, I knew that in Israel would happen.

I actually proved myself by talking with other countries to see that Israel is quite advanced and they're very open in innovation and in education. Even though our tendency is to have a religious country overall. And we are traditional in some sense, we are quite open-minded in terms of bringing innovation to education and to other things in the, you know, in our lives.

So I think the fact that we live in Israel not only allowed us to attract amazing talent that is tech savvy and open-minded and, you know, they think out of the box, which is great, and this is why we are called the startup nation, but also having the design partner in Israel that is willing to accept this tool and convince the, you know, the schools to take it and use it and to promote it.

And not being afraid of maybe failure even. That's something that was an amazing, uh, opportunity for us. 

[01:09:25] Alex Sarlin: Given your technical expertise, and you've not just worked in education, but you've worked in other places, what gets you most excited about educational use cases of ai and what are the biggest challenges as you compare it with some of the other industries that you've worked in?

So 

[01:09:42] Dr. Alan Bekker: I, I think education, it's like the holy grail of everyone in technology. And I think the only reason why we don't see startups in education is because it's a market that it's highly regulated. They're very conservative. So it, it's really hard to create startups that like they're really growing fast, right?

Like one of the most successful companies today is like dualing, right? In, uh, and you have like Coursera, but, but even them, they're not huge. They're big. Okay. They, there are a few billion dollars market cap. They're not hundreds, right? But if you think about it, in Israel, for example, almost 30% of the annual budget of the country is going to education even larger than the Army.

Yes. Before the war. Right now with the war, the army budget went up, but before the war, the educational budget in Israel was higher than the Army. One. Two G is crazy and there's a lot of money in education, but it's really hard to start up to disrupt the field because they're moving really slow and then, and that they're, uh, very conservative.

But the holy grail of everyone was always to impact education because when you create a product technology that can have impact on the early stages, early years of someone, it can really impact his life to be an educator. And I was an educator by myself when I was young in youth movements and, and afterwards as well.

So you see the value of your creation. The product that you create is people. I mean, it's, it's amazing. It's not an abstract thing, it's something you create, something that I impacts the life of the people that you work with. So it was Al always the holy grail. But one of the things that changed recently was that in education, the information as of today until now, until the AR revolution was always centralized within specific people, right?

Even in, I don't know, in science or even in religion, right? There are specific professors, rabbis that they knew the information, they kept it usually close to the heart, and even if they wrote a book. It's hard to read a book, understand the specifics, right? Not every person can read a book in physics and understand it.

Most of us, we can't. We need to have a private teacher on, like someone that write it for us in the board. But what is unique about the AI revolution that's right now, the information is democratized completely. There is no single person in the world that knows more than the lms, right? But the only thing that we need to solve because information is, is accessible.

The only thing that we need to solve is how to provide the right experience and engaging experience to the students to bridge the information that it's accessible in the LMS to their lives. And that's, if you think about it, what we do at SO, we create. From LLMs and fully interactive and emerging experience, which breach the information into their specific lives by, and this is actually what's a private tutor, if you think.

Think about it. It's someone that has the information. He knows the the students, and it can bring it to you. So that's exactly what we do at ol. We try to breach the information that is accessible in the world with the right experience, personalized for the student in order to have the highest impact as as possible.

[01:13:13] Alex Sarlin: Well, we're gonna leave it there. That's really inspiring vision of the future. Democratized education with engaging experiences for all. Dr. Alan Bekker , co-founder and CEO of eSelf. Thanks so much for sharing about your company, this incredible journey in Israel using voice tutors. And if people wanna find out more, where can they learn more about ES self?

[01:13:36] Dr. Alan Bekker: They can just go to our website, www.eSelf.ai. Awesome. Thanks so much. Thank you, Ben. Thank you for having me. 

[01:13:44] Alex Sarlin: Hello, EdTech Insider listeners. We have a special guest today. Andrea Pasinetti is the co-founder and CEO of Kira, an AI platform on a mission to make high quality future ready, learning accessible to every student As seasoned entrepreneur with a background in engineering, Andrea has spent his career building products at the intersection of technology, education, and social impact.

He co-founded Kira with the support of AI pioneer Andrew Ang, and under his leadership, the company has partnered with governments and school systems across the US and around the world. Andrea brings a global perspective to how AI can transform teaching and learning at scale. Welcome to the pod, Andrea.

Good to be here. Thanks for having me. We're gonna dive right in. I'd love to hear a little bit about your background and how the idea of Kira came, and then talk a little bit about what makes Kira different from the other kind of chat, GPT wrappers. 

[01:14:39] Andrea Pasinetti: Yeah, absolutely. We started Kira before LLMs were commercially available.

So we actually started building Kira on a more traditional NLP stack and the original purpose of the product. And it was actually a project before it was a product and, and before it was a company was to provide. To non-technical instructors of technical subjects. So the original permutation of PIRA was really a browser extension intended to support teachers who were teaching computer science or artificial intelligence or machine learning for the first time, but also didn't really have much of a background in those subject areas.

This was fairly common with classroom type environments, with the emergence of, of MOOCs like Coursera and Khan Academy, where you had students working through lecture content on, on their devices, but then needed to interact with features, many of whom weren't necessarily trained in some of the technical subjects that, that they were supporting students 

[01:15:42] Alex Sarlin: on their learning floor, which of course Andrew was very familiar with that use case.

Having founded Coursera, and I believe he has like an incubator that is focused on ai. And funding AI projects like this with natural language processing. I don't know that we knew the LLM boom was going to be so big and so fast. That's exactly right. 

[01:16:04] Andrea Pasinetti: Yeah. So here I was one of the first companies to come out of, of AI fund.

It was sort of the, the very first vintage of AI fund companies. This was back before AI was kind of the, the topic, the drawer. But yes, the tech stack was very different. And this was, you know, two years ago. So two and a half years ago. We're not talking about a distant horizon, but the tech stack of, of building.

And AI assistants at that time was very different. And also, I would say much more friction and the, the quality of the interactions you could get was inferior. So when LLM became commercially available, obviously with chat g pt, we shifted the tech stack and also of all the, the approach of the product and, and made it a much broader set of offerings for learners, for educators, as well as for administrators on the wrapper question, you know, there, there are a lot of companies I've noticed in the past year, and I think we all have substantial exposure to them that they're sort of discussed or or referred to as GPT wrappers.

And I think the more appropriate term is actually prompt wrapper. What I think has happened in K 12 especially. In the realm of supporting teachers with efficiency and sort of their effectiveness in providing instruction is using chat GT presents certain challenges. Sometimes it's flocked entirely in schools.

Other times, writing an effective prompt can can be a somewhat daunting or or more challenging task, especially for first time users of prompt based AI tools. So there's been an explosion of prompt wrapper type tools and, and by that I mean AI tools that allow users to take prompts and break them into smaller chunks with some guidance and direction on, on what that chunking should look like than what the input should be, and then stream back more high quality content from elements.

That was sort of like the first generation of rappers that I would say has, has gained traction over the past 18 months. I think what we're seeing now and and really where Hero has been more opera operative is on the GPT wrapper front, and I think that's actually a misnomer. It's really more a shift towards AI agents, so AI tools that are doing really effective end-to-end work for users and not just streaming back output from LLMs.

And I, I think that's a big shift we're gonna see this year in school districts in conversations with districts. I can already sense that there's, I think, growing impatience with simple prompt wrapping and a real desire to, to push the envelope on what AI can do for teachers and learners. And, and that's where we see the shift happening.

[01:18:48] Alex Sarlin: Yeah, it's really fascinating and there's a way in which it's happening so fast that it's hard for education to keep up in another way. And, and you know, I've been on the school board of my local school district because things have been changing so fast. There's almost been a rational reason to wait as a school district and just say, you know what, if we wait six months, it's gonna be meaningfully better.

And so far that's proven to be true, actually. I think people are now diving in and understanding that you pick a partner and then things graduate very quickly. But from your vantage point, how are districts, schools, and educators. Changing the way they approach, adopt and deploy AI tools. And are there notable trends or shifts emerging in procurement implementation or classroom in integration?

I think it's really 

[01:19:43] Andrea Pasinetti: interesting and, and this is a shift that's probably either under reported or under observed, but I think it is a pretty seismic one historically, as you know, better than anyone. Have been the leaders in making decisions about what software to use across their schools or you know, across their districts.

And it's a top down both procurement approach but also rollout approach, right? So an administrator who's very well versed in the university tech tools to understands what the offerings look like, what the pricing looks like, what the trade offs are, we'll make decisions about what Netex stack looks like and those stacks proliferate with a lot of point solutions or historically have proliferated with a point a lot of point solutions.

I think I saw specific recently that if you kind of normalize for. Budget. Budget that's available to a district and compare to companies. Districts have 10 times the number of SaaS tools that AC company, you know, in a non apples to Apple comparison would have at the same scale. So there, there are a lot of tools, a lot of legacy tools, a lot of very clunky integrations.

But what's happened over the past year and a half is that the decision making still rests with district administrators, but the role of teachers in making determinations about what rules to use in their classrooms has become much more prominent. Teachers have just, they've taken a lot of agency in making decisions what works and what doesn't for them, and then becoming much more vocal advocates with their administrations.

What would help them and what they would like their districts to purchase. So I think that's been a really radical shift. It's also had big implications for how products are built. PLG sort of product led growth is very much the main strategy. That AI companies more generally are taking where you can create a free account, you can use the tool on some relatively limited basis or not so limited basis depending on the offering itself.

And then eventually institutions might, might buy institution level type accounts or licenses, and EdTech is moving in that direction as well. In fact, I even struggle to call it EdTech because I think this new generation of AI tools in education is philosophically very different from how EdTech does, has existed in the past.

But they're much more accessible tools. They're tools that get to value a lot faster, and they're tools that can be adopted by individual users without the need for broad institutional buy-in users and teachers and teachers that their students can just start using these tools without very much setup and get to value extremely fast.

And, and I think that's a very big shift in how procurement and also selection and adopting tools. 

[01:22:30] Alex Sarlin: Yeah, it is really fascinating and there's a way in which like learning is an essential use case of ai. It's one of like the top three and education infers really systems of learning. And the question is like, how do we evolve learning both inside, outside the classroom and through different models.

I also think that this is a moment where you were talking about some of the original work was to support non-technical people to integrate technology into their course curriculum. Educators haven't been trained and how to think about their stack and think about their tech infrastructure or how they're deploying it.

As AI becomes somewhat ubiquitous in all core workflows, how are school and districts rethinking their technology infrastructure and do you see any common pitfalls that they should either avoid or that they should be aware of? 

[01:23:29] Andrea Pasinetti: I'll say challenge that.

Probably more pronounced than it's ever been. I often think about where Bristol are today with respect to their staffing, and I think it's a lot more, there are a lot more parallels to where things were in the early two thousands where just having enough teachers in a school was a challenge that stopped being as much of a problem in the late S and early 2010s.

It seems like that's where things are, are trending again, and that implies teachers are teaching a lot of subjects that they don't necessarily have a lot of prior exposure to. So our first really big customer was the, and continues to be the, the state of Tennessee. Tennessee was one of the first states in the US to pass state level legislation around a requirement for.

Computer science as a prerequisite for graduation. So computer science in Tennessee, one year of computer science in high school, and there's also a middle school half year sort of component to it. But as a baseline, one year of computer science in high school is a requirement for all Tennessee high school students to be able to graduate.

And the runway between when that law was passed and when it needed to be implemented was very tight. It was a matter of about a year, 18 months or so. And so we were brought on board because of this focus on supporting non-technical instructors. To help with that rollout. So we're, we're the only company that is procured by the state of Tennessee, the Department of Education, Tennessee, through a partner PSA, as an offering that is provided for free to all schools to support this effort.

And the way we went about it was by offering an AI teaching assistant for teachers who are going through a certification course and an AI tutor for students who are working on the year long course. Also an environment for all that learning to take place. And the seed with which that rollout happened is, is really attributed to the fact that that AI support was there.

And what we've seen, and what's really fascinating is teachers who have no background, not only in computer science, but in STEM subjects, so e teachers, librarians, English features, and of course. Teachers and subject areas that are more adjacent to computer science becoming very effective instructors at computer science.

Not just because they can all align the process of being a subject matter expert, but because the time they save by dint of having these, these assistants, they can spend guiding students on their learning journey. Journey. And we have a, a few writeups on this topic, on our, our website, but I think it, it really shows what's possible when AI is deployed thoughtfully and effectively with instructional outcomes and student outcomes, sort of front and center and design and and planet in terms of what we see happening with districts tech stacks.

I think if you were to talk to any more dominant AI player in the space, or companies that have gained meaningful traction, whether it's hero or school AI or, or magic stool, if you were to talk to them in private, I think they would all tell you. Same thing, and I think this is becoming more public in terms of their messaging as well.

But ultimately, the direction things are moving in is to introduce a new category of products, which is an AI native learning management system. Districts don't really wanna talk about replacing learning management systems, even though Canvas and Schoology and sort of the dominant items in space are, are very, very clunky.

They're 20 years old, they're incredibly expensive. They're all cobbled together through m and a. So the conversation is still laden and districts aren't really ready for that conversation yet. But the products that are emerging are, are really. Horizontal type products that are AI native. And so I think what you'll see happen is districts that have 10 times the number of SAS tools that their sort of corporate equivalents have will reduce that universe of tools radically, and you'll end up with a much more powerful and much more integrated AI tools in place of sort of the systems endpoint solutions that exist today.

[01:27:45] Alex Sarlin: It's clear that they can see it coming. You know, canvas just announced a huge partnership with OpenAI. It's hard to know whether peanut buttering AI on top of a legacy system versus AI native systems. We've seen the history of software, it doesn't bode well, but you know, there is a lot of competition to be that next gen ai, native LMS.

And one thing we also have to talk about is the definition of LMS is being more expansive. So the idea that it can not just be infrastructure for content delivery and like management, but actually it can be part of the dynamic assessment learning, personalization experience. There's a way in which it actually has deeper integration into the learning experience and pedagogy.

That's part of why we in school districts are hesitant because who you pick and how they enable or unlock your learning experiences will have a meaningful impact on both students and educators. As you're thinking about what bets to make, and obviously you guys have had a bit of a head start and you've got deep underlying technology expertise.

How are you making your bets on that? Where do you think there's leverage, defensibility impact, value, and where do you think it's commoditizing? Because there is, in the investor space, there's this sense of, well, fast following is just helping everybody kind of stay a pace. Where does that kind of, whether it's from an impact standpoint or from a business standpoint, where does the leverage come from?

[01:29:26] Andrea Pasinetti: So what I, I think is happening with AI needed tools is that they're generating value quickly while also building, let's call it an AI needed learning management platform or an AI native platform that's more horizontal in nature, in parallel, because whether it's a sales cycle, an adoption cycle, sort of a mindset shift cycle for horizontal platform adoption, it's a long cycle.

And so you, you have to be building very fast value. To be, to continue to exist and thrive while also doing the horizontal work. And eventually that transition will become more organic. What I think is gonna happen, and what I see happening with RA is schools and districts use eera as a, a mirror of their lms.

So we've built as an LMS smear, which doesn't have to replace an underlying illness, so a district can grade and content. Sort of host an AI tutor and deliver content to students and modify contents and do everything that an LMS can or should be able to do, but do it in a way that's AI native and a lot more intuitive.

All that information is shared with the underlying LMS, whether it's a Schoology or a Canvas, but a user technically doesn't really have to interact with that tool, and I think that's progressively where everyone is going to evolve. You kind of have to replace what an LMS does again by din of, of a lot of m and a chunk by chunk and do it in a way that's really integrated and.

Fundamentally and substantially a better experience than it's with the lms. I think the whole, to your point, the whole concept of of an LMS is probably already obsolete like an lms. LMSs exists as integration interfaces at their core. It's a way to bring in all the tools that tools use, but they also have to use all those tools because LMSs are so not useful.

And as tools start to pair back the number of tools they use and they become, all that functionality is saturated in AI tools. I think the, the reason for LMS is to exist is slowly going to go away. And I think it's a matter of 18 months, maybe two years, it's gonna be a progression. I don't think it's gonna be a huge amount of time, especially because all the big LMS providers, like them as in school, are now private equity owned.

So they're all massively laid in with debt and they're trying to push sales very aggressively. They're getting more expensive. So everything is pointing at that direction, and I think it's gonna happen fairly quickly. I think the harder thing, but probably the even more valuable piece is going to be where content and curriculum comes from.

Like where high quality instructional materials come from. And if you really think about it, the thing that AI is excellent at is producing content, right? High quality instructional 

[01:32:23] Alex Sarlin: materials or derivative content. I think there's a strong case. Right now that really baseline high quality content rather than it being a GPT wrapper where you wrap around the GPT, the content wrapped with AI creates an engagement layer that offers more effective personalization and engagement for 

[01:32:46] Andrea Pasinetti: kids.

I think there's definitely that. I think this is where being a prompt wrapper or the habituation to prompt wrappers is a challenge because a prompt wrapper fundamentally is only able to stream back what an LLM is already trained on, so it'll stream back what an LLM already knows. It won't actually produce new content from original or from primary sources, so it won't be able to.

A historical, well, it can because it's probably trained on it, but I think the real opportunity is in what sources are being used and then what content is being built around those sources in a way that's really agent like, how are we using ai, not just to screen back with the LM already knows, but to generate new content and to build instructional materials.

Following the same principles and following the same steps and the same flows that are used to generate hq. I like high quality and structural materials that touches everything from images to scaffolding, which is impossible to do with a, a traditional LLM today. Like you really need a flow to be agenda for scaffolding to, to work effectively.

Examples, work examples, et cetera. So there's the experience of learning through an AI tutor or guided examples that can come with ai, which I think is definitely a big enhancement. But I think the way that contents and curricular content and pipeline instructional materials are generated is also going to evolve with more sophisticated approaches and implement regions of ai.

And when you have both a curricular engine as well as a vehicle to house that curriculum in, you end up with a really interesting product category, which we haven't really seen before. And, and looks nothing. A traditional lms, but it will be a process of adoption. And I think it starts with kind of more granular steps.

Lesson plans were the, the first beachhead, and we're gonna quickly move into full lessons. Actually, here's launching our gentech lesson generation tool next week, and we build lessons that we are benchmarking against and, and doing testing against high quality instructional materials, having experts look and, and make determinations about the extent to which our AI generated high quality instructional materials match the quality of sort of legacy high quality instructional materials.

I think that's gonna be a really interesting and powerful shift where a teacher can effectively become their own publisher. A school can become their own publisher. You see this with universities already, but the stronghold of the textbook industrial complex is strong and it'll take time for that inertia to to seed to new players.

But I think that's a shift that's underway in a really powerful setting way. 

[01:35:33] Alex Sarlin: The overall theme that I'm taking away is really that we are in a profound shift, and it's not like we've reached the destination yet. And so then the last question I would have for you is just, what's the advice you would give for people who are trying to lead systems in this uncertain time?

Given the preponderance of tools that are out there, the legacy, the new, the inundation, with so many products per staff member relative to other industries, you know, if you had a few pieces of advice you'd give, what would that be? 

[01:36:08] Andrea Pasinetti: The first piece of advice is just be curious, as overwhelming as it can be.

It's also a very exciting time for administrators and teachers in the sense that there are really useful pools out there that do more than add marginal value, or at least hold a promise of doing more than having marginal value, and I think can have a varied foundational impact on their effectiveness and their quality of life.

I think the reality is AI has made a lot of roles more efficient. It's, it's increased productivity for a lot of different professions. I think it's starting to do the same for education, but I think we're very much in the early stages of that. So the first piece is just be curious and experiment with things, and if things feel good and they seem like they're doing something useful and productive, then do more of it or explore it further.

I think the second is, and this. Maybe creates more of an onus than it lightens a load. But I think the imperative to be informed is a big one. I think AI and literacy, and by that I mean really understanding how AI tools work, why they work that way, what the risks are, what the risks are. As many times risks are way overblown and in some cases the real risks aren't really talked about.

But understanding those risks and calibrating appropriately I think is really critical and, and that requires. A deep level of understanding. I think district ed tech administrators will increasingly become district AI administrators. I think traditional ed tech tools are gonna be legacy systems that need to be maintained more so than like the future direction of where ed tech is going.

So I think that role is going to fundamentally change and, and the competencies and things that folks in those roles need to know are, are gonna shift. And then the third thing is really keeping an eye and staying focused on efficacy. Ultimately saving time is great, but the real promise of AI in education is helping teachers see more effective instructors or stewards of learning.

I think the role of teachers is gonna evolve and it's already evolving. As, as I, as I mentioned, I think teachers are gonna become learning architects. First, and then subject matter experts. Second, there'll be, you know, important stewards for their students and, and shepherd their students to the learning process.

But a lot of help with subject matter expertise will come from, from AI pools, and also just making sure that students ultimately are, are learning more and facet. And that sets a very big challenge because original AI tools weren't really designed to support learning. As you probably know, if an AI just released its six learning mode.

But the challenge of introducing appropriate amounts of friction into an experience that's. To be frictionless in getting to answers is a difficult design and technical question. And so making sure that students are learning more and retaining the things that they learn and doing. So more joyfully, I think is probably the, the last few, I'd say those three areas.

I think being curious, being informed, and then keeping an eye on the efficacy of, of what these tools are being used for and not just the efficiencies and, and gains and efficiencies that they 

[01:39:23] Alex Sarlin: bring. Wonderful. I I think that's great advice. And Andrea, if people wanna learn more about Kira or your company or reach out to you, what's the best way?

[01:39:33] Andrea Pasinetti: Our website is www do rac a ira learning do com. They can also just email me at andrea a NDR, ati, PAS IE TT i@herolearning.com. I always welcome direct email and, and any insights, feedback questions. Uh, we can get, our product is PLG based, so teachers can set up accounts and use Kira as they wish. So we try to make as much of the product available to teachers for free.

We don't do a lot of scent marketing. We put all of our funding behind r and b and making the product available for, for free for teachers. So we welcome that exploration feedback and any engagement folks are gonna give. 

[01:40:20] Alex Sarlin: Wonderful. Well, thank you so much. Andrea Pasinetti is the co-founder and CEO of Kira.

Excited to have you back on the pod in a couple of months to hear how it's going. Thanks so much for joining us today. Thanks so much, Ben. It was great talking to you. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community.

For those who want even more, EdTech Insider, subscribe to the Free Ed Tech Insiders Newsletter on substack.

People on this episode