
Edtech Insiders
Week in Edtech 9/3/25: Google’s AI Surge Hits Duolingo, Parents Push Back on Student Data in AI, Early Childhood Enrollment Struggles, Oak National Academy API, University of Phoenix IPO, and More! Feat. Karim Meghji of Code.org
Join hosts Alex Sarlin and Ben Kornell as they break down the latest shifts in education technology, from Google’s aggressive AI push to early childhood challenges and new federal initiatives.
✨ Episode Highlights:
[00:04:48] Google’s AI surge disrupts Duolingo with gamified language learning in Google Translate
[00:12:14] Google’s edge in practical AI tools versus the AGI race
[00:17:28] Competitive landscape across OpenAI, Google, Anthropic, and predictions for Chinese challengers
[00:22:14] Presidential AI Challenge invites students to showcase projects nationwide
[00:24:01] 70% of parents oppose student data going into AI tools, raising regulatory concerns
[00:33:29] AI shifting from “what it is” to “how it enables” daily tasks and learning
[00:36:53] Uptake struggles in early childhood education despite universal pre-K expansion
[00:38:52] Oak National Academy opens curriculum API and University of Phoenix prepares for IPO
[00:40:53] Michael Horn highlights optimism for innovation inside and outside school systems
Plus, special guest:
[00:41:15] Karim Meghji, Chief Product Officer of Code.org on the Hour of AI, AI-powered teacher tools, and CS education for all students
😎 Stay updated with Edtech Insiders!
- Follow our podcast
- Sign up for the Edtech Insiders newsletter.
- Follow Edtech Insiders on LinkedIn!
🎉 Presenting Sponsor/s:
Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.
Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.
As a tech-first company, Tuck Advisors has developed a suite of proprietary tools to serve its clients better. Tuck was the first firm in the world to launch a custom GPT around M&A.
If you haven’t already, try our proprietary M&A Analyzer, which assesses fit between your company and a specific buyer.
To explore this free tool and the rest of our technology, visit tuckadvisors.com.
[00:00:00] Alex Sarlin: The Duolingo thing, Google Translate becoming a teaching tool that threatens Duolingo, that is one paradigm, but I think that's very much in passing. Google is in experimentation mode. What I think is not in passing is that Google has made incredible slide generators. It has made incredible image generators, it has made incredible music generators.
It has made amazing video generators. I think Google and a small number of other companies are going to be, like you said, the content creation machines in the AI world.
[00:00:29] Ben Kornell: The other thing I'm seeing from a regulatory standpoint is this shift to ESAs and decentralized funding systems, which may also mean that parents will have more say in what kind of AI education their kids get, what kind of exposure they get.
Do I want to pay for a school with my ESA dollars that is, you know, an AI tutor and then some project-based learning, à la Alpha School? Or do I want it to be no computers and all books and human beings?
[00:01:05] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter and our event calendar, and to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben.
Hope you enjoy today's pod.
[00:01:45] Ben Kornell: Hello, EdTech insider listeners. It's Ben and Alex back again for another week in EdTech. Let's jump in. Alex, what do we have on the pod? What's going on in EdTech Insider Land?
[00:01:56] Alex Sarlin: We have some really interesting work. We talked to Handshake, which is doing really interesting things with AI and sort of workforce preparation.
We were talking to Caleb Hicks from SchoolAI. That's a conversation we've been meaning to have for quite a long time. We talked to Jamie Candee from Edmentum and Sari Factor from Imagine Learning. Really big and exciting guests. Coming up this week, we are talking to Chong-Hao Fu from Leading Educators.
That was an incredibly interesting conversation, and we have all sorts of really interesting stuff coming up in the next few weeks. We're also talking to a few little tiny companies you may have heard of: Microsoft and Google. We talked to the PM for Guided Learning at Google; that conversation is coming out in just a couple of weeks. As for Microsoft, you may have read they put $4 billion into AI education, and we are talking to the head of that program about what they're planning to do and how they're moving.
We saw Microsoft launch its own AI model this week as well. All sorts of amazing things. And this week we're talking to Karim Meghji, who's the Chief Product Officer at Code.org, about their Hour of AI initiative. So lots and lots of interesting folks. Stay tuned for all of that here. Ben, what's going on on the event front?
[00:03:08] Ben Kornell: On the event front, we're having our Back to School Bash happy hour at our favorite place, Salesforce Park. That's on Tuesday, September 16th, from 4:00 to 6:30 PM. I will say, given the parade of awesome that we have on the podcast, it's almost like, if you don't have time to fly to a conference, you could just create a playlist of our EdTech Insiders interviews. What a great lineup this would be.
I don't know if this hits you the same way, Alex, but it still blows my mind how many people are willing to talk to us about the amazing things they're doing. I mean, it's remarkable, just that list of guests in one segment. That is an all-star lineup.
[00:03:52] Alex Sarlin: Yeah, it's pretty amazing. We also talked to Andrew Grauer from Learneo.
That was a great conversation. That's coming out in a couple weeks. I mean, it is an all-star lineup, but I think it speaks to where we're at as an edtech community. Education technology is really growing, and it's really treating this AI moment as a coming-out party. Everybody has plans, everybody has projects, everybody has new product launches, and I think we're increasingly becoming one of the go-to places to share some of these launches and plans with the world.
So that has been an honor. It's been so interesting to talk to so many different folks, from small startups to giant big tech companies. I think we're all honestly trying to figure it out together. You know, Ben, we talked to Jean-Claude Brizard of Digital Promise. That was an amazing conversation.
And it's like people from every part of the ecosystem are trying to make sense of the moment. And I think these conversations are helping. It's been really an honor. So let's talk about what's happening in the news this week. Where do you wanna start?
[00:04:48] Ben Kornell: I tend to start with the biggest questions and discussions, and the biggest edtech company in the world, which, you know, I've said many a time is Google. In the past, my talking point was that Google is the largest edtech company in the world and edtech is a rounding error for them. No longer a rounding error, my friends. We're starting to see an incredible surge from Google; this really represents a second creative wave. And so we saw Duolingo shares tumbling as Google launched new learning features in Google Translate to power live translation and language learning.
We're seeing a lot of new products and developments with Gemini at the core, and then the Nano Banana image editing, which has taken Reddit by storm; everyone's using Nano Banana. This idea of creative tools being part of the Google suite has really moved it into a different element of the learning experience, from infrastructure enablement to actual creative tooling.
And what I would just say, and I'm really curious on your perspective on this: it feels like the AI race and the threat to search woke a giant. They used to be so afraid of shipping. They used to be so careful, and you often thought, who's the lead product manager on this product? It's corporate legal at Google.
Now, today, they are just driving incredible innovation, and they're using their distribution advantage of AI everywhere, in everything. So I feel like there was a wash of information about OpenAI and Google, and Anthropic just raised, but to me the leading story is really around Google's multi-industry takeover, and learning and edtech has to be one of those industries. And by the way, edtech listeners, if Duolingo stock tanks, it's gonna affect all of us, because that's how you pencil out valuations: the most optimistic edtech valuations rely on comps like Duolingo. And if that valuation gets crushed, that's gonna hurt the investment case for any edtech product or tool. So I don't know. With Google, it's hard to call them ascendant given that they were dominant before, but what do you think about this resurgence of innovation?
[00:07:27] Alex Sarlin: I think you're making great points. I saw a really funny and interesting sort of tweet, or whatever we call these things now.
A little comment, probably a LinkedIn comment, about how when Google Maps launched, they launched without Europe. Literally, there was a screenshot of Google Maps that showed Iceland and then a big ocean of nothing. And the point of the post was: hey, Google used to launch things before they were ready.
They were moving fast and breaking things, so to speak, trying to get things out into the world. And they have not done that for quite a long time. And I think your point is right: Google is now back in sort of startup mode. They're starting to think, hey, if we are in competition with these other big companies for eyeballs and market share, what should we do?
And instead of just being like, well, let's think about this for a really long time and get all the product marketing people and the corporate strategy people and the legal people in a room, all the things you'd think you'd do if you're a giant company, they're moving. And your point about Duolingo this week is interesting, because of Google Translate; we mentioned this last week.
Google Translate has put out some features that are sort of Duolingo-like. They're trying to do language learning within Google Translate; they have some gamification features. It's sort of meant to be like, hey, this is a place where people are trying to figure out how to go from one language to another.
Why not make it a learning experience? That's the kind of thing where, in the past, that would be one product manager's sort of wild idea: wouldn't it be cool if Google Translate could teach you languages? And they'd be like, yeah, calm down. This is our core audience. Everything has to go through everybody.
Now they're just saying, let's do it. Why not? AI lets us move fast. We wanna showcase our Gemini model everywhere. So they're doing it. And like you're saying, Ben, when a big ship turns, the wake of that move upends everybody. And Duolingo has been the prized edtech public company with the highest stock price, pretty consistently, for quite a long time.
You see its shares tumble because people are saying, well, Google's moving into language learning; they're moving into gamified language learning. I actually don't think Duolingo should be scared by a move like this. I might be proven wrong. But to your other point, even with Google trying all these new things, they're just trying things.
This is not Google's new business model of trying to eat Duolingo's lunch. They're just like, hey, let's try a new way to use AI. Let's use it here, let's use it there, let's use it everywhere. And this is one of the places they're using it. Nano Banana is pretty different, because I think Nano Banana, along with Veo and Flow and all the things they've been launching for creative tooling over the last six months, actually does represent a strategic direction for Google. I think the Translate education features are awesome; that's really cool, but that still is the rounding-error model, to use your metaphor. I think Nano Banana, the things they're doing in video, the things they're doing for podcasting with NotebookLM, and all the different multimedia outputs are increasingly becoming a core strategy for Google as they compete with OpenAI's ubiquitous chatbots and with Anthropic's ethics-focused model and its push in coding.
I think they're saying, hey, there seems to be a really interesting lane here. We already own YouTube. We already own Google Images. We already own Maps. We already are a place where people use us for multimedia. We're gonna double down here and become the AI multimedia company. And I wouldn't be surprised to see something like Google buying Midjourney, or really doubling down by pulling together some of the other core creative tools.
What do you think of that?
[00:10:52] Ben Kornell: Yeah, I mean, I love that vision for them. I'm thinking back to their original mission, which was to organize the world's information. What would that statement be now? Basically, becoming the global platform for all content. One step up from information is content. I think one thing that is interesting here is that OpenAI has pulled them into spaces where maybe Google didn't wanna go, but now that they're there, they've made a conscious decision to go win in those spaces.
And I love that, rather than trying to get to parity, they've been surpassing OpenAI. I mean, I think the Gemini image generator is better than DALL·E, and DALL·E had a huge lead to start. One really important question is: are we on a road to AGI, or are we on a road to AI tools being really practically valuable?
If we're on a road to AI tools being really practically valuable, Google is really winning that race. I can't speculate on AGI or not; we spent plenty of time last episode speculating on that. There was a good Gary Marcus article in the New York Times this week saying, I was right, we're not gonna get AGI. Yeah, so...
[00:12:14] Alex Sarlin: That's his evergreen; he's written that article many times.
[00:12:17] Ben Kornell: But I do think an interesting outcome of OpenAI's quest to just push product out there is that they've actually pulled Google into a huge productivity space around AI, and that could be a really winning space for them. So for our edtech audience, I think this continues to raise the specter of stepping into learning and AI learning in a really meaningful way, and figuring out how you are going to differentiate what you are doing versus what Google's offering for free or at very low cost. Your competition is probably: the customer does nothing, the customer does it with Google or ChatGPT, or the customer does it with you or with one of your competitors.
And that's a really tough set of circumstances to win in, so you really have to hone your competitive advantage. It almost makes me think of Vlad of Diffit, who's really just said, we're gonna narrow in on differentiation in instructional materials. We are gonna be so deep in instructional materials.
No one's gonna beat us in that one area. Given all these other products that have basically 50 use cases, that might be the winning strategy to grow and scale an edtech company.
[00:13:47] Alex Sarlin: Yes. And I would add that it can sometimes feel like Google, or any of these big companies, is a competitor to edtech companies when they release new feature suites or product features that compete with those companies' core offerings.
Like we saw ChatGPT for Education launch a flashcard and quiz generator, which directly competes with the edtech companies that are trying to do flashcards as one of their core value propositions. So what I'd say is that it can cut both ways. You can either go really specialized and say, we wanna own a lane, we wanna be really specific about it.
I think special education is a great place for that; we've seen some successes there. I think we've seen the tool suites do some really interesting work where they build something that's very specifically school-focused. But I think there's another option here. If you look at what Google is doing: again, the Duolingo thing, Google Translate becoming a teaching tool that threatens Duolingo. That is one paradigm, but I think that's very much in passing. Google is in experimentation mode. What I think is not in passing is that Google has made incredible slide generators. It has made incredible image generators. It has made incredible music generators. It has made amazing video generators.
I think Google and a small number of other companies are going to be, like you said, the content creation machines in the AI world, and that actually becomes functionality that we can all use, right? If you are an edtech company right now: Nano Banana, one of the reasons it's taking over Reddit, to your point, Ben, is that it allows you to inject yourself into AI video and AI images really, really easily, and it allows you to combine images really easily. A teacher can take themselves, put themselves in an astronaut outfit, put themselves on Mercury and then on Venus, and have different images and different videos about that.
That is so cool and so handy as an educator or as an edtech company making content. One way to look at how smaller edtech companies can relate to these big tech companies: the same way we're all inheriting the API functionality downstream, I think we're gonna start using these as content creation engines that can support our end goals.
They just have to add more value. Whatever we're doing has to add more value than somebody just accessing Nano Banana or Gemini itself and making images, right? If a teacher can go to Gemini and say, I can put myself on Mars, I can put myself inside a mitochondrion right now, then that can't be your core business model, because they can do that automatically.
But taking those outputs and putting them into a system that's actually gonna deliver to students, deliver to teachers, that's actually gonna change learning outcomes. It's actually gonna change test scores. It's gonna actually affect people's lives, help people get jobs, help people learn to read. That's the role of the ed tech world, and I still don't think the Google or Open AI or Anthropic is truly going to eat that lunch.
I don't think that's really where they're headed. I may be wrong; I've been wrong before, many times. We also saw Alibaba; you mentioned Anthropic raised a big funding round at a $183 billion valuation. We saw Alibaba's Qwen model, and they announced surging revenue that drove a big stock rally in the Chinese stock market. We saw Microsoft start testing a new model.
We're seeing the big tech companies all really hungry and really competitive for this space, Apple and Meta included. But I don't think it means that the new edtech company is going to be one of these companies. I still think the edtech sector is going to be incredibly useful and incredibly valuable in taking all of this incredible power that we all have and making it work for schools, for educational outcomes, for colleges, for adult education.
There's so much space.
[00:17:28] Ben Kornell: Yeah. And also, when you're that large, the prizes you have to go after are so big that many of the prizes that are meaningful in edtech would be considered niche by those larger companies. And yet the launching point for any edtech tool, AI or otherwise, is light years ahead of where we would've been just four or five years ago.
So that's what I think is getting very exciting in the seed-strapping space, to borrow Alberto's term from Transcend. I think there is this really exciting moment around zero-to-one startups. In terms of other headlines, one company that we haven't been talking about a lot is Meta; they continue to invest a lot in AI talent. And Anthropic just did that new funding round at the $183 billion valuation that you mentioned.
As you're thinking about this kind of Google move, who do you think is going to be most competitive with it? Is it still an OpenAI versus Google game, or do you see one of these other players competing with them for top dog?
[00:18:42] Alex Sarlin: Yeah, I mean, I am always really impressed with Anthropic. From everything I've seen and read, it feels like both Meta and Apple are still sort of pulling together their strategy.
Even with Meta acquiring Scale AI, there were a lot of headlines this week about friction already happening, changes already happening. And we saw Microsoft launch a new AI model today, but even though they're an enormous, absolutely massive company, they're still coming from a bit of a standing start here, because they spent so much time partnering with OpenAI.
They haven't done their own stuff yet; they're just launching their first in-house model. So I'd say not yet. I am not seeing Microsoft or Meta or Apple break through and get mentioned in the same conversation as OpenAI and Google, and to a slightly lesser extent Anthropic. I think Anthropic's moves in the coding space have really put them on the map in a different way.
Their whole differentiator was being ethical and honest. That's great, but I don't think that's enough to become a massive business by itself if you don't have other differentiators. I think what they've done with coding has kept them really in that conversation. And I don't yet see other contenders. I'm interested to see companies like Alibaba, though as of right now, I would not be using an Alibaba model as an American citizen.
But these companies are very savvy, and they've done some really good jobs in the past of making versions of what they're doing that work internationally, work in Europe, work in Africa, work in the US. So I would actually be looking to Chinese edtech as the ones who, if they can get the strategy right, are just gonna sort of break through.
We don't talk about them that much because we don't use them very much, but I'm gonna put my flag in the ground here. I think it's more likely that we're gonna see a Chinese company, an Alibaba or a Tencent or a ByteDance, doing some crazy AI thing in the next six months, certainly more than a Meta or an Apple.
[00:20:31] Ben Kornell: Ooh, I like it. It's like our predictions episode coming early. Yeah, I think there's a lot of validity in what you're saying, both because the market dynamics in China are meaningfully different, around what people will pay for, but also around the data and information that can be leveraged for AI.
And I think a second reason is really around vertical integration and focus. Given how fragmented the US education systems and learning experiences are, it's just really hard to imagine scaled adoption of any one solution for the foreseeable future. One of the things that I'm thinking about is: is American hegemony in AI, on the technical side, going to translate to learning?
And I think it would be fair to say that US-based companies are in the lead globally around AI, but I do worry that so much of that is predicated on an investment engine that is bound to peter out. If you can't demonstrate the return on investment, I think you get to a really, really hard place.
Whereas many of these government-backed or government-heavy AI initiatives will have the juice to run the marathon. That is something I'm thinking about a lot, and clearly China would be a bet on the government-integrated approach. Speaking of government: do you want to talk about our Secretary of Education and AI education? You wanna talk about early childhood? What else is top of mind?
You wanna talk about early childhood. What else is on top of your mind?
[00:22:14] Alex Sarlin: This is difficult to talk about, but the thing that stands out for me about the government right now is that you have this split where states are trying to figure out their AI policy, but the federal government has just been absolutely, you know, unequivocal in being super supportive of AI.
They're trying to do everything they can to support it. They didn't get the regulatory regime through that they wanted, but they're moving fast. And just over the last week, we saw the Presidential AI Challenge really start to happen, launched by the White House, basically inviting every student, as well as educators, to start showcasing AI.
That is, students doing really interesting AI projects. And this is bittersweet to me, because I think this is a very smart move, and I think it's a good thing that they're doing it, but I also feel like it's coming from some very strange motivations: the super pro-business motivation, the super anti-regulation motivation.
I think it overlaps with the federal government's support of cryptocurrencies. So it's happening for, I think, the wrong reasons, but I like that this Presidential AI Challenge is going to do something we've been talking about forever on the podcast: start creating a structure by which students, even young students (there are elementary school students in here), are able to create really amazing AI projects, showcase them, get visibility for them, get funding for them.
I think that's terrific, but I think it's coming from a strange motivation. So I don't know, what do you make of that?
[00:23:33] Ben Kornell: There's a complexity here that we haven't had before. Yeah. I'm just not sure how it plays out. Actually, I'm curious: do you think that, fundamentally, behavior in the space is going to change permanently?
Or is this just a temporary thing given the unique dynamic between our regulatory regime and what's going on on the ground in schools and learning?
[00:24:01] Alex Sarlin: I mean, I hate to say this, it's sort of a negative perspective, but I think we've had this honeymoon period for AI for a while now, where people are excited about it, they're talking about it.
The coverage has been mostly neutral. I think there's been some negative coverage, but mostly neutral, and I think people are saying, this stuff is obviously on everybody's mind. Everybody's thinking about it and that initial reaction among schools where they said, let's ban this, let's worry about it.
It's gonna cause cheating, it's gonna cause issues. They said, let's put that aside because it's such a hot topic, it's just so meaningful, we know. And I think we may be in the third quarter of that honeymoon period, because you're starting to see headlines about the ChatGPT suicide, and the family of the teenager who died by suicide after talking to ChatGPT is calling for more and more regulation.
There was a poll that came out this week about policies in schools being confusing, and about parents being worried about people putting their students' information into AI engines. You're just starting to see a pushback, and I get very worried. The number here is that 70% of parents do not want their students' grades and other personal information put into AI software programs.
And we know in edtech that when Instructure launches a partnership with OpenAI, that's exactly what they're doing: they're putting students' grades and personal information into AI software programs, and they're trying to do it for, I think, the right reasons. But if 70% of parents are already saying, this scares me...
I think we may be getting to the end of the sort of grace period here. So I think there will be a moment, and it might not be that long from now (we're in the new school year now), where the people who do want more regulation, who are more worried about privacy, who are worried about mental health... they've been saying this for a while, but people have been responding, yes, but let's see. Let's see; this is important, we wanna teach it.
Let's see. This is important. We wanna teach it. I think there may be a moment where their voices start being heard more and more and more and, and the regulation starts picking up. I think unless we, as a community, I say this almost every week now, but unless we have more positive stories, we have more success stories, we have more ability to really explain very, very clearly, not hand wavy.
Someday, this will work really clearly. Why AI in schools is going to be hugely beneficial to students and educators alike. If we don't have that story ready to go, I think the regulations and the pushback is gonna start to win. That's my take.
[00:26:19] Ben Kornell: Yeah, and I think there's a sense of a black-and-white, all-the-way-to-the-extreme dynamic here, and that's probably accurate, Alex. We're in a pendulum society where nuance is quite often lost.
My hope would be that we get to a sweet spot where we could have both the great stories and some regulatory guidelines and protections, so that schools, educators, and parents could act in the best interest of kids. I think under-18s are the most vulnerable group, but also on ethics, like representing your own work, we could actually come up with societal standards.
But I do agree with you that there's a danger the pendulum is gonna swing all the way the other way. The case studies of deepfakes and, you know, criminal activity are on the rise too. My thought is also, let's have hope in the next generation. Every time the technology has looked really, really scary...
It's our young people, who grow up with it natively, who can either intentionally or through osmosis develop adaptation skills to survive and eventually thrive with the new tech. From TV to radio to the internet, there's always been fear around these things. I just think there is something particularly menacing about the adaptability, the addictiveness, the personalization, and the relatively low cost of AI as a means for humans doing bad things.
So it's a really tough one. And this administration has been very anti-regulatory, and they've been very anti Department of Education. So anything the Department of Education is doing around driving AI literacy is a deviation from their overall trend, and probably a net positive for kids and for our space.
I'm excited about that. The other thing I'm seeing from a regulatory standpoint is this shift to ESAs and decentralized funding systems, which may also mean that parents will have more say in what kind of AI education their kids get, what kind of exposure they get. Do I want to pay for a school with my ESA dollars that is, you know, an AI tutor and then some project-based learning, à la Alpha School? Or do I want it to be no computers, all books and human beings? Maybe this is where my question is coming from. I wonder whether this is a transitory phase or the permanent state of where things will be: this hodgepodge of learning styles, modalities, and approaches in an AI-for-learning world.
[00:29:19] Alex Sarlin: I agree with you that there will continue to be lots more educational options, including, you know, micro schooling and ESAs, people able to take their money and use it either, as you say, for Montessori schools or for Alpha schools. I agree with you that that is gonna stay for a while. But that said, I don't think anybody can exist in a world where everything is just constantly moving and changing, so it's hard not to imagine that this entire period is transitory. And one thing that I think is relevant to a lot of things we've talked about today, and tell me if this makes sense, Ben: you mentioned that Google is moving into this sort of practical use case. They have so many practical tools.
Tools that people use almost every day, between Gmail and Maps; we can just name so many. And they're starting to use AI in all of them. OpenAI basically created a new category, right? This idea of a really intelligent chatbot that does anything you want: it can code for you, it can do images for you, it can give you life advice, it can make your meal plans.
That's a new category. That's just a product that didn't exist before, and it's been incredibly exciting for the world. But it also is not very opinionated, right? I mean, OpenAI's is just like, you know, ChatGPT is just like, welcome, what can I do for you? That's what Claude does too. It's an interesting moment, because these tools offer a different kind of value prop for advanced users. For users who wanna sit and use it for many hours, who know what they wanna ask, who know what they wanna accomplish, these open tools are unbelievably powerful, and nobody's ever seen anything like them. For people who want to live their regular life and just make their emails work better, or make their PowerPoints work better, that's where Google is.
Or make their meal plans work better, or make their workouts work better, anything like that. If you look at the history of the internet, we have seen in the past that tools that are very, very open-ended don't always win. I mean, you could argue that Google web search is very open-ended, sort of, but we've seen so many super successful tech companies that really dedicate themselves to very specific things.
Think of eBay or Etsy or Airbnb, right? These are not do-anything programs. These are, I'm gonna do something very specific that has a real, clear value to you. And so between the OpenAI model of sort of, I do anything, isn't that amazing, I'm like an intelligent servant to you, versus AI in the service of many different specific tasks, I think over time we're gonna migrate to the latter. I think we may look back and say, oh, remember the moment when AI was just this chat window where you'd ask it to do whatever? Now AI is used in these very specific ways in all of our daily tasks. We know when we're getting coffee, we're gonna use that XYZ company that does coffee just right with AI, and the same when we do education.
We're gonna use this set of companies; when we do training, when we do hotel booking, we're gonna use this particular company. I think there really is room for AI-native companies in all sorts of different areas. So this is all to say, how this connects to what you were talking about is: I think we're in a moment where AI is still this sort of hand-wavy term. When a policymaker says, what do we do about AI, they mean everything; they don't know what they mean when they say that, right? And until we know what we mean, until we can say, what are we gonna do for AI-supported curriculum creation?
What are we gonna do for AI-supported tutoring? What are we gonna do for AI-supported assessment? Until then, you almost can't even have a meaningful conversation. AI is just sort of a word that carries a different meaning for everybody. And as that word becomes negative, right, as there are more deepfakes, as there are more scams and phishing attacks, as all sorts of things happen, that word AI gets loaded with all of those negative ramifications or negative connotations, and people are gonna think of AI as a scarier and scarier thing.
Just the way they came to think of email as a scarier thing. It used to be just email: oh, it's mail, but electronic, how funny. Now it's, oh, my email box is so full, I get all this spam, and all these companies email me about this. It has sort of taken on a negative connotation over time, and I think AI is heading that way right now.
So we need to really think about the different subsegments of it, and especially in education.
[00:33:29] Ben Kornell: Yeah, those are great points you're making, because we're living before the future state that you paint, and I think there's a lot to agree with on that future state. Of course, it's an open question whether I interact with the coffee AI bot, or whether my AI companion interacts with the coffee bot and distills it down. But I think that's what we're all excited about, and what you're distilling is that AI today is a "what." What is it? What does AI do? Ultimately, AI will just be a "how": it's how we do the things we wanna do. And as it moves from being the object of the conversation, where it has all the focus, to being more of an enabler for whatever use case, I think that takes some pressure off of what is it, what does it do?
What's good, what's bad, and so on. We have this all the time in ed tech. You and I often talk about how it's ed first and tech second; the ed comes first for a reason, because those are the use cases. When we get overly focused on an iPad for every child, we lose sight of what it's for.
Or this is more of a "how." So, other topics I wanted to talk about beyond AI and the expansion of Google into our markets: a little bit about early childhood. The Hechinger Report did a really great piece this week, which folks should read, around early childhood education in New Jersey, where there's $10,000 for every child under the age of five to do government-funded pre-K, and some great programs there, it sounds like.
And yet many people are not signing up. And in California, transitional kindergarten is now in full rollout, where three-year-olds have coverage. And yet uptake has not been as high as people would've expected, not just from high-income families but also low-income. So there is this question of what's driving that.
Is it an awareness issue, or is it a loss of trust in schools? Is it a concern about little kids being exposed to, you know, all the things in public schools, whether it's bigger kids or school politics and all of that? From my perspective, this is an area where we've made really big policy strides on an investment where all the research shows that a dollar spent on early childhood is worth three dollars spent anywhere else in the system, and yet we're struggling to get the uptake we're looking for.
To me, the takeaway here is that the bond between community and school has really fractured, and I think there's a communication, collaboration, and cultural challenge now that has really alienated a lot of families. This is a really big concern as we think about what kids need to be prepared for an AI future: if they're not starting at three or four years old, we are going to be lowering the ceiling on their potential.
[00:36:53] Alex Sarlin: Yeah, I think you just made some really amazing points. Just to go back to the beginning, the idea of AI going from the what to the how is a really great way to put it. AI as a subject is just so confusing; AI as a way to accomplish things you actually wanna accomplish is gonna be a really big change.
And then yes, the break between community and schooling, I think, is real, and we could dig up statistics around it, but I really consider this one of the biggest legacies of the post-COVID era. You always see these funny survey results where people say, I like my own kid's school, but I don't like the education system, I don't like schooling. My own kid's school? Yeah, that's pretty good. It's this weird schism where people have started to think of schooling and the school systems as nefarious, or not well run, or all these things, which is creating a lot of backlash. And I think that's what I would attribute this to: even though we know that universal pre-K is hugely beneficial to the families in it and to the society around it, it just feels like more school, and people are not looking for more school right now.
That's not the number one thing people are looking for. I think the bloom is sort of off the rose in terms of the feeling that schooling, public schooling, but even schooling in general, is an unequivocally good thing. I just think we've moved away from that, which is scary in a lot of ways. There were a couple of ed tech stories I just wanted to make sure we mentioned, even if we don't get into them deeply. We saw Oak National Academy, which is a really interesting organization, open up their curriculum API for free use by ed tech companies, and I think that's really worth noting.
Oak National Academy is a really thoughtful organization. They've been thinking about putting together AI curricula and curricula for schools in really thoughtful ways, and now this is something that any ed tech company can access. We're also seeing Phoenix Education heading towards an IPO; that's the company behind University of Phoenix, which has, I believe, been split off from Apollo. I don't know the exact details of the business here, but
[00:38:52] Ben Kornell: yeah, University of Phoenix almost feels like a relic of ages past. So the fact that it's still alive is a surprise to me, and then, of course, that it would be IPOing. What makes one think that a University of Phoenix reboot is gonna succeed in today's world, when its brand has been significantly damaged? I will say Michael Horn had a great post about his optimism around innovation in education, both outside the system and within the system, given that AI is allowing people to be more cost-effective and flexible. But I don't think rebooting University of Phoenix is what he had in mind.
Oh my God. Not that it's a full reboot, by the way; it sounds like they're operating programs. But it's very reminiscent of 1998-to-2005 ed tech innovation. It's
[00:39:51] Alex Sarlin: like the worst stuff in ed tech. It's the worst. I mean, I personally don't like it. They work hard, they try to do good things, but I know what it is.
If you're Apollo, if you're a private equity firm, you're like, oh, we have three more years of Trump, and they do not care at all about regulation. We have Workforce Pell right now. They will let anybody get money.
[00:40:16] Ben Kornell: Maybe. It's hard to know, because there are also competing interests: government players have interests in private universities that would compete against Apollo.
So we just don't really know. Here's probably what they're saying: it can't get any worse, so, oh God, you might as well go for it now. And given that capital markets are generally much better now for IPOs, if you're gonna do it, now is the time to do it. I know we're up on time. We have an interview coming up right after this.
Do you want to give an intro on that?
[00:40:53] Alex Sarlin: Yeah. We are talking to Karim Meghji. He is the Chief Product Officer of Code.org. They are doing really interesting work, including launching the Hour of AI, which is the successor to the Hour of Code. So without further ado, let's talk to our guest.
[00:41:07] Ben Kornell: and thank you all for joining.
If it happens in ed tech, you'll hear about it here on Week in EdTech on Edtech Insiders.
[00:41:15] Alex Sarlin: We've got a really special guest for this week's deep dive on Week in EdTech. We are here with Karim Meghji. He is the Chief Product Officer at Code.org, where he leads the organization's technical strategy and innovation to advance its mission of ensuring that every student has the opportunity to learn both computer science and artificial intelligence as part of their core education. Karim is a tech leader and entrepreneur, and he's held chief product officer and chief technology officer roles at Remitly, Booking.com, Boutique, and more, driving global growth and an IPO. He holds a bachelor's in computer science from San Jose State. Karim Meghji, welcome to Edtech Insiders.
[00:41:59] Karim Meghji: Thank you Alex.
Great to be here and look forward to our chat today.
[00:42:02] Alex Sarlin: Yeah, I am really looking forward to our chat today as well. Code.org has been doing incredibly important work for quite a while now to help education become more and more practical, more and more technical, and now you are moving very aggressively into the new realm of technology, AI. Tell us about the vision behind your new Hour of AI initiative and how it feeds into an overall strategy of incorporating AI into Code.org's mission.
[00:42:28] Karim Meghji: Yeah, so Alex, it's super important, as we enter this world with AI, to teach students fundamental knowledge related to computer science and AI.
And not just how to use it, but how to create with it, how to shape it, how to be part of the future with AI. That's the work Code.org is increasingly focusing on. The Hour of AI is really intended to get students started. They're already using AI today, but they're using it without understanding how it works or what it means to use it responsibly.
And so we're gonna take what we've done over the last decade with the Hour of Code. Code.org was the organization that brought the Hour of Code to the world; we're stewards of that activity every year with hundreds of partners. This year we're shifting to the Hour of AI, and it's the first step, where students and educators get the chance, during CS Ed Week here in the US, which is in December, to essentially start understanding and creating with AI. We'll create easy-to-use, hands-on activities that'll demystify AI and kind of imagine what's possible. And just like we did with the Hour of Code, with the Hour of AI we're engaging the community of partners to participate as well, and hopefully amplify, if you will, this moment where we can get students interested and curious about learning more about AI.
[00:43:42] Alex Sarlin: Yeah. One of the strategies that I think has been really effective from Code.org in general over the last decade, as you mentioned, is the idea of sort of shrinking the change, right? People have been talking about coding as a fourth competency, alongside reading, writing, and arithmetic, as one of the things to get into schools, but actually that's very daunting, right? The idea of a whole new subject, a whole new competency, a whole new area that teachers have to learn from scratch is really tricky. The idea of an Hour of Code as a sort of one step towards this has been really influential, and you've had huge success with it, being able to reach so many schools and students.
Tell us a little bit about how that kind of thinking maps to AI. AI is also daunting for students, teachers, and educators. Especially students: they're using it, but they don't necessarily understand it, or they don't necessarily want to learn it as a school subject yet. How are you using that same sort of first-step mentality to get AI into classrooms?
[00:44:41] Karim Meghji: Yeah. You hit on probably one of the most key points about creating any change in our education systems globally, which is: the school, the teachers, the environment have to be prepared for that change. If you don't prepare the world around the student for the change we want the students to see, it's gonna get messy. In some cases, the classroom or school or district may shy away from bringing these important skills into the classroom, or teachers may not have the confidence to teach, to your point. So a big, important step is to realize that teachers are also learning through these changes in the education system.
And we have to give them the tools and support. At Code.org, we have believed since our early days that professional learning, and supporting teachers with their own curriculum on how to teach the content, is critical. So all our curriculum is paired with professional learning, and we do it in various formats. We do in-person workshops where teachers can have communities of practice beyond just the workshop, you know, phone a friend when you're having a challenge with a situation. That's a supportive environment. For teachers who don't have the time, or maybe aren't near a physical location, we do virtual workshops, and we also provide asynchronous, self-paced curriculum.
So a lot of what you might not see about Code.org is all the work we do behind the scenes to make sure that educators are prepared. In addition to that sort of foundation, we incorporate extensive teaching materials into all our curriculum, everything from lesson plans to guides. A teacher can literally use our guides to walk through an entire lesson; we pace the lesson out, we've thought it through, so the teacher has less to worry about on the content side and can focus more on each individual child in the classroom and whether they're learning the material as they need to. And then the last exciting thing we've done with AI, actually, is we built an AI teaching assistant. It helps teachers build custom lesson plans based on our curriculum.
For teachers who are still learning computer science and AI, it can help them assess student projects, so they understand, okay, this student excelled in certain rubric categories but maybe met competency in others or is struggling with others. And ultimately it helps the teacher if they have questions about the curriculum. It's almost like a professional learning coach in the moment: oh, I had this question about this thing I remember from my workshop, I'm running into a situation in the classroom, how do I deal with that? These sorts of uses of AI to help support teachers are another example of how we think about making sure the environment is well suited to teach the curriculum and ultimately get the outcome with students.
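To make the rubric-assessment idea concrete, here is a minimal sketch in Python. The categories, score scale, and function names are all hypothetical illustrations, not Code.org's actual AI teaching assistant:

```python
# Hypothetical sketch of rubric-based project feedback (not Code.org's tool).
# Each rubric category gets a 0-4 score; labels mirror the
# "excelled / met competency / struggling" language above.

def label(score):
    # Map a 0-4 rubric score to a competency label (thresholds are made up).
    if score >= 4:
        return "excelled"
    if score >= 2:
        return "met competency"
    return "struggling"

def summarize(project_scores):
    # project_scores: {rubric_category: score} for one student project.
    return {category: label(score) for category, score in project_scores.items()}

report = summarize({"algorithm design": 4, "code style": 3, "testing": 1})
print(report)
# -> {'algorithm design': 'excelled', 'code style': 'met competency', 'testing': 'struggling'}
```

In a real tool, a language model would presumably produce the per-category judgments; the point here is only the shape of the output a teacher would see.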
[00:47:12] Alex Sarlin: That combination, you know, professional learning, just-in-time learning with an AI tutor, teaching guides, walkthroughs, videos, putting all those pieces together is really necessary when you're trying to make a change of this scope, when you're getting people to take on an entirely new subject. That was true for coding and it's true for AI. An entirely new subject, which is not in their own education, right? Nobody has ever been through AI in K-12 school before, so nobody learned it growing up; all of humanity is learning it at the same time. Right now, you need all the different pieces to come together to get educators comfortable and able to deliver really powerful lessons.
You've emphasized that coding and computer science are core to the mission of Code.org, but AI really builds on a lot of the computational principles, the computational thinking, problem solving, and algorithmic thinking, that underlie computer science. How do you put together all the amazing work you've done over the last decade on computational thinking and coding, with all the partners you've mentioned, with the AI work you're undertaking right now?
[00:48:15] Karim Meghji: Yeah. You know, it's interesting. I think we sometimes miscategorize computational thinking as a skill that's about computers. I don't see it that way, and I think there are many in our field who see computational thinking as basically a human skill. It's another way to talk about problem solving, right? The four aspects of computational thinking are, you know, problem decomposition, abstraction, pattern recognition, and algorithmic thinking. When we talk about algorithms, we're talking about sequencing, right? Coding is a great way to teach those things, but it's a means to an end for the person who may not become a computer scientist or software engineer.
We need to apply those problem-solving skills to everything from designing a garden, to building a car, to building software, to solving some of the world's biggest problems that we have yet to solve, or just a very localized problem a student may be seeing in their community. So our approach is that AI is built on computer science fundamentals, right? At the end of the day, it is mathematical, probabilistic software doing work with massive amounts of data. And so we wanna teach students how these systems work, because we think it's important to understand what is happening under the hood. And then when you layer computational thinking skills on top of that, we believe that unlocks and empowers the student to use AI models to solve those problems we just talked about.
In some cases they might write code to do it, but in some cases they can use AI directly to solve those problems. That pairing is what we think is the power of the two combined. And that's why we have a very strong focus on AI, without saying, oh, we're not gonna talk about computer science anymore.
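As a toy illustration of the four computational-thinking moves Karim lists (decomposition, abstraction, pattern recognition, algorithmic thinking) applied to an everyday problem, here's a sketch in Python. The scenario and names are our own, not drawn from Code.org's curriculum:

```python
# Toy example: computational thinking applied to "which plants need watering today?"

# Decomposition: break the chore into one small question per plant.
# Abstraction: represent each plant as (name, days_between_watering, days_since_watered).
plants = [
    ("basil", 2, 3),
    ("cactus", 14, 5),
    ("fern", 3, 3),
]

# Pattern recognition: every plant follows the same rule,
# so one predicate covers all cases.
def needs_water(days_between, days_since):
    return days_since >= days_between

# Algorithmic thinking: sequence the steps into a repeatable procedure.
def watering_list(plants):
    return [name for name, between, since in plants if needs_water(between, since)]

print(watering_list(plants))  # -> ['basil', 'fern']
```

Nothing here requires a computer scientist; the same four moves work whether the output is code, a garden plan, or a community project.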
[00:49:49] Alex Sarlin: Right, and that overlap between AI and computer science, or AI and coding, is something we're all dealing with as humanity. We've now seen big tech companies talk about huge amounts of their code being generated with support from AI; almost every professional engineer is using copilots in their daily work. So AI is changing what it means to be a professional coder. It's also helping us reveal, as you say, ways of thinking that may not only be in the service of coding, and helping us figure out how we have to operate our own brains to be able to learn in different ways, including computational thinking and other new ways of seeing the world, thinking, and solving problems.
Some people are saying AI is really changing computer science fundamentally, changing what it means to do computer science, and I'm curious how you see that at Code.org as you introduce this AI. How do you talk to students? How do you talk to parents or policymakers or educators about computer science education in the age of AI? How is it gonna change, and how are the pieces gonna come together?
[00:50:54] Karim Meghji: Yeah, I love this question, 'cause it's both a challenging question and a very inspiring one for us as a community to see, if you will, the future with one view or one vision. The way I think about it is, we've been teaching science to students for many hundreds of years. When I was in middle school, I dissected a frog. I'm not a biologist, I'm not a doctor, but the point of that education, the point of going through that experience, was to learn about the world around me, for us to learn about the world around us, to make sense of it in the context of whatever pursuit we end up on, right?
The world ahead is increasingly digital, it is digital first, and it's becoming even more digital. And so I would posit that teaching the science of how the digital world works is paramount. It's the equivalent of teaching biology or chemistry or physics a hundred years ago. That's a big part of our future.
We want students to be able to be critical consumers, responsible creators, to make sense of the world around them. CS education provides those foundational concepts to begin. It is the platform upon which our digital world is built. You and I are speaking today because the digital world exists, right?
Otherwise you would have to get on a plane, or I would, to come see you and sit face-to-face for this conversation. And I think that's really critical. And then when you layer AI on top of that, it is increasingly the world in which students will work. They will work with AI models, and we want them, again, not only to understand how to use those models, but how to create with them and how to apply them responsibly in the kinds of problem spaces they'll face in their lives.
The one thing I think is really interesting when we talk about this question is detangling coding from computer science. To all the educators listening today, I really wanna underscore this point: computer science is about so much more than coding. We treat computer science as equivalent to coding, but computer science is about how the internet works, how computer systems work, how you design computing systems in a human-centric way, and what the ethics and societal impacts are of using technology and algorithms for problem solving. How do you work with data? And now, of course, how does an AI model work, and how do you build with them? All of these are domains of computer science that we don't talk about enough. We talk a lot about the coding part because that is the end manifestation. And yes, of course coding is part of computer science, and in a world with AI it's gonna evolve. Students are gonna continue to use coding to solve problems.
You know, just like in the early days of computer science, programmers would use machine code to instruct computers; we've gone from that to, you know, Python, a much higher-level language for instructing computers how to do their work. So I see the future of the coding piece this way: we'll teach students how to write code, because it'll help them with computational thinking skills. We'll teach them how to use AI to generate code, and then we'll teach them how to evaluate it critically. Did it produce the right code? Is it correct? Does it solve the problem? In addition to that, let's teach students how to use AI models to solve problems directly. Maybe they don't need to reach for coding as a solution; they can go direct to the model. But both skills, I think, are tools in a student's toolbox that we want them to have for a digital and AI-centered world.
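The "generate, then evaluate critically" loop Karim describes can be sketched with a tiny test harness: a student writes test cases from first principles and runs the AI's answer against them. The function below is hand-written to stand in for an AI-generated answer (and deliberately carries a subtle bug); all names are hypothetical:

```python
# A student asked an AI for a function returning the median of a list.
# This stands in for the AI-generated answer; it is subtly wrong for
# even-length lists (it should average the two middle elements).
def ai_generated_median(values):
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

# Evaluating critically: write expected answers from first principles
# before trusting the generated code.
cases = [
    ([1, 3, 2], 2),        # odd length: the middle element
    ([1, 2, 3, 4], 2.5),   # even length: average of the two middle elements
]

for inputs, expected in cases:
    got = ai_generated_median(inputs)
    status = "ok" if got == expected else f"FAIL (got {got})"
    print(inputs, "->", status)
# The odd-length case passes; the even-length case prints FAIL (got 3),
# which is exactly the kind of check that catches plausible-looking AI output.
```

The habit being taught is not the harness itself but the stance: the model's output is a draft to be verified, not an answer to be trusted.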
[00:54:00] Alex Sarlin: Yeah, I love the way you're talking about it. Computer science is understanding the systems, understanding how the digital world works. And as you say, for young people especially, they spend a huge majority of their daily lives on digital devices, increasingly not during school time; schools are starting to ban some digital devices, but most of their life is spent on them right now. So understanding how the systems work, how the data is flowing, how the client is talking to the server, how everything fits together, is really understanding, in a very fundamental way, how the world around them works: when they pull up a YouTube video and it recommends another one, what's actually happening under the hood?
But that's different, to your point, than coding. Coding is the conversing, right? Being able to give instructions to a computer to do something specific. And we are very quickly moving even beyond high-level languages like Python. As you're saying, students can really build entire apps right now purely through AI tools. And I guess it's an open question whether that's what the future of coding looks like. But to your point, the ability to speak to a computer with code, the ability to speak to a computer system through an AI assistant, the ability to tell an AI tool what you want to happen, all of them require deep understanding of the systems behind them.
Absolutely.
[00:55:16] Karim Meghji: Yeah. I love the point you made about the evolution, if you will, of how we direct computers, right? From assembly code, to these sort of high-level languages, to now English, or the language of your choice. And the ability to understand across that spectrum, I think, is critical. And the other point, which you're sort of touching on, is that AI is not done with this change cycle we're in. Innovation is gonna continue for some amount of time; how long, we're all kind of just guessing. But I do think it's important for us to track the education pathway we're charting for students along with the evolution of this AI innovation cycle.
[00:55:54] Alex Sarlin: One of the superpowers that I think Code.org has had for years now, and continues to have, is that it's a great aggregator, a great builder of partnerships and communities. You mentioned some of the many, many partners involved in the Hour of Code and in all the work you've done so far, and I imagine that partnership strategy continues to be core to what you're doing. Tell us a little bit about how Code.org is thinking about building a community of partners for this Hour of AI initiative, and generally for your AI strategy.
[00:56:23] Karim Meghji: Yeah, I mean, in many ways we've done a lot of the community development work over the last decade with computer science. If you go back 12 years, actually, to the beginning of Code.org, the computer science community was, you might say, smaller, more modest. It hadn't gotten to the level of scale and engagement that we're at today. We've done that through some fantastic partnerships: with for-profit organizations who help fund this movement around technical education; with nonprofit partners who help us train teachers, to the earlier conversation we had; and with administrators in the school systems, to educate leaders about the importance of technical education around computer science and AI.
Other partners who themselves are doing curriculum, you know, coding tools, products for students to learn on, whether it's in school or out of school. So that network of partners is already strong within the computer science community. Our next step is to work with that community of partners to bring coherence around what it means to take a step from there to AI and AI education, right?
Bringing sort of a unified perspective and voice and approach to helping get students prepared for the future, get those teachers trained, get those school systems kind of ready, if you will, to insert AI education alongside computer science education. So those are some of the things that we're doing.
We've also got some of our own interesting products that I'll just touch on. We've launched a new product called AI Foundations. It's a high school curriculum that teaches a lot of the things we've talked about today, and our goal is to reach half a million high school students with this curriculum every year.
We think it's that foundational; it needs to be at that scale. We're also upgrading our middle school curriculum this year to include more generative AI content, and then we'll be offering an AI tutor later this year, literally for supporting students in learning these concepts. The way I like to think about it is: how can you lower the barrier for the student who feels shy to ask a question, or who isn't confident because maybe English isn't their first language, so they're holding back even though they have a really important question? How do we support that student through their learning? Ultimately, we're on the path to personalized learning with a tutor.
We'll take some small steps to start. So it's gonna be a journey over the next two to three years, but we're excited for the work we're doing, and we're excited to bring partnership together across the community and do some really great work with students and teachers.
[00:58:43] Alex Sarlin: and you're incredibly well positioned to do so.
I mean, Code.org is, I'd argue, the largest provider of K-12 AI and CS education curriculum across the globe. Over a hundred million students, and a big equalizer. I think you just mentioned, sort of in passing, that you reach a 48% female audience, a 45% Title I or free and reduced lunch audience. I mean, really, the legacy of Code.org has been taking something that was traditionally considered a luxury good, learning to code, and
making it accessible to a huge number of students from all different backgrounds. And I think we are absolutely in need of that moment right now for AI. So I'm so happy to hear that you're stepping up to the plate in such a big way with this, with new curriculum, with AI tutoring, with the Hour of AI.
And we are gonna follow that space very closely here at EdTech Insiders, 'cause it's really an incredibly important moment for, I think, all of us in education to think about what's next and how we avoid replicating the downsides of coding education in the past: it took too long to get into schools, and we had too many years of not enough female coders, not enough people able to access computer science.
And now AI is here and we have sort of a do-over chance, a chance to accelerate AI education, and it's incredible to hear what you're doing. I just wanna pass that back to you, as I'm sure that is a huge part of your mission. How do you think about the next few years for Code.org, and how are you gonna replicate or even enhance the success you've had with coding?
[01:00:09] Karim Meghji: Yeah. It goes back to, you know, the partners that we've talked about: bringing the community together and forming a, you know, clear, cohesive view of the path forward in a world that is ambiguous and changing, right? That's the complexity of doing that. So that's piece one. Piece two, I can't underscore this enough.
We are where we're at today because more than 3 million teachers, actually, probably 10-plus million teachers in the time that we've been doing this work, have engaged students in the classroom. And they do the most important work, which is understanding each child, making sure each child leaves that classroom with the learning that we hope they will take into their future lives.
So first, thank you to the teachers that have leveraged Code.org to bring our curriculum into the classroom. But teachers are obviously a hugely important part. So professional learning, education, training, and support of teachers is a critical step. And then the last part, I think, is really just maintaining a sense of adaptability and curiosity ourselves about what the future holds.
It's really important for us to see this moment as one where the opportunity exists, but to navigate it in a flexible way. And so I think that's something that we try and do every day at Code.org: really inspect with intellectual curiosity, think about innovation, and also think about how we can use some of these new systems, technologies, and tools to deliver the work that we deliver on.
So those are a few of the things that I think we'll think about over the next few years, but it's gonna be an exciting, once-in-a-lifetime journey, and we're excited to take it on.
[01:01:29] Alex Sarlin: Karim Meghji is the Chief Product Officer at Code.org. He leads the organization's technical strategy and its innovation to advance its mission of ensuring every student has the opportunity to learn both computer science and artificial intelligence as part of their core education.
Thanks so much for being here with us on EdTech Insiders.
[01:01:51] Karim Meghji: Thank you, Alex. It was wonderful to chat with you today.
[01:01:53] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders Newsletter on Substack.