Edtech Insiders

Year-End Special Part 2: 2026 AI & EdTech Predictions from Google, YouTube, Reach Capital, Owl Ventures & More!

Alex Sarlin Season 10


In Part 2 of our year-end reflections, Alex speaks with educators, founders, investors, and platform leaders about what 2026 may bring for AI and learning at scale.

🎙️ Featuring insights from:

😎 Stay updated with Edtech Insiders! 

Follow us on our podcast, newsletter & LinkedIn here.

🎉 Presenting Sponsor/s:

Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits.

This season of Edtech Insiders is brought to you by Cooley LLP. Cooley is the go-to law firm for education and edtech innovators, offering industry-informed counsel across the 'pre-K to gray' spectrum. With a multidisciplinary approach and a powerful edtech ecosystem, Cooley helps shape the future of education.

Innovation in preK to gray learning is powered by exceptional people. For over 15 years, EdTech companies of all sizes and stages have trusted HireEducation to find the talent that drives impact. When specific skills and experiences are mission-critical, HireEducation is a partner that delivers. Offering permanent, fractional, and executive recruitment, HireEducation knows the go-to-market talent you need. Learn more at HireEdu.com.

As a tech-first company, Tuck Advisors has developed a suite of proprietary tools to serve its clients better. Tuck was the first firm in the world to launch a custom GPT around M&A. If you haven’t already, try our proprietary M&A Analyzer, which assesses fit between your company and a specific buyer. To explore this free tool and the rest of our technology, visit tuckadvisors.com.

[00:00:00] Sunil Gunderia: You see the work of like Erin Mote and the EDSAFE AI Alliance, which launched an AI companion task force so that we can be a part of a solution around safety rather than just simply advocates of more of the same. And I think this work is really essential for ensuring that the benefits of AI are scalable, but not at the expense of student wellbeing.

[00:00:20] Amit Patel: And so this idea of, Hey, we actually need to know that we are going to get the same consistent response back, there is a level of trust that exists. And this is something that students can depend on. This is something that teachers can depend on, administrators can depend on, and parents can depend on.

And so this idea of starting to get more consistency and reliability and eliminate these variances that we're seeing is going to become even more critical. And I think the companies that actually start to figure that out are the ones that are actually gonna benefit the most and gain the greatest trust from educators and students.

[00:00:55] Daniel Carroll: I feel like this is a year where it's never been a better time in the history of the universe to be a motivated learner. If there's something in the world that you want to learn, it has never been easier. You can have a chat with OpenAI. You can teach yourself with Gemini. There are countless tools where, if there's something that you personally are feeling really motivated about, you've got it better than ever to learn in whatever style, whatever technique, whatever bits of time that you have.

And that's an amazing thing.

[00:01:27] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter and our event calendar, and to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben.

Hope you enjoyed today's pod.

For our next prediction, we have a very, very special guest. Jean-Claude Brizard is President and CEO of Digital Promise, a nonprofit focused on education innovation. He was formerly a senior advisor at the Bill and Melinda Gates Foundation, where he led efforts to close achievement gaps. He's also the former CEO of Chicago Public Schools and has over 20 years of experience in education leadership.

He's also a huge presence in the education and EdTech landscape because Digital Promise is always doing amazing things. Jean-Claude Brizard, welcome to EdTech Insiders.

[00:02:41] Jean-Claude Brizard: Alex, thank you so much. It's amazing to be here with you today.

[00:02:44] Alex Sarlin: It's great to be with you as well. So let's start by looking back at 2025. What do you think was the most significant shift or learning that we got as an industry about EdTech or about AI in education?

[00:02:58] Jean-Claude Brizard: I think initially when AI came onto the scene, though we know it's been there for a long time, a lot of the focus was on workflow, you know, saving teacher time, efficiency, which is necessary. But at the same time, I think it missed the point. What I saw in 2025 was the shift toward talking about curriculum, instruction, and assessment, right?

What does it mean for teaching and learning? What does it mean for not just the way we teach, but what we teach? All these things began to percolate and, frankly, had a foundation in the frustration with learner outcomes. So all the headlines about math outcomes going down, reading, et cetera.

I think that forced people to begin to understand: wait a second, what does that mean? The second thing I would say is that it also forced the conversation around a broader definition of success. We're ultimately looking for an economic outcome for kids, economic mobility. What does it mean for the workforce?

So the discussion went from sort of thinking about workflow and efficiency to what it means for the future of our children.

[00:04:00] Alex Sarlin: And what is a prediction you have for how this is gonna start playing out in 2026? Do you think this is gonna continue? I'd love to hear how you're thinking about this for the next year.

[00:04:09] Jean-Claude Brizard: I think it's gonna continue. I think it's gonna accelerate. In fact, I'm already seeing grants coming from multilateral funders, and work in other nations I have visited, on AI and education, on the real intersection of learning science, artificial intelligence, you know, teaching and learning.

I think it's going to keep mushrooming and keep accelerating. I'm seeing learner graphs being developed by folks here in the US and folks outside of the US. So that kind of granular understanding, for example, getting to the conceptual nature of mathematics, how a learner is actually learning, there's real movement taking place.

Now that some of this foundation is being built, I'm gonna watch and see acceleration take place, frankly, in that particular realm. So yes, it's gonna continue, and frankly I think it's gonna accelerate. One thing I think is gonna begin to show up in the conversation is what this means for assessment, for the way we're actually assessing.

One of my colleagues often says that right now assessment is made to be equally boring, so no one has an advantage. But I have three boys, and I can tell you that at least one of them, who walks to the beat of his own drum, couldn't care less about your assessment. If he sees no meaning in it, he doesn't take it seriously.

So I think that discussion is gonna take root, and I'm gonna see some acceleration there as well.

[00:05:24] Alex Sarlin: And if there's one trend that you're seeing that you think educators, ed tech companies, and policymakers should prepare for in 2026, what would it be?

[00:05:31] Jean-Claude Brizard: You know, I think a lot about R&D. I also think about D&R, or research to practice and practice to research. So how is what we are seeing in the classroom, for example, informing research, and how is research informing what's happening in the classroom? That kind of two-way bridge.

All to say, I would love for educators to be crew and not passengers in this work. Right? I was on a webinar recently, and I had one person in the chat who kept saying, I'm not gonna use AI until it becomes ethical.

Finally, I had to respond. I said, look, it's never gonna be ethical until you get involved in the conversation. This was a system leader. I said, no, you've gotta be in it. So one of the things that we do at Digital Promise, and others do it as well, is this feedback loop system where educators, technologists, and researchers are part of the design.

What we've done with a number of different companies and organizations, like Everway, based in Ireland, and there's a lot of work here in the US on special education: they don't build things unless they have educators at the table, frequently designing with the technologists. So I'm gonna say, please take a look at that kind of construct.

It makes your product go further. It makes it more relevant, because the folks who are using it and will use it know what it does and have a hand in designing it. So that kind of research-practice-industry partnership, frankly, is what I would love for folks to get involved in.

For the educators: be crew in that, not passengers. For the technologists: if you really want market share, if you really want outcomes from your product, involve the user in the design, or perhaps the further design, of the work you're trying to get done.

[00:07:02] Alex Sarlin: I love that metaphor of being crew, not passengers, for educators.

And we've seen this sort of rise in what some people call co-design, the idea that educators, as you say, are in the room, are part of the design process. There's even the potential for educators to create their own tools through vibe coding, using some of the existing tools, and for that to be an influence on what happens in ed tech as well.

And I think we're getting to this really interesting leveling of the playing field, where technologists and educators, who once really saw the world in very different ways, are starting to be able to speak each other's language and work together for ed tech outcomes and for learning outcomes. It's a trend that I think we've started to see really grow this year.

I love this prediction that next year this is just gonna become an accepted part of the practice, especially when it comes to AI, where people who are skeptical can actually lean in and be part of the process.

[00:07:52] Jean-Claude Brizard: I think the days of going into a corner and designing and building a new tool are over, because you have no idea what the market is going to demand and ask for.

So many technologists think they know what educators want, and educators perhaps know what they want but don't know how to influence the technologist in the build. Even at ASU GSV, you're beginning to see this real interaction now. I mean, you know, the folks at ASU GSV have been really focused on getting state education chiefs, superintendents, and teachers to attend the conference in San Diego, because they know that triangle needs to be built and that feedback loop needs to actually exist.

So we're building tools that will serve the needs of the kids we have in front of us in our schools.

[00:08:34] Alex Sarlin: Last question here: is there anything that happened in the last year that surprised you, that you didn't see coming, that you feel would be interesting to highlight for our audience?

[00:08:43] Jean-Claude Brizard: You know, not to get too political, but the push to dismantle the Education Department. I had predicted it wouldn't happen.

You would see maybe a bit of a neutering of it, but not really removing it. Because in this day and age, leadership at a centralized level as a nation is more important than ever, and to see us outsourcing that to six different agencies is only gonna create more bureaucracy for our leaders, right?

In doing the work that needs to be done, more incoherence in the effort. I really wish that our secretary and our president would come back and think about this. When you look at other countries and what they're building, building centralized systems, more than ever we need that kind of centralized leadership, as rapidly as AI is actually moving.

That surprised me, and frankly, it's given me a bit of heartburn. My hope is that as we move forward, and perhaps see competition from other countries, we will take a second look and say, how do we really create coherence? The one good thing that may come out of this is a bit of a forcing mechanism to get our state education leads and chiefs to take up the mantle and lead their states, to move them from compliance to, frankly, innovation.

That's my hope, but it's gonna be much harder than having a body that is perhaps a spokesperson, the leader in pushing this discussion forward.

[00:10:00] Alex Sarlin: Thank you so much. This has been really, really fascinating. Jean-Claude Brizard is President and CEO of Digital Promise, a nonprofit focused on education innovation that is constantly publishing really, really interesting work about education, EdTech, and ed reform.

Thank you so much for being here with us on EdTech Insiders. Thanks, Alex. For our next prediction, we are talking to Sunil Gunderia. He is the co-founder and CEO of a professional learning venture that applies science to unlock human potential; it is in stealth mode. Formerly at Age of Learning, he led the development of programs validated in 30 ESSA-aligned studies.

He's a venture partner at Emerge Education, Vice Chair of the EDSAFE AI Industry Council, and a board member of InnovateEDU and the Children's Institute. Sunil Gunderia, welcome to EdTech Insiders.

[00:10:52] Sunil Gunderia: Hey Alex, thanks for having me. I'm super excited to talk about AI's potential, but certainly one that's guided by evidence and centered on human relationships. I think 2025 gave us proof points for both of those things, as well as warning signs.

And there's so much opportunity in 2026 when you look at proof and warning at the same time to really develop out what it can be. 

[00:11:15] Alex Sarlin: Yeah, evidence and human relationships. I'm with you. So looking back at 2025, let's talk about some of the shifts and learnings we've gotten about AI and education. What do you think stood out this last year?

[00:11:29] Sunil Gunderia: You know, what really stood out for me was this study that Ed recently published, I think it was in November, with Google DeepMind, which showed that AI-supported human tutors matched the quality of human-only tutors on both, I think it was, immediate correction and deeper conceptual feedback.

What that tells you is that high-quality tutoring may actually be scalable, and not because AI replaced tutors, but because it augmented them in a way that preserved the need for human judgment while improving precision. I think Ed's commitment to extend and replicate that work, and to continue to build an evidence base on what works, is really important.

You know, at the same time, looking at 2025, you gotta look at the peril. There's been a lot of talk, and I think rightfully so, about AI companions and how they're spreading among our youth, adolescents in particular. Common Sense Media found that about a quarter of our teens are using AI for emotional, relationship-like interactions.

What we learned from the tragic story of Adam Raine and his family is that the risks are just impossible to ignore. What I'm encouraged by is that we are actually seeing bipartisan momentum on AI safety and transparency. And also what's encouraging from an EdTech industry perspective, Alex, and I think you'll really appreciate this, is the work of Erin Mote and the EDSAFE AI Alliance, which launched an AI companion task force so that we can be a part of a solution around safety rather than just simply advocates of more of the same.

And I think this work is really essential for ensuring that the benefits of AI are scalable, but not at the expense of student wellbeing.

[00:13:11] Alex Sarlin: Yeah, it's a tough line to walk. And we just published an editorial from Erin Mote and Michelle Culver about moving at the speed of trust, about how AI companions have lots of potential and lots of risk, and the conversation has to be nuanced and thoughtful.

But I appreciate you bringing it all up. And yes, that DeepMind study was fascinating. 75% of the answers that the AI gave were just accepted wholesale, without any changes by the tutors. And of the ones they changed, half were just tiny edits, often deleting emojis, because it used a lot of emojis.

Yeah. So I mean, exciting. 

[00:13:47] Sunil Gunderia: And just to add to that, the AI was able to respond without going off task or doing anything that was high risk or would've led to high-risk behavior.

[00:13:56] Alex Sarlin: Exactly. So lots of promise, still some peril. Uh, there's definitely things to work out. What is a bold prediction that you have?

It can be a bold prediction or a regular prediction, but we love bold predictions: how will AI change teaching and learning in 2026?

[00:14:10] Sunil Gunderia: 2026 is gonna be about moving from generic assistance to true instructional intelligence. And I think this is gonna be across K-12, higher ed, and workforce. The move will be from this generic learning system to genuine support for teachers, professors, and trainers.

It's something that you referenced in a recent interview with Teaching Lab, this idea of curriculum-informed AI. So AI tuned to sound pedagogy, rather than just the wisdom of the internet, will dramatically increase accuracy and usefulness. And as I'm thinking about 2026 and what we wanna build, and what I think great companies are gonna build, it's going beyond curriculum to science-informed AI as the next paradigm.

And what I'm thinking there is this idea of a three-sciences framework, where curriculum science, learning science, and instructional science all come together to become a real partner. I think it'll not only help educators make better decisions in the moment, it'll really give learners more precise support. Evidence-anchored AI is what will define 2026.

[00:15:15] Alex Sarlin: Evidence-anchored. I love that phrase. I wish I'd made up the curriculum-informed phrase. I used it, but it was not my coinage; it's been something going around the industry, and I think it's really powerful for all the reasons you've said. You can take the best learning science, the best evidence, and bring it to the fore at the moment when it's most useful.

Incredibly powerful. So, Sunil, if there is one thing that you think is gonna happen in 2026 that educators, companies, or education policy makers should really prepare for, what do you think it would be? 

[00:15:44] Sunil Gunderia: As a systems person, I think of it across educators, companies, and policy makers, and it's really alignment around a goal towards optimizing the promise of AI while minimizing the peril.

You know, with my work with the EDSAFE AI Alliance, I think the SAFE framework, with safe, accountable, fair, and effective as the baseline for any AI used in learning, is really critical. And fundamentally, you know, I spent time a couple weeks ago with Isabelle Hau, advocating for the idea of learning to love to learn.

And as AI becomes more present in our learning, the fundamental question is: how do we strengthen human connection versus eroding it? Solving for that is really important.

And then, going back to what we talked about earlier, how do we bring all the sciences, curriculum science, instructional science, as well as learning science, into our solutions, and coalesce around producing evidence of what's effective, both from a learning and a relationship perspective?

[00:16:50] Alex Sarlin: Yeah, so sort of building that evidence base of what works for learning and AI, what works for relationships and keeping that human connection, and then implementing it. So it's sort of parallel tracks, right?

Building the evidence base, 'cause it's such a new technology, and then using and implementing it. And using, as you say, the SAFE AI framework is gonna be a really big way to stay on track as AI develops. This has been a pleasure. Sunil Gunderia is the co-founder and CEO of a professional learning venture that applies science to unlock human potential.

It is in stealth mode; I think it will be out of stealth mode hopefully soon. He is formerly the CIO at Age of Learning, where he developed programs validated in over 30 ESSA-aligned studies. Thank you so much for being here with us on EdTech Insiders.

[00:17:34] Sunil Gunderia: Thank you, Alex. And thank you for the EdTech Insiders community.

It's a wonderful community to be a part of. 

[00:17:40] Alex Sarlin: For the next prediction, we have a terrific guest. Sofia Fenichell is the founder and CEO of Study Hall.AI and College Co-pilot, AI-powered SAT and ACT prep built by former College Board and ACT insiders. She also created Reading Co-pilots, partnered with Penguin Random House, HarperCollins, and MIT Technology Review.

One of my favorite publications. She's a Google Accelerator alumna, and she previously founded Mrs. Wordsmith. Sofia Fenichell, welcome to EdTech Insiders.

[00:18:15] Sofia Fenichell: Thank you Alex. So happy to be here as we close out the year. 

[00:18:19] Alex Sarlin: I am happy to be here with you as well. So speaking of closing out the year, what is your take on the last year?

When you look back at 2025, what is the most significant shift or learning about AI and education or about EdTech in general that you have seen? 

[00:18:36] Sofia Fenichell: Well, lots of sleepless nights, that's for sure. For all of us. But for me, I think the overriding theme is that the content cycle collapsed. And we have a sector that's very content centric.

Content is the foundation of knowledge and of everything we consume and pass on to our children. And so that collapse means something. Attention is now hyper-fragmenting, and that's a problem, because education requires deep focus, right? We've got to process knowledge. And even with AI, especially with AI, we wanna go deeper.

So I'm really thinking a lot about how we get there. How do we go from content super-scaling, fragmentation, and attention deficit to deep focus?

[00:19:25] Alex Sarlin: So looking forward to 2026, do you feel like we have any bead on how to get from that fragmented attention environment to deep focus, to using AI and other ed tech tools to actually go deep in our educational experiences?

Well, what do you think is coming in 2026? 

[00:19:41] Sofia Fenichell: Yeah, funny you ask, because I do have an idea of what I think might work. And I think that 2026 is the year that the AI tutor becomes the hero story. We had that wonderful experience with Sal Khan and his son and that video that went viral, and we were like, wow.

Like, that's just really exciting. Can that work? And I think it's gonna work. I think we're gonna get multimodal engagement. I think that users are gonna take agency over their learning. The immediacy of ordering that midnight pizza and whipping up an instance of a tutor is gonna become exciting as these tutors get better and lower in cost.

And we know that cost is a massive pain point for all parents. Also all schools, but all parents too, right? We're struggling here. And the precision data, I think, is gonna finally bring in the magic, because the efficiency that can come from the appropriate uses of data is transformational.

[00:20:42] Alex Sarlin: Yes, I am very excited to see that come true.

I feel like the potential is there. The precision data, as you mentioned, can be a giant unlock. Sal Khan announced that AI-tutor-for-the-world story right as AI came out, and I think it may have been a little precipitous, but I think you're right. Maybe we're finally getting to that point where the AI tutor can be part of the landscape in a really exciting, meaningful, and transformational way.

And what do you think is a trend for next year? If there's one thing that's sort of coming around the bend that you feel like educators in classrooms and universities, ed tech companies, and policymakers should look out for, what would that be?

[00:21:18] Sofia Fenichell: So it's a little bit obvious, but I think we're gonna see some of what I call juggernaut (is that the right pronunciation?) use cases in reading and in high-stakes exams like the SAT. I mean, obviously I'm biased, 'cause I'm very deep into these things. I think that we're gonna get to a point where we can start to move the needle on outcomes through really intentional engineering, context engineering, UI design, get past the '-ification' of AI, where everything is 'let's embed a quiz in a bit of AI,' and move into juggernaut use cases.

That's what I think people need to pay attention to, particularly governments and particularly schools, because historically we have deployed capital into reading, high-stakes exams, and math (not a huge area of my expertise), and that's meant something. And you would expect a return on that capital.

And so we need to get a return on our dollars now for how we're deploying AI. That's just fair.

[00:22:15] Alex Sarlin: Tell us a little bit about how you're defining that sort of juggernaut use case in reading. What would that look like? 

[00:22:21] Sofia Fenichell: Well, for us, and I'll tell you, we've had quite a journey on this one, and we've made super-cycle mistakes or mini-cycle mistakes without launching.

So everyone's waiting for the deep reader that is yet to see the light of day. And I think it's because we learned that where we started out, embedding quizzes inside Charlie and the Chocolate Factory and The Coming Wave, is not really an engaging use case. It's kind of doing what we did 10 years ago and last year.

And so we think that AI where the user takes the first step, has the dance card, and says, I am intentionally engaging with AI, to do research, to read something and annotate it, to learn on my own, to navigate a book, is the way forward. So we've been developing an annotation coach, and that's all you get inside the book.

You wade through that book. You go deep and you annotate. You take your notes, you get advice on your annotations, you get some feedback, you can save them, and you basically go deeper into reading, but keeping it simple and trying to go deep. I think that those are gonna be the big use cases. Simple and deep.

[00:23:27] Alex Sarlin: That's very interesting. So putting the user sort of in the driver's seat, and when they find something inside a book, something that they want to go deep on and learn about, they can annotate it. And from there you sort of unlock all of this AI functionality to go deep, learn about it, contextualize it, maybe get to transfer.

That's an exciting vision. 

[00:23:46] Sofia Fenichell: Yes, and aligned to your goals. So when you onboard, we ask you: I have AP English, you know. My son actually is really annoyed with me. He's like, why wasn't this ready when I took my Shakespeare GCSEs? Like, I hate reading Shakespeare. Well, most people do, but with a reading coach you might go deeper and get those annotations right. But maybe you're just reading The Coming Wave for pleasure, MIT Tech Review for pleasure.

And you just want to learn and have it be your lifelong learning companion. Or use it when you're writing your blogs to go back and say, wait, I kept some notes with AI powered annotations. 

[00:24:23] Alex Sarlin: Really, really interesting. These are great predictions. I mean, I think there's something really powerful here; there's some really interesting research about Generation Z using AI for learning really extensively and very frequently.

The number one use case is actually learning new things and skill building as they use AI, and they use AI a lot. So the idea of giving them these superpowers, and they trust educational tools, by the way, which has really been nice to see. You know, we get a lot of pushback. We talk on this podcast a lot about people worrying about screen time and worrying about privacy and all these things.

But I think for young people, for whom this is new, exciting technology, the idea of being able to pick something you wanna learn, go deep, make sense of it, contextualize it, get it in your context, is incredibly powerful.

[00:25:07] Sofia Fenichell: I think so. And believe me, we tested Socratic tutoring. It doesn't work, because it has a cold-start issue.

You know, ask a kid a question and they're gonna turn off, right? They've got to drive it. It's gotta be what I call a Harkness system, which is a little bit flipped from a Socratic one. But I also think, like, I love AI for learners, and I know there are huge risks with it. You know why?

Because AI gets you reading. Okay. It might shortcut writing a bit, and that is a problem. And I don't have all the answers to that, and probably no one does. But what I do know is I've never read more in my life than this year because I'm constantly using AI and I'm constantly reading and assessing the outputs.

And that is an act of critical thinking. And so we have to find a way to channel learners down that path of doing research and making annotations and teaching themselves. 

[00:26:03] Alex Sarlin: That's really, really interesting. Fantastic predictions. I really appreciate all of your specificity and sort of the thinking about where this is all going.

Sofia Fenichell is the CEO of Study Hall.AI and College Co-pilot. She also created Reading Co-pilots. All of these things are launching in a big way over the next year. She also previously founded Mrs. Wordsmith. Thank you so much for being here with us.

[00:26:26] Sofia Fenichell: You're a legend, Alex. We love being with you. You're the best.

Thank you so much. 

[00:26:31] Alex Sarlin: For our next prediction, we have an all-star predictor who had an amazing set of predictions last year that actually went quite viral on LinkedIn. It was quite amazing. Amit Patel is co-founder and managing partner at Owl Ventures, where he invests in leading ed tech companies like Amira Learning, Apna, and Honorlock.

His past investments include Codecademy, acquired by Skillsoft, and Thinkful, acquired by Chegg. He founded Personal Academic Trainers and served as Director of Technology at Success Academy. He holds an MBA and MA from Stanford. Amit Patel, welcome to EdTech Insiders.

[00:27:11] Amit Patel: Well, thank you Alex.

Thank you for having me back again this year. Really excited to have this discussion with you. 

[00:27:15] Alex Sarlin: Absolutely. It's great to see you again. So let's start off and look at 2025. What do you think was the most significant shift or learning that we had in the EdTech space, especially regarding AI?

[00:27:26] Amit Patel: I think one of the things that AI has really made possible, and we're seeing this in general technology as well as EdTech, and it's a really fantastic trend, is this idea of proving value with this technology and tying payment to the actual value that you're delivering.

I know for a while this concept of outcomes-based contracting has been discussed, but with AI and the telemetry that AI actually offers, this continuous stream of data on efficacy and outcomes that you can get, outcomes-based contracting has finally become possible to implement at scale.

A great example of that, in our portfolio, is Amira Learning and the work that they did in Duval, in terms of starting to help the district with its focus on reading outcomes for the students within that district. That is something that fundamentally, I don't think, was actually possible before AI existed.

And this is something that I think is only going to continue to expand as you go into 2026. And for students, for educators and for ed tech companies, I think it's just gonna lead to higher quality solutions in the ecosystem. 

[00:28:39] Alex Sarlin: Yeah, it's an exciting moment. Looking forward to 2026, what is a bold prediction that you have for how AI will change teaching and learning?

[00:28:47] Amit Patel: Yeah, so I think one of the things for 2026 that we're starting to get a preview of, and that is only going to become even more critical and relevant as things go on with AI, is consistency of AI responses. We've all talked about hallucinations, and we've all talked about the reliability that exists with the current models. But if we actually want, in cases like education, to truly depend on and trust AI, we can't have two different teachers ask the same question and get two different responses, right? Whether they asked on Monday or Tuesday, or whether they're in different parts of the country. And so this idea of, hey, we actually need to know that we are going to get the same consistent response back, that there is a level of trust that exists.

And this is something that students can depend on. This is something that teachers can depend on, administrators can depend on, and parents can depend on. And so this idea of starting to get more consistency and reliability and eliminate these variances that we're seeing is going to become even more critical.

And I think the companies that actually start to figure that out are the ones that are actually gonna benefit the most and gain the greatest trust from educators and students. 

[00:30:00] Alex Sarlin: Hmm. Very, very interesting. What is one thing that you think that educators, ed tech companies or policy makers should keep an eye on in 2026 that's sort of coming around the bend and may change the landscape in some important ways?

[00:30:15] Amit Patel: Yeah, so I think this last year, what you've seen is: hey, listen, this is not a genie that we're gonna be able to put back into the bottle, right? This is going to be a part of not only how students learn, but how adults work. And so I think what is going to become critical from educators' perspectives and policymakers' perspectives is proving identity.

Who actually created this work? Who actually learned what they said they produced, and how can I actually verify that? Right? And if you think about that from an educator's perspective, the reason is: hey, listen, maybe AI was used in actually producing this, but how can I think about assessments that actually verify that the student understands this?

Am I gonna start to think about oral exams, or more real-time work that they actually have to produce? And it's not about policing these things, but really starting to think about those next-generation assessments and how you can actually verify that the student has used this technology in a way that is allowing them to actually learn.

And then from policymakers' perspectives, it's gonna be important to set up the right legal frameworks so that people can protect their work, right? This is going to be critical. There are a number of cases that have been going on, and there's a lot of policy that's actually getting created.

And so from all of those perspectives, it's really going to come down to understanding the identity of who created that, and how deep an understanding those people have of what it is they say they've actually learned or created.

[00:31:44] Alex Sarlin: We've heard a couple of different predictions about the assessment landscape, and the way you're bridging intellectual property at the adult level and assessment of knowledge at the student level is really intriguing. They sort of have the same underlying mechanics, which is using AI to actually determine what is going on in somebody's mind.

Sort of actually getting into the details of the relationship between a person and an AI creation. It's a really interesting way to put the pieces together.

[00:32:13] Amit Patel: The beautiful thing about this technology, right, is that it's unlocking secrets that previously were very difficult to actually get access to.

You know, simple things like: hey, I wanna learn how to run a sub-four-minute mile. How do I actually do that? Previously, you needed access to a coach or you needed to find somebody that actually knew how to do that, and now you can get a step-by-step process of exactly how you're going to make that happen.

But the flip side of that is, you can now have very complicated ideas that become very quickly accessible to somebody who only has a surface-level knowledge, and then they're coming in and pitching themselves as an expert on the topic. And so you have to understand: is it important that I ensure there is a deep and detailed understanding of this topic?

Or is it something where, hey, listen, if there is that surface-level knowledge, that's good enough? And so I think there's gonna have to be that distinguishing factor of what is really important as you're taking all of those things into consideration.

[00:33:17] Alex Sarlin: It's a really interesting way to look at it.

I'm curious if anything in 2025 surprised you. It's sort of a bonus question; we haven't asked all of our predictors this. But as somebody who observes the landscape very carefully, what surprised you in 2025?

[00:33:30] Amit Patel: Yeah, I think the biggest thing that honestly surprised me is the talent that we are seeing self-select into the education technology sector.

And what I mean by that is we are seeing everything from second-time founders to the world's leading educators to technologists who could be doing anything across technology, all saying: the place that I actually wanna focus my talents and my life's work on is the education sector. I mean, these are people that could do anything that they possibly wanted to do in the world.

And the fact that they are all self-selecting to say, where I think this technology can be used and have a profound impact on the trajectory of humanity is education, has been absolutely amazing. I'm so excited about a lot of the founders that we have the good fortune of working with within our portfolio.

But more broadly, I'm even more excited about just all of the talent that's kind of coming into the sector and saying, this is what matters and this is what we need to be focused on. 

[00:34:37] Alex Sarlin: That's a great point. I mean, these reports keep coming out saying that the number one use case for AI is learning, and huge amounts of learning are happening.

I think people are starting to realize that education technology, a field we've always loved, is an incredibly powerful field and a place where AI can really shine and truly change the world. It's amazing to see such great talent coming into the space. I agree. I've interviewed some people whose backgrounds are just mind-blowing; one founder I talked to has 280 patents.

Wow. One of the most prolific inventors in the world, running an EdTech company. Thank you so much, Amit. This has been so interesting. Amit Patel is co-founder and managing partner at Owl Ventures, where he invests in leading EdTech companies like Amira Learning, Apna, and Honorlock. Thanks so much for being here with us on EdTech Insiders.

[00:35:23] Amit Patel: Thank you, Alex. Thank you for having me.

[00:35:26] Alex Sarlin: For our predictions episode, we are speaking to Dr. Soren Rosier. He's the founder of PeerTeach, an AI-powered peer tutoring platform serving over 30,000 students nationwide. He's a former Oakland middle school teacher who went on to co-lead Stanford's Learning Design and Technology program, and he has spent his career building and teaching at the intersection of learning science and ed tech.

Absolutely amazing. Dr. Soren Rosier, welcome to EdTech Insiders.

[00:35:55] Soren Rosier: Thank you so much for having me, Alex.

[00:35:57] Alex Sarlin: So before we get into the predictions, I want to give you a moment to tell the world, anybody who has not yet heard about what you're doing at PeerTeach with peer tutoring: tell us about it, because it is an incredibly interesting and powerful model.

[00:36:11] Soren Rosier: Yeah, for sure. There aren't many folks that do what we do, so it does take a little explaining. In a nutshell, we set up math classrooms to do peer tutoring, and we do that by using micro-assessments to figure out the strengths and the weaknesses of every kid in the classroom. Then we train all of the kids in the class to be effective peer helpers, to use with their peers the kinds of teaching and helping strategies that a teacher might use.

And then we facilitate the interaction between the kids, so it's really effective and fun.

[00:36:39] Alex Sarlin: That social collaborative learning, that peer tutoring is one of the most powerful and underused aspects of education. I've been a huge fan of the peer tutoring literature for so long, and I just feel like it is so rarely done and especially so rarely done with the level of nuance and learning science that you do it with.

So it is really exciting. So let's talk predictions, starting with last year. When you look at 2025, what were your most interesting and most significant insights about the EdTech space, and in particular about AI in education?

[00:37:15] Soren Rosier: What struck me the most last year was just this interesting paradox where AI has never been more powerful than it is right now.

It can genuinely transform how learning happens in classrooms, and at the same time, we're starting to hear this building chorus from teachers, from parents, and even from students themselves that there's too much screen time, and there's even a lot of concern that AI is starting to prevent social interaction from happening in classrooms.

Classrooms used to be these spaces where kids would go and meet friends and learn how to be social creatures and work with others. Now they're becoming more and more spaces where kids can be isolated on the computer. And to be honest, a lot of kids don't learn that well just working directly with a screen, and these concerns around too much computer use are valid.

A lot of different bodies have measured the ways that kids are on screens too much. They're using social media too much. They're becoming addicted to interacting with AI and not becoming comfortable enough interacting with other students. And that's why we actually think that maybe the best way to use AI in education is to use a lot of this powerful AI and machine learning to set kids up to have better person-to-person interactions in the real world.

Maybe we can use a lot of this powerful technology to set up really good conversations between two students, to figure out which students should be talking to each other at any given moment, and then to grease the wheels for that conversation, to make the kids really good at helping each other, and to be that coach on the side that supports kids in having the best conversations possible.

'Cause that's where a lot of the richest learning can happen, between kids.

[00:38:51] Alex Sarlin: A hundred percent. So when we look at 2026, what is one bold prediction that you have for how the EdTech landscape might change? 

[00:39:00] Soren Rosier: I've been seeing a lot of little percolating instances of this, but I think we're now getting to that moment in time where, regarding assessment and evaluation, we're finally gonna move past right and wrong answers.

For decades, when we've been trying to build personalized learning systems or anything in the class that allows kids to be more self-paced, it's just been focused on: is the kid getting answers right or wrong, and therefore how do we move forward in the progression of the lesson? We're finally at a point in time where the technology has become so powerful at taking in long-form thinking from students, whether it's in the form of them talking or their writing, to understand intricate details about their thinking.

Is this student understanding this concept on a deep or conceptual level? What are the small misconceptions that are cropping up for them?

And just building a more nuanced understanding of what's going on for this kid. I predict that a lot of these technologies are really gonna take off in the coming year, because the base technology has made such huge strides, and now it'll just be a matter of applications incorporating this really powerful technology to do these higher forms of assessment.

And at PeerTeach, this is a lot of what's driving our future development work, where a lot of what we're building right now is focused on how we can have an AI listen to a conversation that's happening between two kids and make sense of the most effective ways that one kid is actually helping the other student.

Are they explaining the concept fully? Are they asking strong questions? Are they checking for understanding? Are they doing a lot of these strategies that we know are really important for good communication? And then, if we do that deeper, more nuanced assessment and evaluation work, we have the possibility to give feedback to the teacher and to the student themselves, in order to help them grow in ways beyond just becoming a better mathematician, where they can actually become a better leader.

You know, they can become more empathetic as a result of getting feedback on how to execute these different teaching practices. 

[00:40:54] Alex Sarlin: That's a really powerful and very exciting vision. I mean, that is such a fascinating way to think about the future of assessment: that you're doing something realistic and authentic, and AI can actually assess different aspects of it. That is a truly inspiring vision of the future of assessment.

[00:41:10] Soren Rosier: Yeah, thanks. Thanks for saying that.

[00:41:12] Alex Sarlin: Last question for you: if there's one sort of trend that you see coming for 2026 that you feel like the whole space, you know, educators on the ground, companies who are making and developing ed tech, policy makers perhaps, should keep an eye on, what would that trend be?

What do you think is gonna happen in the new year that everybody should sort of watch as an underlying trend?

[00:41:33] Soren Rosier: What really gets me is the pace of development that's occurring right now. It's so rapid, and it's gotten to the point where it's completely outstripped our normal institutions' ability to keep up with it.

I think there are a lot of problems with that, and I'll give an example. I'm about to publish a paper that I first submitted to a journal 18 months ago, and by the time it's actually in print, which will be a few months from now, a lot of what I had learned 18 months ago, a lot of what I wrote down about our findings, will already feel dated, and it won't take into account a lot of the most important new technologies that have been developed in the interim.

And so I think this glacial pace of the research side and the way that traditional publishing happens has to change. Part of it is on our industry and our field as a whole to amplify and encourage educators and school people themselves to be builders. But it's also on researchers to engage in more research-practice partnerships, where they're actually in the field, building in classrooms with teachers and students, using the most current and innovative technologies in their research to see if they can push them as far as possible in order to inform these conversations.

And then, instead of going through these archaic, glacial publishing processes, to figure out ways to get their ideas out into the world more rapidly so that they can inform the conversation.

Because we don't want this to be an education world that's just completely driven by companies and completely driven by engagement metrics and profit and revenue. We want it to be informed by experts in the field, by teachers, and by evidence. And so I'm hoping that the coming year involves a lot of people thinking deeply and hard about how to make sure we're incorporating teachers' voices and having researchers be a more meaningful part of the conversation than they have been in the past.

[00:43:27] Alex Sarlin: Accelerating the product development cycle, and accelerating the research and dissemination cycle and the people doing it, to start to match the speed at which the technology is developing feels like a very important imperative for our space in particular. Thank you so much. This has been fascinating.

Really deep thoughts, and I am leaving this conversation genuinely inspired about the future, about 2026. Dr. Soren Rosier is the founder of PeerTeach, an AI-powered peer tutoring platform serving over 30,000 students nationwide. Thank you so much for being here with us on EdTech Insiders. Thanks for having me, Alex.

I appreciate it. For our next predictions episode, we are here with Jomayra Herrera. She is an absolute veteran, an all-star EdTech superstar, and has been on the EdTech Insiders podcast since the very beginning. She's also a partner at Reach Capital, an early-stage venture fund investing in learning, healthcare, and the future of work.

Jomayra started her career as an operator at an EdTech startup called BloomBoard, and eventually started a career in venture capital at Emerson Collective, Laurene Powell Jobs' family office. Before Reach, she was at Cowboy Ventures, where she spent most of her time with consumer internet and marketplace companies.

She's championed investments in a huge variety of different companies: Contra, Career Karma, Handshake, Guild, and more. At Reach she works closely with WorkWhile, Stepful, Food Health, Cotenor, Winnie (we've had Sara Mauskopf from Winnie on the podcast a lot of times), KaiPod, which is amazing, Manifold, and many more.

Jomayra Herrera, welcome to EdTech Insiders.

[00:45:04] Jomayra Herrera: Thank you for having me. I feel like at this point, have I visited the pod the most out of anyone?

[00:45:09] Alex Sarlin: I think you might be the number one most frequent podcast guest. And that's an honor. I think it's an honor.

[00:45:15] Jomayra Herrera: It is. 

[00:45:16] Alex Sarlin: It's great. You always add so much value and your episode with Lyman Miser was one of our most listened to of all time.

It was really amazing. Really, really popular episode. It was a great one. Welcome to EdTech Insiders. Let's talk predictions. So let's start with looking back. Looking back at the year that was, at 2025, what do you feel like was the most significant ed tech story, especially regarding education and AI?

[00:45:41] Jomayra Herrera: So I think 2025 was the year of adoption and experimentation.

This was the first true school year where districts had to grapple with AI. They could no longer just block it. They had to develop AI policies. They had to figure out what professional development looks like for their staff and for their teachers in order to actually adopt these technologies in their schools.

Teachers were more willing to experiment with these technologies, and so I call it the year of adoption, not necessarily successfully, but the year of actually experimenting and taking it seriously, as opposed to trying to push it to the side, ignore it, and block its usage. I think there is now an embrace of figuring out how best to leverage it in the classroom and in the context of schools.

[00:46:32] Alex Sarlin: Yeah. So if 2025 was a year of experimentation and adoption, what will 2026 be? 

[00:46:39] Jomayra Herrera: So my hope, and I am cautiously optimistic, is that this will be the year of figuring out what are going to be the highest leverage use cases for AI within the classroom, and specifically within the context of curriculum and instruction.

So my hope is that in 2026 we start to move much deeper into the actual practice and art of education. I think a lot about companies like PHI that are focused on how you leverage AI in the context of special education, an area that's super underserved and where AI can make a really big difference in personalizing instruction for these students and ensuring that you're actually aligning your practices with the student's IEP.

And so my hope is that AI in 2026 will be more about really thinking about how we can leverage technology to reimagine what's happening in the classroom. And I don't say that in a fluffy way; we've been saying technology is going to reimagine the classroom for a very long time. But my hope is that it actually gets much deeper into the curriculum and instruction side of the classroom.

[00:47:53] Alex Sarlin: That makes sense. And what is a trend in the space that you feel like educators or EdTech companies or policy makers should keep an eye on that's, you know, rising as we go into 2026, what would that be? 

[00:48:04] Jomayra Herrera: So I am racking my brain because I think I might have said the same thing last year, and I actually don't remember, but I'm gonna say it anyways.

It is school choice. As you can imagine, we're seeing school choice legislation happening across the country, and that has ramifications for everything from schools to school districts, and obviously attendance, to policymakers, to companies. I mean, for the first time, probably in the history of investing in ed tech, the idea of investing in a company that enables alternative learning opportunities in a K-12 setting is viable now.

That wasn't a viable thing before the mass ESA legislation that has happened across the country. And so even at Reach we have backed companies like Pathfinder, which focuses more on the FinTech side of ESAs, and also KaiPod, which you mentioned earlier, which is more about providing the operating system for micro schools and these alternative school settings.

That's possible now because of these policy changes, but it has ramifications across the board. School districts have to grapple with the pressure of attendance potentially dropping. Policymakers obviously have to grapple with what it means when there is a high amount of demand for more school choice.

And companies then have to adapt and figure out what makes the most sense in terms of business models and how they can tap into these additional dollars. I think it's likely going to be one of the biggest shifts we've seen in K-12 education in the last 50 years, maybe, in terms of just what school means for the average person.

[00:49:46] Alex Sarlin: That's very well said. And let me ask a follow-up question on this, because this is something I've been wrestling with. I think this school choice, this ESA landscape, is fascinating for all the reasons you said. But one thing I'm trying to figure out, and I'd love to hear your perspective, I know you think about this a lot, is how close we are to the moment when an EdTech company that wants to succeed and sells to schools can actually sell to micro schools and alternative schools and all of these new education options that are popping up as their beachhead core demographic.

Is it viable yet, or are we a few years out from that? Because that feels like it's gonna be key to the shift you're talking about, where companies that are used to thinking about public schooling, and all the legality and compliance and buying cycles and all the things that come with that, can actually start to have a true alternative market to sell into.

Do you see that coming next year, or are we still far out from that? 

[00:50:44] Jomayra Herrera: So, as you know, we invest on 10 year horizons. And so we think it's viable today in the sense that there's enough market today to hit the growth rates that you need to hit in order for you to get to a multi-billion dollar valuation, which is what we're investing in within 10 years.

So like working backwards now, is it going to be the majority of the market next year or even the year after? No, but there is enough market in the next year or the year after to grow very, very quickly and to grow into a meaningfully sized company. And then what we are betting on is that in 10 years, and don't hold me to this, you know, venture capitalists are more wrong than they're right.

But in 10 years, it will be a significant enough market that it can support a venture-backable or venture-backed opportunity. And I don't think it'll be as clear cut as, you know, call it micro schools and alternative learning options versus public schools. I think it's gonna be a little bit more hybrid, and we're seeing this even today.

Mm-hmm. We're seeing public school districts figure out how they can provide a la carte offerings that tap into ESA dollars, or how they can partner with companies like KaiPod in order to offer micro schools within a public school setting. And so I don't think it'll be as clear cut as we think, but I do think that the idea of building a business model off the back of the ESA changes is viable.

[00:52:16] Alex Sarlin: That's a great point. Fascinating predictions. Really, really thoughtful. And even if you said school choice last year, I don't think you did, but even if you did, I think this may really be the landmark year for that expansion of different school options and different school choices. Thanks so much for being here with us.

Jomayra Herrera is a partner at Reach Capital, an early-stage venture fund investing in learning, healthcare, and the future of work. We always love speaking to you here on EdTech Insiders. 

[00:52:44] Jomayra Herrera: Thanks for having me, Alex. 

[00:52:47] Alex Sarlin: For our next prediction, we have a very special guest. Dan Carroll is an EdTech advisor and investor.

He's the co-founder of Clever, which powers digital learning in more than 95,000 schools. He's a former middle school science teacher, a TFA corps member, and a tech director, and he lives in Brooklyn with his wife and two kids and serves on the boards of Teach for America and Design Tech High School. Dan Carroll.

Welcome to EdTech Insiders.

[00:53:16] Daniel Carroll: So great to be here. Thanks for having me.

[00:53:17] Alex Sarlin: I'm so happy to chat with you. You are really an EdTech legend, and we chat in the WhatsApp group in EdTech Insiders. I feel like you are the consummate EdTech insider. You've been paying attention to this space for a very long time, so I'm really excited to hear your predictions.

Let's start with a reflection from last year. Looking back at 2025, what do you think was the most significant shift in the EdTech space, especially regarding AI and education, which is moving so fast? 

[00:53:43] Daniel Carroll: Got a couple thoughts here. So the first is that I feel like 2025 is the year that teacher tools, teacher AI tools really went mainstream.

This was just the year where, even if you weren't an early adopter, you had access to AI teacher tools to help you differentiate, lesson plan, and create custom materials for your lesson. Instead of doing what I did as a teacher, scouring Google to find that one activity or video that might match up with my curriculum, these days you can just

type in a few sentences and have it created for you, whether it's a video or a quiz or a passage, just the way you want, matched to your classroom. So this set of tools has been around for a few years, but 2025 is the year where I feel like it crossed the chasm from early adopters to something that most teachers have access to and are using, if not every day, maybe every week or every month.

Additionally, I feel like 2025 is a year where it's never been a better time in the history of the universe to be a motivated learner. If there's something in the world that you want to learn, it has never been easier. You can have a chat with OpenAI. You can teach yourself with Gemini. There are countless tools where, if there's something that you personally are feeling really motivated about, you've got it better than ever to learn in whatever style, whatever technique, whatever bits of time that you have.

And that's an amazing thing. We spend a lot of time in the ed tech world thinking about, uh, raising the floor, making sure every student is learning. But it's also incredible to think about those folks who do have their own motivation and their own drive, and them having access to incredible tools to learn whatever they wanna learn.

[00:55:12] Alex Sarlin: So let's look at the future. What's happening next year? How do you think things are gonna pan out in 2026? And specifically, what is a bold prediction that you have? What's a big change that you expect to see? 

[00:55:24] Daniel Carroll: Yeah, I hope this is bold, because it's one that really may or may not happen. It will happen eventually, but I don't know if 2026 is the year. Still, I'm gonna go out on the record and say it: I think 2026 is the year that we see agentic AI make its way into schools.

And I think agentic AI is gonna hit first with school operations. So when I think about agentic AI, I mean something really specific. This is an AI, you know, an LLM, that is running 24/7. It's in a loop. It's not just responding when you ask it a question or when it's prompted. It has access to different systems, it has access to tools, and it is running 24/7 to make changes.

It's your always-on assistant. And the place where I see this coming first to schools is with school operations. There are a lot of administrative tasks, so many administrative tasks, that schools have to deal with, and agentic AI is really great for some of them. And the technology is there.

It's just about building the right systems and doing the right implementations to make it work. So that's my big bet for 2026. We even see some early AI-native SISs. There's one called Scout SIS, where I'm an investor; they're working in the virtual school space. I'm really, really excited about what they're doing, and I can see them making the jump to traditional school districts in 2026.

Another prediction, something that we saw a little bit of in 2025 and that I really see becoming mainstream in 2026, is rich student interactions with AI. So kids talking with a prompt to OpenAI, that's something that happens, but my mainstream prediction isn't that you're gonna have kids in every K-12 classroom just chatting with OpenAI or ChatGPT.

What I think is gonna explode in 2026 is rich interactions: things like multimodal formative assessment with Snorkl, where kids can talk to the AI, draw, type, and show their learning in a variety of different ways, and then get immediate feedback, not having to wait for a teacher to grade their exit ticket. They're getting feedback right away.

They're getting feedback right away. The last rich student interaction area that I see happening already at Design Tech, and I think this is gonna happen across the country, is when you're doing project-based learning, when you're using higher order skills, giving students access to those LLM, the image generation, the LLM directly, to be able to show their learning in a really high level way to help them level up on bloom.

So the student is being the evaluator, the synthesizer, and the AI is helping them create really authentic work that, you know, might have a meaningful impact in their community. It's amazing: when you're using AI tools thoughtfully and harnessing their power, you can go from, you know, maybe a poster that exists just to show your learning, to actually creating a new product or a website.

That's something that's gonna live in your community and solve a real problem. 

[00:57:58] Alex Sarlin: Amazing. Really great predictions. And last question for you. What is a trend or something you see coming that EdTech companies, educators, and policymakers should prepare for in 2026? What do you think is gonna start happening that we should pay attention to now?

[00:58:13] Daniel Carroll: This one I'm pulling straight from our WhatsApp group, but it's screen backlash. There was a big article this week in The Free Press, maybe not this week when this gets published, talking about how we've introduced a lot of technology in schools and NAEP scores aren't where we want them to be, so clearly it's technology's fault and we need to get all the screens out of schools.

I think that's a deeply, overly simplistic argument. There are so many factors that drive NAEP scores, and I don't think technology is one of the top two or three. But I do think that there's some truth there that we need to be really thoughtful about. What we've seen as schools have gained access to more technology, as one-to-one has become the norm, is that, given a chance to do something digitally or with paper, schools are just saying, well, we have the technology there, we wanna use it.

Like, let's just make everything digital. And there are some real drawbacks to that. First, you know, personally, if I'm reading, I'm gonna be much more focused and do a much better job of reading if I'm reading a print book than if I'm reading a PDF on a laptop. Print just has better affordances for deep thinking.

So let's be really thoughtful about when we use print versus when we use digital and embrace the strengths of print: the focus, the lack of distractions, the ability to, you know, have a tangible book or set of materials in our hands. And similarly, when kids go home, let's be thoughtful about when technology is used in homework and when it's not.

As a parent, even with a 6-year-old, figuring out what she's doing on the iPad, whether it's educational or not, whether she's getting distracted, is so hard. And if every single bit of her homework has to use technology, there's no way for me to glance over and know whether she's on task or off task. So embrace the strengths of print for focus and for clarity, while also, of course, not giving up the benefits of digital when you're creating a rich demonstration of your learning, using AI to create a rich project.

That's something that you can't do with print; let's embrace that. When you're having a set of discussions facilitated with CourseMojo, you can't do that with print either. So let's really justify when we're asking students to open up laptops in school or at home. And if there's a real reason, wow, this is a kind of experience that you can't do without technology, let's do that.

But if we're just taking the exact same thing we could be doing with paper and doing it on a computer with no real benefit, then let's keep the print around for all the benefits that it has. And so the screen backlash is real. Whether it becomes a wave or a ripple remains to be seen. But if I'm a principal or a district leader, I wanna really be able to justify when my schools use technology and when they intentionally don't, because both are a critical part of a learning experience for a student.

[01:00:47] Alex Sarlin: Fantastic prediction. Yeah, we've seen technology explode in the post-pandemic era, and I agree with you that there's a pushback moment of saying we may not wanna use tech for everything. There are a lot of things that analog or print can be beneficial for. Finding the right balance, I think, is gonna be one of the big challenges of 2026.

I love that prediction as well. Thank you so much. Really, really powerful predictions. Dan Carroll is the co-founder of Clever, an EdTech advisor and investor, and an EdTech legend. Thank you so much for being here with us on EdTech Insiders. Oh, thank you, Alex. It's a pleasure. For our next 2026 prediction, we have an incredible guest.

Katie Kurtz is the managing director and Global Head of Youth and Learning at YouTube, where she is responsible for developing the ecosystem to engage, inspire and educate kids, teens and adults. Katie joined YouTube in 2019 after serving as Chief product officer at Noodle Partners, designing and building fully online degree programs for universities.

Previously, she was senior vice president and national sales manager for higher ed at Cengage Learning, and before that, vice president of business development at adaptive learning pioneer Knewton, where she led the platform's partnership strategy. Katie Kurtz, welcome to EdTech Insiders. 

[01:02:05] Katie Kurtz: Thanks so much for having me, Alex.

[01:02:07] Alex Sarlin: So you are in the catbird seat, at the number one learning platform in the world, and I am so curious to hear how you thought about 2025 and 2026, especially in this age of AI. Let's start with 2025. Looking back, what was the most significant shift or learning in AI and education that you observed from your incredible vantage point at YouTube?

[01:02:33] Katie Kurtz: I think the most significant shift in 2025 was the realization, and maybe the broad consensus, that AI can improve learning outcomes for students and can lessen administrative load for teachers, but it will never, can never, replace the essential role of the teacher and the inspiration that they provide. Teachers are the magic, whether that's inside or outside of the classroom.

We see this in the 85% of teachers around the world who say they use YouTube video in their classroom to help unpack really complex ideas, and I think it just reaffirmed that the essential mechanism for human understanding and teaching is storytelling, and AI has not and will not replace that.

[01:03:16] Alex Sarlin: Yeah. And what do you think going forward in 2026? What are you looking forward to? What's a bold prediction that you have for how education is going to continue to evolve, especially, again, in this crazy era of AI we're all living through, in 2026? 

[01:03:31] Katie Kurtz: I think 2026 is gonna be the year where we have a universal commitment to AI literacy as a foundational competency for life.

Just the way we insist that you need to build some foundational skills before you can drive a car and get a driver's license, there's gonna be this societal imperative that we all have an agreed-upon curriculum of what it means in this new age: what is the digital license? And with that, I think we will also reaffirm AI as a tool that, for youth, can foster greater curiosity, build agency, and build empowerment, but we'll need some of the rules of the road actually explained.

[01:04:16] Alex Sarlin: Right. It's been interesting this last year with AI literacy. You've seen a lot of different nonprofits, a lot of different tech companies of various types start to really jump into that AI literacy space. But it's still not clear. If you talk to any three people, as you say, nobody agrees: this is what AI literacy means, this is what it means.

This is what it means. Depending on what class you're in, who you're talking to, it's different. I think this year it might converge. It's gonna be really interesting. And what is a trend that you see coming in the next year? We talk to educators on this podcast, we talk to a lot of EdTech companies, founders, operators, policymakers.

What do you think is coming in the next year that people should keep an eye on that might sort of undergird some of the changes that happen? 

[01:04:55] Katie Kurtz: Well, I think the big one, from where I am sitting, is that we're going to derail important educational access if well-intentioned countries that have a goal of keeping young people safe online pass legislation that actually does the opposite. And so the recent Australia Social Media Minimum Age Act, I think, serves a little bit as a cautionary tale. You know, it unwinds important safeguards and protections and controls, such as personalized accounts, which are just the critical tools for driving learning outcomes on platforms.

You know, especially a platform like YouTube. And so restricting online access needs to be thought through with real expert guidance involved. I think we wanna protect young people in the digital world, not from the digital world, and that's gonna require some expert guidance. You know, failing to do so, I think, results in the unintended consequence of diminishing, rather than actually safeguarding, young people's safety online.

And so I think the one thing that we collectively could do is just make sure that we're articulating the profound value of a personalized online experience to facilitate deeper learning outcomes, to inspire curiosity, and then, you know, most importantly, how we can marshal all of that to build confidence in young people.

[01:06:17] Alex Sarlin: Very interesting. So I'm hearing you say that the pushback against technology, against screens, against social media may have sort of unintended consequences, throwing out the baby with the bathwater. And one particular area of that is personalization: by removing personal accounts, you also remove any aspect of personalization that may have positive effects in a learning environment.

Absolutely. Really interesting. Well, I appreciate your perspective. And, you know, we are at this very strange moment where I think everybody is simultaneously excited about all the potential that AI is bringing into technology, excited about all the changes, and also nervous, you know, truly trying to figure out how the technical world is changing and how technology is changing society at large.

And you're seeing this sort of delayed pushback from the social media era, so these waves are sort of crashing together at this moment. And I think next year we're just gonna see even more explosive crashing. But hopefully there'll be nuance, especially in regards to EdTech, introduced into the discussion so we can actually get to a place where we all feel good about what's happening, especially when it comes to EdTech, because it can be polarizing. Right? I think a lot of teachers or districts can be all tech, yes tech, no tech, you know, no social media, no phones. The reality is that there are roles for both sides and the truth is somewhere in between.

[01:07:36] Katie Kurtz: Yeah. And that we all have the shared goal of absolutely keeping kids safe online, and really thinking about the power of the technology to expand access and engagement for so many young people who are inherently curious, and all the positive things that can do if we can channel it in the right ways.

[01:07:58] Alex Sarlin: Katie Kurtz is the managing director and Global Head of Youth and Learning at YouTube, where she is responsible for developing the ecosystem to engage, inspire and educate kids, teens and adults. Thank you so much for your time. We really appreciate your predictions. Thanks for being here with us on EdTech Insiders.

For our next prediction, we're talking to Juliette Reid. She's the director of market research at Reading Horizons, where she tracks literacy policy and trends across the US. She analyzes how legislation, funding, and policy influence district purchasing and literacy adoption.

With nearly a decade in education, she has expertise in both balanced literacy and the science of reading. She holds a master's in educational leadership from the University of Illinois and a bachelor's from UCLA, and she's based in Littleton, Colorado. Juliette Reid, welcome to EdTech Insiders. 

[01:08:53] Juliette Reid: Thank you for having me. 

[01:08:54] Alex Sarlin: So when you look back at 2025, what was the most significant shift in the EdTech landscape from your perspective?

[01:09:02] Juliette Reid: Specifically looking at AI, I think one of the biggest shifts I've seen from my data sources is a real increase in the daily utility of AI tools for educators. I think even into the fall of '24, there was still some skepticism and just a lot of experimentation by educators. And I think over this year we've really seen educators embrace certain tools and embed them into their daily use.

So specifically, what I've been talking to our customers about, and things that have really resonated with them, are lesson planning tools, which I think are a huge one for them, and generative AI tools that help them goal-set and plan for meetings. Those types of tools have just really become ingrained in the daily workflow.

And the way I think about this is that they're solving a real market problem that these educators have, which is no time. So even in my work as an educator, we had about a 40, 45 minute planning block where we were expected to plan for curriculum, grade, contact parents, plan for meetings, and do all sorts of things.

And there just wasn't enough time for all that. So I think where AI has really taken root this year with educators is in helping them solve that problem. And I expect that we'll continue to see that, and see other problems emerge where educators find value in these new technologies.

And then the other thing that we've really seen a lot this year, from the policy lens, is across-the-board recommendations and guidance for AI use. This is really a 2025 thing that we've seen happen all the way from the federal level down to the district level. So those are the two big things I've seen this year.

[01:10:36] Alex Sarlin: So looking forward, what is a bold prediction you have for how this is all gonna play out in 2026? 

[01:10:42] Juliette Reid: Yeah, so as someone that works in the literacy space at Reading Horizons, we are keeping a really close eye right now on the upper grades and the literacy crisis those students are facing. The most recent NAEP scores that came out this fall are really quite dismal.

Our eighth graders and 12th graders are really struggling in their literacy achievement, with about 30% of them scoring at the below basic level. And these are record lows for the entire time the NAEP has been in existence. So we've got a real crisis in that, and I think there's a huge opportunity here for AI as well.

So another problem that educators really face on their day to day is differentiating, or meeting the wide variety of needs of all of the students in their class. And this is especially true for intervention teachers at those secondary grades. They may have students coming in with abilities ranging all the way from kindergarten level up to grade level or even above.

And they're expected to meet all of them where they're at at the same time and help accelerate their learning. It's a very hard, if not impossible, task. But I do think that with AI we could see some really interesting things come about and a real trend towards personalization. So I think that that's where we'll head in the next year, in unique ways though.

Right. So when I'm talking about personalization, I think our default is probably to have students get on a device and have them go through some sort of personalized pathway. But I think there's a lot of new ways AI might help empower teachers to meet the needs of their students and to differentiate.

Whether that's helping them identify and act on possible misconceptions quickly, which I think could be a huge one, or real-time scaffolds for students, so in the moment, right, a student responds one way and the AI can help inform the teacher of where they might need to pivot to meet that student where they're at. I think that could be a really neat use case.

And then just really good assessment data targeting, pinpointing exactly what students need so that either the technology or the teacher can help accelerate them forward. So I think those are the areas we're gonna see, whether that's teacher-driven or student-driven, with the technology helping to fill the gaps that our older students are facing right now.

[01:12:53] Alex Sarlin: That's a terrific prediction. So it's really a year of differentiation, especially regarding older students who are behind grade level in their reading, and all sorts of ways in which differentiation becomes enabled with AI. One other one that I have heard is the idea of AI being able to create readings that are tailored for older students, so that even if they're reading at a second grade level, you can give a fifth or sixth grader a reading on a topic that they care about, but written at a different grade level.

You can do all sorts of interesting differentiation and adaptation with AI for that as well. 

[01:13:26] Juliette Reid: I love that. And another huge piece of the older learner experience is motivation. It's a huge factor with those students: how do we get them motivated so that they can engage in that content? And like you said, that tailored personalization is one thing, but there are so many cool technologies coming out that are able to evaluate how engaged students are in a program and tailor it for them to keep them motivated in the learning process.

[01:13:53] Alex Sarlin: As you look forward to 2026, what is one trend that you feel like educators, ed tech companies, policymakers, should all prepare for that will undergird some of the changes that we're gonna see in 2026? 

[01:14:07] Juliette Reid: Yeah, so I'm really watching the tension right now between distraction-free learning policy and the incredible possible capabilities of AI.

So one of the other huge trends we saw this year on the policy front were these distraction-free policies, so cell phone bans or personal device bans. I think about 20 states implemented new policies; I think we're at about 35 now, and I know that here in Colorado we're also on the precipice of getting our own in place.

So I think that the tension between that and the possibilities of AI to help students reach achievement levels is really gonna come to a head. I think right now we're kind of talking about these things separately, but I think this year is the year that policymakers, district leaders, teachers, and all the companies that are involved in these processes are really gonna need to figure out how they balance these two conflicting forces.

There are ways to do it, I think, and there are some creative ways we're considering at our company, at Reading Horizons. I do think right now we're kind of operating with less screen time as the recommendation on one side, while the motivational technology, all these pieces, all of the personalization that AI can offer here, is this other piece.

So I'm expecting in 2026 for us to really need to come to terms with what the goals of distraction-free learning are, and what the role of technology in schools is, so that we can find a healthy balance to prepare our students to be literate and digitally literate members of society, but also be able to communicate with other humans and engage in those ways.

It's about empowering the teacher and strategically using the technology at times when it's critical for students to engage in those ways. 

[01:15:55] Alex Sarlin: Well, thank you so much. This has been really interesting. I love hearing your predictions about the future. This is Juliette Reid. She's the director of market research at Reading Horizons, where she tracks literacy policy and trends across the US.

Thanks for being here with us on EdTech Insiders. Thanks again, Alex. For our 2026 predictions episode, we are here with the great Shantanu Sinha. He is the VP of Google for Education. He oversees products like Google Classroom, used by over 150 million students and educators globally. Previously, he was president and COO of Khan Academy, helping make personalized learning accessible. With a background in strategy, operations, and computer science from MIT, Shantanu is passionate about transforming education.

Shantanu Sinha, welcome back to EdTech Insiders. 

[01:16:46] Shantanu Sinha: Yeah, it's great to be here. 

[01:16:48] Alex Sarlin: It's always great to see you. So let's start by looking at the last year, the year that was 2025. We're here right at the end of it. What, from your perspective, was the most significant shift or learning we had about AI and education?

[01:17:03] Shantanu Sinha: I think this was a remarkable year in education, and I think the one thing that we really observed is that this is the year AI went from promise to practice. In prior years we talked a lot about the potential for what AI could do, how it could save educators time, how it could help personalize learning.

But I think we also always recognized that the technology had its flaws. It's a very jagged frontier: it can be very capable of certain things but, by the same token, not capable of other things. And in education it's really important that the technology works well, and that it works well across a wide variety of use cases.

We would never accept a textbook that had errors on 10% of the pages, and we shouldn't accept anything different of AI. This year, I think what we've seen is the quality of the models dramatically improve. If you look at Gemini 3.0 Pro and how it's performing on benchmarks, with deep reasoning, its ability to do not just elementary school math but really high school and graduate-level, deep math and science subjects at really, really high quality.

And we've also been able to bring that to educators in a much bigger way with Gemini for Education, which we make available to K-12 schools and higher ed institutions for free around the world. And with that combination of access and quality, I think what we're seeing is people really engaging with and embracing the technology in a very different way.

And it's no longer just what I could do with it, it's what I am doing with it, and it's really exciting to see that progress this year.

[01:18:42] Alex Sarlin: Let's look at the next year, looking at 2026. What is a bold prediction you have for how AI will continue to change teaching and learning? 

[01:18:52] Shantanu Sinha: One of the things that I'm particularly excited about is the multimodal capabilities of these models, really going beyond this chat interface of text in and text out. I think there's so much more that is gonna happen, and that we're already seeing.

In particular, one of the capabilities we've seen a lot from the models is this ability to have what we call generative UIs, where the model can actually understand your context, understand your intent, and maybe code up a simulation, or create a diagram, or really make much more visual, interactive experiences. And it's really opening up a whole new world of interaction and educational content.

And it's really opening up a whole new world of interaction and educational content. With our image models like Nano Banana Pro, you're able to create amazing infographics with the latest Gemini Frontier models, you're able to code up a coding simulation and interact with the content in different ways.

And I think in education, it's always been so important to have these rich, dynamic, interactive experiences, and I'm really excited to see what educators are gonna be able to do with these capabilities that are really emerging over the course of this next year. 

[01:20:05] Alex Sarlin: Things have been moving so quickly in the multimodal space, and I love that you're focusing on the interactive element of this.

'cause we've been talking about multimodal a lot this year, but in many ways we were thinking about video output or audio output, the way that NotebookLM does podcasts and things like that. But what you're talking about here is even beyond audio and video: it's interactives, it's simulations, it's UI. You know, it's learning by doing, which we all know from the literature is so much more effective even than passive rich media like video and audio.

So it's incredibly exciting as a pedagogy person to see us going into multimodal with this interactive capability. 

[01:20:43] Shantanu Sinha: I think everybody who works in education knows the importance of active learning, of interactive experience, of having these dynamic experiences. And as you mentioned, video generation, audio generation.

That's one aspect of it, and we've started to see what that's capable of doing. And that's amazing in itself. You can now translate content into Spanish or different languages around the world and make videos and podcasts. It's remarkable. But I think that's just really the tip of the spear of what's possible.

And now, when these models are able to really understand that full context and drive those interactive experiences and interactive simulations, it really opens up all kinds of richer experiences for educators. And one thing I know about educators: if you give them the tools, they are the most creative people on the planet.

They will find amazing ways to really put this to use, to create richer experiences in the classroom.

[01:21:30] Alex Sarlin: So when we look forward to next year, what is a trend or direction that you think educators, EdTech companies, or policymakers should prepare for in 2026? Is there something that's going to start to change that sort of undergirds or underlies other changes that are gonna happen next year?

[01:21:48] Shantanu Sinha: Yeah, I think the biggest piece of advice that I would give anybody who's working in this space is to really use and play around with and understand the frontier of what's happening. I think a lot of people might have tried some technology or some use cases six months or twelve months or two years ago, and they have in their mind what that means.

And it's actually changing, and it's changing in a really rapid way. It's remarkable to see this kind of exponential capability curve. But if you're working in this space, what that means for you is you really have to understand the state of the art and you really have to understand the implications of it.

So the number one thing that I would say is use the latest models. Understand them. Try leveraging them for different use cases. See what's possible. See what they're great at. See what they're not. It's definitely not a panacea that's gonna do everything for you, but it is a technology that's moving fast.

And if it's not there this year, it might be there next year or the year after, and you have to be thinking about how this is gonna be evolving in that context. I still believe the pillars that we've often talked about, really saving educator time and driving more personalized learning, are still the big things that this technology is gonna really drive.

But how it does it is gonna evolve. And if there's anything that's important, it's really making sure you understand the latest and greatest of the capabilities, and that you're really thinking about what that means in your world.

[01:23:12] Alex Sarlin: That is fantastic advice. You know, it makes me think, as I look back to the last year or two now, people just don't always realize how quickly this stuff is moving and how competitive it is.

So I think that's phenomenal advice. Thank you so much for being here on our predictions episode of EdTech Insiders. Shantanu Sinha is the VP of Google for Education, overseeing products like Google Classroom used by over 150 million students and educators globally. Thanks so much for being here with us on EdTech Insiders.

Thanks for having me. For our next predictions episode, we're here with Amanda Bickerstaff. She's the co-founder and CEO of AI for Education. With over 20 years of experience in education as a former teacher and EdTech executive, she's a frequent consultant, speaker, and writer, and she leads workshops on AI in education all over the country and the world, helping schools and teachers leverage AI ethically and equitably to maximize its potential.

Amanda Bickerstaff, welcome to EdTech Insiders. 

[01:24:11] Amanda Bickerstaff: Hi. Glad to be here. 

[01:24:13] Alex Sarlin: I'm so happy to talk to you. You have such a unique perspective on AI and education. You are in so many different school districts, you're at so many different conferences, you do a lot of podcasting. You're everywhere when it comes to AI and education.

You publish this amazing map of all of the states and where they are with their policies that I rely on very regularly. So let's start with 2025. Tell us what AI in education looked like in 2025. What were the biggest stories and shifts that happened throughout the year? 

[01:24:43] Amanda Bickerstaff: Probably the biggest story of 2025.

There are a couple. I think that from our perspective, we have started to see a shift into some stronger awareness of the need for gen AI literacy, although I will say it's definitely been something that I have not nailed in my own predictions. Like, you know, I started AI for Education in 2023.

I was like, oh, it'll be like six months until people recognize it, and then it'll be a year, and now it's two years, and now I have no idea. So I'll say there was less movement, I think, on gen AI literacy as a priority than we would've liked to see. But I do think that we are starting to see the next steps with some organizations that have really gone all in on training staff and students and thinking about adoption meaningfully.

I will say this is also the year of big tech coming straight for classrooms and straight for teachers. First, over the summer, the large language model companies, the foundation model companies, came right for students directly, bypassing the teacher completely. But now we're starting to see things like Google tools being developed and released, other tools for teachers, and the ChatGPT for Teachers offering that was released in the late part of the year.

So unfortunately, I think those two things are at odds with each other, because we haven't quite seen the level of meaningful gen AI literacy foundations yet to be able to even make great decisions about what those things mean. In fact, I don't want to say my prediction too early, but I think we're gonna start to see more banning actions next year.

But I'll stop there. 

[01:26:20] Alex Sarlin: Interesting. So as somebody who pays a lot of attention to the generative AI literacy landscape as well as the policy decisions being made on the ground, I definitely want to hear your bold prediction for 2026. You may have given a little bit of a teaser of it. 

[01:26:34] Amanda Bickerstaff: Well, I think that there are two bold predictions that I have.

The first is, and they're gonna also sound at odds: I think that we're gonna see districts going all in on gen AI EdTech. I think we already started to see that this year, but I don't think it was at the level you would expect based on estimations of how transformative it would be.

And then the second thing is, I think what we're gonna see is a rejection of some of the big tech out there. I think there are going to be bans that happen, and I already know it's happening in some districts, where something like ChatGPT for Teachers sounds like a great idea, but it'll be something that feels a step too far, or there's not a DPA signed, or whatever it may be.

And so I think those two things kind of go hand in hand, right? Because it's so interesting: the gen AI EdTech is built on the foundation models, right? But the avenue by which they get into schools, I think, will be more focused on the actual gen AI EdTech. For us, the prediction is that, unfortunately, Alex, you and I have talked about this before, the number of times that we have people and organizations come to us saying we went with the tools first, and then we realized no one knew what they were or how to use them, the adoption wasn't there, they were using them in ways that could potentially be harmful.

And we have to take a full step back after paying a significant amount of money. I think what we're gonna see is the same thing: tools as a strategy did not work when EdTech started, and it will not work on its own now that gen AI is becoming more ubiquitous in schools.

[01:28:04] Alex Sarlin: Interesting. One thing that keeps striking me about this moment, and what you're saying feels like it's right on this thought, is that when technology moves as quickly as it has been moving for a while, and especially as quickly as it's moving right now, there really are no experts, right?

I mean this whole technology has just arrived for all of us. We all have to get our heads around it because just assuming that it's going to fit its way into our system or that, you know, the one person at the school who really wants to embrace it is gonna be the best ambassador is not the best way to look at it.

[01:28:33] Amanda Bickerstaff: Yeah, totally. But I actually think we need to take even a step back, 'cause this is, you know, where in the EdTech group, how many organizations have we talked about that are building gen AI EdTech where their salespeople don't have enough information to talk about it, or their content and curriculum people, or their trainers?

Like, I think sometimes we think it's students and teachers who are slower, but it's the whole field. And I think that that is where, for us, and I know we're so boring, everybody, you're like, Amanda, stop saying gen AI literacy. But you know what, guys? I'm not changing, and I'll change as soon as I feel like there's a recognition of just how core this is.

Genuinely, there is this moment where, in the next 18 months, everyone needs to start understanding the capabilities, the limitations, what these tools are. The way to think about AI literacy is: there are the people who think that generative AI works like other technology, so there's the old schema, and then you have the other side for whom it feels like magic.

Like, these things are accurate, I can teach them, they learn. And then there's the middle, where it's like, you know what? These tools are predictive engines that are fascinating and more complex than we can understand, but we also know that training data matters, that they make mistakes, and that bias exists.

And that my direction matters more than anything around the technology itself, right? And my expertise. We have this spectrum, and if we can't get towards that middle place of understanding, then we're gonna have people over-trusting the tools in both of those spaces, because they're either creating a new schema that doesn't make sense, or holding onto an old schema that definitely doesn't work anymore. And I think that's a huge risk, because it sounds right. It's almost like ChatGPT: it sounds very confident. But at the same time, if you don't have the technical understanding, if you don't have gen AI literacy, then you just really cannot truly understand your part in the process.

[01:30:28] Alex Sarlin: Yeah. If there's one sort of underlying change coming in 2026 that you feel like the whole field should really know about, what do you think that would be? What's coming in the next year that people really should keep an eye out for? 

[01:30:40] Amanda Bickerstaff: I mean, first is regulation, or anti-regulation, trends. I mean AI regulation at the state level: the EO that was just signed says no more of it, and there are also rumblings that it would mean the state guidance for AI in education is removed as well. So I think that's number one: there's gonna be a conflict between speed and safety that I think is incredibly damaging. The second is, I think that this was the year of talking about agents.

I think next year will be the year of actually seeing agents have more value. I think next year will be the year that teachers and students will be like, I don't have to write or create a PowerPoint ever again. And the last thing is, you know, we're going to start to see even greater pressure around monetization of tools. I think there are going to be significant questions about trust, and so the idea of monetization also comes into this moment where it starts to get into this very, very difficult and potentially tragic gray area of bots providing these very important types of advice that people are looking for.

Man, I don't mean to be too gloomy, but we're releasing a report called Beyond the AI Inflection Point in January, and so much of it is about how the choices that we make in the next three years will be the choices we look back on to understand whether we were actually involved in this moment in a positive way, in no way, or in a negative way.

[01:32:12] Alex Sarlin: Really powerful predictions. Amanda Bickerstaff is the co-founder and CEO of AI for Education. She has over 20 years of experience in education as a former teacher and EdTech executive, and AI for Education does all sorts of things to help schools and teachers leverage AI ethically and equitably.

Thank you so much for being here with us on EdTech Insiders Predictions. 

[01:32:32] Amanda Bickerstaff: Thanks, Alex. 

[01:32:34] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.