Edtech Insiders

Future Fluent: Betsy Corcoran & Dr. Jeremy Roschelle on Redefining Literacy in the Age of AI

• Alex Sarlin • Season 10


Betsy Corcoran and Dr. Jeremy Roschelle co-host the Future Fluent podcast, where they explore how learning is changing in the age of AI. Betsy is the cofounder of EdSurge and now leads Lede Labs, advising education leaders. Jeremy is a leading learning scientist and Executive Director of Learning Science Research at Digital Promise. Together, they bring decades of journalism and research experience to their shared mission: redefining literacy and learning in a world transformed by AI.

💡 5 Things You’ll Learn in This Episode:

  • What “future fluency” means in the age of AI
  • Why social connection is central to learning with AI
  • How AI can empower generalist educators and students
  • The importance of co-designing tools with teachers
  • What deep knowledge—not just critical thinking—requires today

✨ Episode Highlights:

[00:04:00] Launching Future Fluent: a podcast to ask better questions about AI and learning
[00:07:00] Literacy redefined: engaging with the world through many channels
[00:09:00] Mike Yates' AI poetry experiment builds confidence in creativity
[00:13:00] Using AI to support comprehension for young readers
[00:19:00] Post-COVID lesson: data proves teachers’ impact
[00:25:00] AI can’t replace deep knowledge or human insight
[00:30:00] Playlab AI helps generalist teachers act like specialists
[00:33:00] Colorado students use AI to create a voter guide chatbot
[00:36:00] Re-centering AI on human flourishing, not machine potential
[00:42:00] Real co-design starts with real educator problems

This season of Edtech Insiders is brought to you by Starbridge. Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

Jeremy Roschelle: [00:00:00] So I think we have to ask, not what can I do, but how deeply connected is my thinking. And it's a hard shift to get into, but I think the people who will be most valuable will be valuable 'cause of the connections they can make between different ideas, and that are not connections that everyone else makes between those ideas. And it might just be because of the experience they come from, or the role they're in in society, or the background, or the problem they're trying to solve. But really, people have to own that connection making.

Betsy Corcoran: Then let's move forward too. Think about the kids right now who are looking for their first jobs and trying to get into the workforce.

What happens when you get into the workforce? You collaborate with people. You don't have a high stakes moment, well, not too frequently anyway, where it's all or nothing. It's done. You've got 30 minutes and then put down your pencil, right? No, we iterate. We go into a group, [00:01:00] we try to understand the group, we ask those social questions.

Do you want me here? How am I feeling about this? How do we talk to each other? And then we kind of try to build something together.

Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find it all here at

Ben Kornell: EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and follow our event calendar.

And to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben. Hope you enjoy today's pod.

Alex Sarlin: Enjoy this really fun conversation with Dr. Jeremy [00:02:00] Roschelle and Betsy Corcoran, the co-hosts of the new edtech podcast, Future Fluent. I'll give them their intros, but they asked me to give slightly shortened versions, because they both have very storied histories in edtech, and we've interviewed both of them on the podcast before.

So if you want a full intro, you can find them there. Jeremy Roschelle is the Executive Director of Learning Science Research at Digital Promise. He's a fellow of the International Society of the Learning Sciences, and he's basically a legendary education researcher. He has over 25,000 citations, 10 patents, and more than a hundred publications, and is just one of the deepest thinkers in education research we have in the US.

Betsy Corcoran is a veteran edtech and education reporter. She was the co-founder and CEO of EdSurge, which, as many know, is one of the very biggest direct inspirations on what we do at EdTech Insiders. She's been a nationally recognized journalist with Forbes Media, the Washington Post, and Scientific American.

She also advises C-level leaders [00:03:00] at companies, nonprofits, and foundations through Lede Labs, among other amazing things. These are both great edtech insiders, and definitely check out the Future Fluent podcast. They're just a few episodes in, so it's a great time to get started with them. They have amazing guests.

Join this conversation with Dr. Jeremy Roschelle and Betsy Corcoran. Betsy Corcoran, Jeremy Roschelle, welcome back to EdTech Insiders.

Betsy Corcoran: Thank you, Alex. Thank you so much. I am thrilled to be here, and I'm actually really thrilled to be here with Jeremy. That's kind of fun.

Jeremy Roschelle: And I'm thrilled to be here with both of you, Alex and Betsy.

Alex Sarlin: Oh, that's nice. I love the love fest here among podcasters. You two are both literally legends in education, education research, and education technology. I said "back" because you've both been separately on the Insiders podcast before, but now you're here in a really exciting new context. You have launched a podcast together.

Before we get into anything, tell us about the podcast, Jeremy, lemme kick it to you first. [00:04:00] 

Jeremy Roschelle: Sure. Well, you know, Betsy and I got talking to each other maybe a year ago, a little bit more. We were just having the greatest conversations, all about the future of learning in this age of AI and things we really care about.

Like, what does it mean to be a literate person, and how is that gonna change? And what we thought, Alex, is: we wanna learn about this. How could we do that? Well, what if we interviewed people? I'll kick it over to you, Betsy.

Betsy Corcoran: Yeah. I think the thing that's been really fun is that Jeremy obviously comes with this very deep research background. He is a learning scientist. And I come with probably just a lot of knotty questions, you know? And yet there are moments where our backgrounds have overlapped, and there are things that we both experienced.

Jeremy Roschelle: You know what I say, Betsy? I say it's not knotty questions. I say she's a real journalist.

Betsy Corcoran: Exactly. A journalist, an endangered species these days, apparently. But yeah, I mean, what's fascinating is we've had some sort of funny overlaps. Probably one of the [00:05:00] people that was hugely influential to me, and was also to Jeremy, is John Seely Brown. John Seely Brown ran Xerox PARC for many years, but he also really thought a lot about learning, and created something called the Institute for Research on Learning many, many years ago. Jeremy worked with him. I interviewed him millions of times, I think. And so it helps a little bit that we both share a deep curiosity about what it means to learn, and what stays the same and what changes as we move forward in the world.

Jeremy Roschelle: John Seely Brown, too. If I think about him, I always think of a little doodle of a pair of eyeglasses, 'cause his motif was trying to bring a perspective to things, a long view. And that stuck with me from the time he was a mentor to me: how do we bring a new pair of glasses to a difficult, challenging, futuristic topic?

Alex Sarlin: Yeah. I think your perspectives really complement each other in that way, 'cause the journalistic [00:06:00] perspective, Betsy, you know, you were running EdSurge for many years, is about what's happening now, what's next, how do you jump on the trend, what's everybody thinking and talking about, and maybe you can even get ahead of that.

And then of course, research, Jeremy, over a hundred publications, is about finding the truth, right? Ignoring all the hype and the buzz and the buzzwords and saying, let's actually dig down and figure out what's really true. And I think those are really different perspectives, but they can be complementary. Sometimes you want both.

So when you talk about future fluent, let me kick this to you, Betsy, right? When you talk about future fluent, you're talking about: how do you see what's gonna stay the same? How do you see what's gonna change? How do you be literate, quote unquote, in a world where things are changing so fast, especially now that we're in this age of AI that's just dawning?

What do you think of as this concept of being future literate? What does that mean to you?

Betsy Corcoran: Thank you. That's exactly right, and it was how we kind of got to that name for the podcast. So literacy: when you say literacy to someone, they often say, oh, well, I can read and [00:07:00] write. But actually, it turns out the idea of literacy is really something different.

It's how do you engage with the world? How do you engage with the world through some channel? And maybe that channel is the printed word. Maybe that channel is mathematics. Maybe that channel is art, or computational thinking. There are lots of different channels through which we engage with the world. So how do you engage with the world, with society, with everything, in all these different ways? That's really what we have been exploring with all of the different guests that we've been talking to. We're still a baby podcast, we're only a few episodes in, but we've had some phenomenal guests who I feel I'm learning a lot from. But I think it's really that question of what it means to engage as the world is continuing to rush forward.

Alex Sarlin: Before we move on, tell us about some of your guests. You've talked to Mike Yates, you've talked to Mike Sharples, you've talked to Dan Meyer, who's, like, always calling me out, [00:08:00] trying to debate me, because I'm so optimistic and he's such a realist about this stuff. What have the conversations been like? What have you learned so far?

Jeremy Roschelle: I mean, I think Mike Sharples was fabulous, 'cause we took this deep dive into the future of writing, and it's something he's thought about as a researcher and as a designer for 20 years. And you know, we're all writing with a partner now, but he could offer such a perspective on that. So we're still gonna compose ideas to share with others in some medium, I'm gonna say.

And what is the human part? We really got into talking about what's the human part of that, and how are we gonna work with an assistant, and what's the human gonna hold onto? Those kinds of questions get me going.

Betsy Corcoran: You mentioned Mike Yates. So Mike Yates is a guy who is, he's a Texan at heart, right? And he has been teaching with Teach For America for a long time. He's actually a lifelong educator. His mom was a teacher, and he talks about how he hated school and didn't wanna be a teacher, but this is where he is, and it's what he loves. So from the [00:09:00] outset, you'd think, oh, Mike Yates, he's gonna be going away from the tech thing.

But instead he's experimenting, and he's trying things out. So he talked about how, without telling all those friends who are poets, he snuck AI into a poetry workshop that he ran. And we said, well, what did you learn from that? And he said, well, you know, what I learned was that it can give people confidence for something that they're not confident in.

I don't consider myself a poet. If you said to me, okay, Betsy, write a poem, I'm gonna be like, yikes. And so what he was finding was the AI gave people enough confidence that they could get started, and then they could build on that. So that was a really interesting perspective.

Jeremy Roschelle: That was sort of similar to the writing one, too. And one more I'm gonna mention that I thought took us in a new direction was Isabelle Hau. We talked with her about her book, and she's really all about human relationships and why they're so important to learning. And the discourse about [00:10:00] AI in education can often feature individualization.

And you get this tunnel-vision view of a kid alone with a computer, and that's all there is. And she was calling that kind of thing junk ed tech, in analogy to junk food, something that just cuts a kid off from the world. And she was trying to open our minds to: how could AI be enabling of the most important thing, both in learning and in life, relationships?

And boy, that's a deep topic to go into. So really, thanks to Isabelle.

Alex Sarlin: I recently talked to the chief product officer at Turnitin, and it was really interesting because she comes from a very different perspective. She's thinking about assessment, integrity, and all that, but she brought up something very similar to what you just brought up from Mike Sharples, which is: we've never had to think about writing as a sort of joint endeavor before.

We've never had to think about creating an essay or a research paper as something where at any moment you can outsource part of it, or bounce ideas off of an intelligence. And we just don't know as a society yet [00:11:00] which parts of the writing process, maybe Mike Sharples does know, but I certainly don't know and she didn't know either, really have to be done in one's own head to actually improve, and which parts could be done in a more pseudo-social way.

Jeremy, I want to kick this to you, 'cause you've done research on AI's impact on writing and learning to read and literacy in all sorts of ways. Tell us how you think about the future of avoiding that concept of outsourcing the most important parts of thinking to AI, because it comes up all the time in my conversations.

Jeremy Roschelle: Yeah, absolutely. And you know, it's fascinating to bring this down to students in first grade, students in third grade, which is our current reading project: we're working with generative AI to improve reading for the students who need it the most. And according to our National Assessment of Educational Progress, that's gonna be students who are English language learners, who are learning English at the same time they're learning to read in English.

Okay, so how does the question you asked fit in that [00:12:00] very context? Well, yes, phonics and decoding are part of reading, and software like Amira, that's what we're using, gives kids great help. But also, paying attention to the meaning of the text is super important. You build on what students already know, and with English language learners, there are maybe some different things they're building on. But you're also building knowledge by reading, and that's where some of the payoff comes.

So we're really thinking about kids in grade one and grade three, and breaking out of the box of reading comprehension as: you read a text and you get three questions. It's more reading comprehension you do with a technology partner, and it's more about pulling in and making connections to the knowledge you came with, and creating knowledge you're gonna bring back into your future. And that already opens the whole thing right there. We're just at grades one to three, and I kind of have the feeling that if we can do right by students in grades one to three, that's a good thing for the rest of their lives, that they're [00:13:00] experiencing a comprehension partner. That is gonna be an experience they're gonna have.

Betsy Corcoran: I don't think that negates the role of the teacher, though. No, because, you know, you mentioned Dan Meyer. Dan Meyer, who is a fabulous math teacher, has worked on Desmos and now works with Amplify, and when he talked to us, he sort of differentiated between: we can think of learning as open-head, drop-in-knowledge, right? That's a kind of limited way to think of it. Or we can think of it as having a social engagement element. And he made the point that the first question that a kid has when they walk into a classroom is, does the teacher like me? Do they even want me here? Do they appreciate who I am and where I've come from and all the things I've done? Do I belong?

None of that is gonna change. Those are still the fundamental questions that we all have when we engage with a new group, when we engage with learning of some sort. There are then things we have to [00:14:00] learn, and some of them are the ones that Jeremy just talked about, right? How do you decode something? How do you make sense of the words? But that first element is so important, because if you fail that, you don't get a chance at all the other stuff.

Alex Sarlin: Absolutely. And I love this theme of AI as a social form of learning. I mean, one of the things that gets me most excited about AI as an educational technology is that it has the potential, I think, to really take us out of that junk ed tech model of back-and-forth between a computer and a person.

And Jeremy, you mentioned reading a passage and taking the three reading comprehension questions. To me, that feels like almost junk education. I probably shouldn't say it that way, but it's the same thing, right? It's a kid and a passage, back and forth. There's no interaction. You don't learn anything from the questions. You can't ask any clarifying questions. It's not really how humanity works. You're interacting with something dead, right? Something that doesn't interact back with you.

I have felt it [00:15:00] myself. I have said this on the podcast a few times: I walk around now just talking to Gemini and Claude and ChatGPT, and I use it to reflect. I'm like, what am I thinking about? It's this and this and this. How do I put these together? And it says, hey, it sounds like maybe you're thinking this, is that right? And I'm like, no, it's not that, it's more of this. This is reflection. This is what we've always meant when we say you should reflect after you learn, and make sense of it, and connect it to your actual life.

And we can now do that in such a more holistic, interactive way than we've ever done before. It's not a reflection question that you throw at somebody and then force them to answer in an LMS, which is literally what we've done.

Betsy Corcoran: Yeah, and I think the key element, Alex, just to jump in, was that you used the word potential, right? And that's the fork in the road: it has the potential to be a really interesting thought partner and do all these things, in the same way that other technologies have had the potential to. I think a lot about the analogy with transportation. A car has the [00:16:00] potential to do great stuff.

It can take me from New York to California. It can also make me the laziest person on the planet if I wind up using it just to go to the store down the block. And so there is this very big fork, which is: it has enormous potential, and it will depend on how we use it. And that's why these conversations around what it means to be fluent in this new future are, to me, incredibly important. Really fun, but also very important to think about.

Alex Sarlin: a hundred percent.

So what does it look like? What have you learned so far about what it takes to be fluent? What do you think are the most essential from the conversations so far? You've talked to a number of different experts in different fields, and of course you both have tons of experience in education in ed tech.

I'm sure you have lots of priors on all of this stuff, but what is your working hypothesis about what students will need to continue to do themselves and what will be new skills that they'll need to learn how to do in [00:17:00] relationship to intelligent computing? 

Jeremy Roschelle: Yeah. I'm just gonna take us back to the social, where you had us a moment ago, Betsy. And I'm thinking a little bit of a colleague, Jamie, who runs TeachFX, and how he says everything comes back to the classroom discussion.

There are many other good parts of learning, but if you show me a classroom with great discussions, I know that great learning is happening everywhere else in that classroom. And Alex, you were using AI to help you reflect and prepare for something, but not as an end in itself. You weren't just trying to cut to the chase and get to the result. It wasn't all about efficiency and productivity. It was about meaning making.

And so, this is gonna be my thought for you: can we get the connection right between what you talked about, the generative beginnings of meaning making, and then taking that into the social context, where you're using that meaning with other students, maybe doing some project-based learning with a teacher, maybe having a great discussion about art or [00:18:00] history or something?

That's where we find out. It's when we get to that human discussion of something important that we find out if we're using AI the right way or not.

Betsy Corcoran: And I think there's never been a moment where it's more important to really talk about the social elements, because we've just come through, and we still, I think, all bear the scars of COVID and of isolation, right?

And one of my pet peeves, and I wasn't writing a lot of news stories during COVID, I was having my own little COVID experience, is that we talked a lot about, oh, math scores went down and literacy scores went down. We just did a perfect A/B test of the importance of educators, the importance of the classroom.

Jeremy Roschelle: Absolutely.

Betsy Corcoran: The importance of social learning. And if I had to write a headline coming out of that, instead of saying, oh my God, isn't it awful, our test scores have gone down, we should have said: we have documented [00:19:00] evidence, we have data, that shows you how important our teachers are, and the social environment for learning.

Then let's move forward, too. Think about the kids right now who are looking for their first jobs and trying to get into the workforce. What happens when you get into the workforce? You collaborate with people. You don't have a high-stakes moment, well, not too frequently anyway, where it's all or nothing, it's done, you've got 30 minutes and then put down your pencil, right? No, we iterate. We go into a group, we try to understand the group, we ask those social questions: Do you want me here? How am I feeling about this? How do we talk to each other? And then we kind of try to build something together.

So, this whole question of how we build together. You had asked me, what have we learned about ed tech development in this world? One of the theme songs at EdSurge was always: we need to be building together. We need to be bringing teachers together with [00:20:00] entrepreneurs and building together.

I said that for the full 10 years that we were running EdSurge, and I should be saying it even louder now, because there has been no time when it is more important to be collaboratively building things, right? And collaboratively building so that we are solving the real problems that people are dealing with, not just the problems we think we have, but the problems that we really have. So I'm just underscoring everything Jeremy said about social learning, about collaborative learning, about working together.

Jeremy Roschelle: I wanna maybe be a tiny bit controversial here, 'cause one thing I hear a lot from people thinking about AI is: oh, the students are gonna be experts in critical thinking. That's what they're gonna do. And here's where, based on the learning sciences, I get suspicious about that idea. It turns out, to do any kind of critical thinking, you need to know a lot. You need to have some domain knowledge. And so I think we have to be very [00:21:00] thoughtful about how kids become experts in things they care about.

And how we use this to drive a really well-rounded expertise, where they're creative experts, they're critical experts, they're collaborative experts, but it's about something that's meaningful. Because critical thinking alone is actually shallow. It turns out not to take you very far, and it's like a party trick that the AI bots increasingly can do. They can give you superficial criticism. So I do worry that we'll rush away from deep knowledge of things that students and society care about.

Betsy Corcoran: I tried an experiment when I was in college. I was taking physics, and I thought to myself, well, I should be able to derive everything. So actually, I don't really need to know anything other than kind of basic concepts, and I should be able to derive everything, so I'm not really gonna, like, deal with this stuff. It was not a good strategy. Just saying, it would've been a good idea if I had just learned some of the concepts a little bit more deeply. [00:22:00] It was a tough semester.

So yeah, I agree, Jeremy. I think this question of how you get domain expertise is, to me, you're right, the essence of being a critical thinker.

Alex Sarlin: Yeah. I love that analogy you used a while ago, Betsy, about AI being like transportation, and it brings up a question, hearing you both talk about this. The last 20 years or so, we've had internet technology. We've had access to more information than anyone has ever had, and then it came into our pockets.

We all know how this has gone, and it's technically, again in theory, given us more access to exactly that kind of domain knowledge, in any domain, than anyone's ever had. It used to be people would travel to the library to try to find the three books on the subject they wanted to learn about. Now, even before AI, it was very easy to go deep in any subject, and kids increasingly are using things like YouTube to do that. It sort of matches the modality they're used to. In theory, again in [00:23:00] theory, we should potentially be in a sort of golden age of domain knowledge, a golden age of fact finding, of being able to understand many different perspectives on any topic, because they're all out there. They're all literally in our hands, and that includes for high school students.

And yet, I don't think that's how many people feel. I think people feel like the domain knowledge and facts, just knowing something of how the world works that you have to know to be able to critically think about it, is missing. There have been a lot of complaints about feeling like young people don't have that.

Why do you think that is? Let me throw it to you first, Jeremy. I always feel very torn about that. You'd think somebody very driven right now, and there are a lot of very driven students in different areas, can really do almost anything, and now with AI, exponentially more. Yet if you're not motivated, or you don't have passion, or you don't have something you really wanna dive into, it goes the other way, and you just get lost in this cultural muck that I think so many young people find themselves in. [00:24:00] What drives this weird dichotomy?

Jeremy Roschelle: I think what this technology will push us all to ask is: what's our image of being knowledgeable? What do we think being knowledgeable is? I remember I once took an airplane flight, and there was a guy, a painter, sitting next to me, and his hobby was physics.

And we had a chat. I was studying how people learn physics at the time, so we had a chat about physics, and it was fascinating to me, 'cause he had learned physics from the Encyclopedia Britannica, and he had just a really superficial skim. He knew two paragraphs about every topic in physics. And AI could lead to us being exactly like that, where we know two paragraphs about everything, and if we need a third and fourth, we know where to look.

The alternative view about knowledge is really about connections. Our brain is a connection machine, right? And so how strong is our ability to connect ideas, to connect with people around ideas? Yes, AI can feed that by helping us. As you were talking about, you were looking for a connection between two [00:25:00] ideas, and it was helping you. Right on. But the connection was occurring in your head.

So I think we have to ask, not what can I do on the surface, not what facts do I know, not can I give you two paragraphs about everything, but how deeply connected is my thinking. And it's a hard shift to get into, but I think the people who will be most valuable will be valuable 'cause of the connections they can make between different ideas, and those are not connections that everyone else makes between those ideas. It might just be because of the experience they come from, or the role they're in in society, or the background, or the problem they're trying to solve. But really, people have to own that connection making.

Betsy Corcoran: You're making me wanna connect with a whole bunch of other people that we should invite on the podcast. So Jeremy, after this call, we have to have another conversation, 'cause here's like five more people I really wanna connect with.

I wanna come back to two words that you used, too, Alex. You used fact finding, and you used the word [00:26:00] perspectives, right? Jeremy talked about connections, but let me take one moment on the fact finding and the perspectives. I'd argue that those are two different things. I mean, historically we have thought that facts are something you can verify in multiple ways, from different people, and that they are the atoms in our world, right? They are the things that are...

Jeremy Roschelle: Spoken like a journalist.

Betsy Corcoran: Well, we used to think this, right?

Perspectives, by contrast, are a point of view, right? I can have a perspective on the Middle East without ever having gone there, without actually knowing very much. And one of the things I think we have to be cautious about is differentiating between when someone is simply offering a perspective, here's my point of view, versus here's something that I've got deep factual knowledge about. I think those are two different things. Certainly from a journalistic point of view, those are two very different things, but the distinctions between them have [00:27:00] blurred a lot. So when you talk about the enormous amount of information on the internet, how much of it is perspective, people's point of view, versus how much of it is actually factual? I think the perspectives are swamping the facts at this stage.

The last thing that I just wanna throw in is that facts are also rooted in the physical, in real experience. I mean, Jeremy, as a physicist, you know, did experiments, right? And even as we live in a digital world, I don't want us to forget that there's a real world out there too, and that the tangible, physical experience of doing something, the physical experience of building a house, or even writing, those things really matter. And I am not sure that those things have increased in the dramatic way that people's perspectives or opinions about all of those things have increased.

Jeremy Roschelle: Betsy, you're taking me right back to John Dewey and collaborative inquiry. Like, [00:28:00] that was his highest form of literacy: to be able to participate in inquiry, to unpack a problem that was out there in the world and take measurements, figure out what's going on, then use ideas to think of a solution, and then test the solution, see if it worked.

And you know, he was a very pragmatic philosopher. But it sounds like a really high value we have for people, when we think about the future of literacy, is: can we engage together in inquiring into a problem?

Alex Sarlin: This is to both of you, but, you know, lemme start with you, Betsy. Do you foresee a world where that type of inquiry happens? Because that sounds like a dream of what education should be. That's the Dewey dream, right?

People working together on something meaningful, trying to really, actually solve meaningful problems in a way that they're using complementary skills and all that. I look at the power of AI right now, and it seems like, well, any group that's trying to solve a really meaningful, complex problem and doesn't have, you know, AI having a seat at that table, not the dominant seat, and maybe not just [00:29:00] providing the bare definitions, to your point, Jeremy, it seems like, you know, boxing with one hand behind your back at this point, right? Because it literally has access to so many of the facts, right? I mean, the entire internet, and the perspectives as well, to your point, Betsy. But I would hope that there is a future of learning that is that type of collaborative social experience.

But where AI still plays a role, because it can jump in and fill gaps in almost any way, including, I would argue, connections. If you say, hey, I'm trying to solve this problem, but I keep thinking about this totally different thing, how do these connect? AI can do a pretty good job of making connections between the things. Not that they're always right, but it can at least give you ideas.

Betsy Corcoran: So to give you some support, Alex, and to give you something to cheer about: I have been spending some time recently with the folks who are building a thing called PlayLab AI. Oh yeah. It's a learning environment, an environment for educators and students to use AI. And in particular, [00:30:00] I talked with a phenomenal educator in Ireland named Natalie, and she framed the problem this way. She said, you know, in Ireland, I am a special education teacher, a specialist. And quite frankly, we don't have a lot of specialists in Ireland. We're a small country, and oftentimes generalist second grade teachers or elementary school teachers are asked to step in and be SPED teachers because we don't have enough.

And so one of the kinds of AI apps that she was building was basically one that would help a generalist become a specialist. Yeah. So she was taking her knowledge and framing different ways of approaching either problems or texts or curriculum and so forth, such that a general teacher could come in and, with the support of that tool, then be closer to a specialist.

I think [00:31:00] this is an amazingly interesting application for AI. And in fact, just to be a little funny about it, I heard a quote from the previous director of the FBI saying, yeah, I mean, what AI's gonna do is it's gonna take all the bad criminals and help them be really better criminals.

Yeah, 

Jeremy Roschelle: right. 

Betsy Corcoran: So I think in many instances, AI can take a generalist and help get them closer to the performance of a specialist relatively quickly. That's amazing. But it still started with Natalie, a human specialist, who was able to define: here's the problem I see. Yeah. Here are the kinds of questions I get asked.

Here's how I would choose to coach somebody. And so from that point of view, it was doing just like the automobile example. It was amplifying; it was stretching her ability to have impact over a bigger community. But at its core, it started with her knowledge, her specialty knowledge. It wasn't just [00:32:00] pulling everybody's knowledge about all of SPED from the internet, right?

She was taking her knowledge and saying, here's how I can help some other teachers. So I think, again, it's that possibility, right? It's that potential. We have the potential; now we just have to make sure we use the tool appropriately.

Jeremy Roschelle: We also have the potential in the kids too. Like, one of the great joys of working at Digital Promise is when our League of Innovative Schools has a school tour and we can all go see what kids are doing in schools. And they absolutely are great, when given the opportunity, at getting into small group problem solving: taking on a big challenge, pulling up their laptops when they need them, putting them aside when they don't, getting out physical paper and other materials to help them design. So I think we have the technology side, and I think the children are ready to learn in these ways. We've gotta get outta their way with some of the structures we have in school, which just don't leave any time for this kind of thing, or deprive kids of the [00:33:00] energy, or, you know, just aren't supportive of them doing what they wanna do.

Betsy Corcoran: Yeah, a hundred percent. Absolutely agree with you.

Uh, there was another great PlayLab user, a guy named Zach Kennelly in Colorado, who was teaching a civics class and said to his class, okay, what project do you wanna do? And this was before the election. They're like, you know what? A lot of our parents don't speak English. We'd like to take the voter guide that comes from the state of Colorado, which is 120-some somewhat indecipherable pages.

We'd like to make a chatbot that will help our parents vote. And they did it. These kids articulated the problem they wanted to go after; they wanted to solve it. Their app was used by more than a thousand voters. And, um, this was something that they did basically in the last couple of months running up to the election.

So a hundred percent, our kids are ready to do real stuff. And, you know, they may not be as enamored with the tech. I love, Jeremy, your point, which is sometimes they throw [00:34:00] out the tech, sometimes they pull in some paper. Use the right tool for the right thing. It doesn't always have to be there, but, you know, use it when it's useful.

Alex Sarlin: All these examples are so powerful, and I literally got chills at that example about the voting bot for their parents, 'cause that is such an incredible example of civic engagement, of taking a project and applying it to a real-world situation in your local environment. I mean, it's just the paradigm of action learning, where you actually go take action with your projects.

It's beautiful. 

Betsy Corcoran: And Zach said, you know, he didn't have to beg those kids to come to class. They were running to class, because they defined the problem. This was a problem they really wanted to solve. It wasn't some faux, you know, assignment from a textbook or anyplace else. It was a real problem.

Alex Sarlin: I think, to both of your points, the educator plays a really important role in defining the space, the direction, the vector space in which students can then be creative and [00:35:00] use different resources and work together. And I feel like that's a big part of what the future of educators will be: knowing that young people just have more power to research, more power to put ideas together, more power to translate documents, to make movies, to do things that they've never been able to do before without huge amounts of help. So what does the educator do? They don't have to teach them every bit of the way. They sort of have to guide them and point them in interesting directions like that, including open-ended directions, like: what's a project that you care about, that you think would matter, that you really wanna solve? That's a very enlightened move as an educator.

Jeremy Roschelle: Making the space, showing you care. Give the students the resources. You know, show up to coach them, guide them, make sure they keep the responsibility and the agency.

Betsy Corcoran: So, Alex, here's a challenge that I wanna put to you as you talk to more people and interview more people. We know that the folks at OpenAI, Sam Altman and co, have [00:36:00] defined, is it five levels or six levels of AI, right? And maybe we're at level two, right? Maybe we're at level three, and that sort of thing.

Here's my pet peeve.

All those levels are about the technology. It's still the eye on the tech. 

Ben Kornell: Exactly. 

Betsy Corcoran: I'd like to hear the education community define six levels of maybe using AI or not, but six levels that don't start with the technology, but start with the needs that we have in learning.

Jeremy Roschelle: What about six levels of future fluency of people?

Betsy Corcoran: Oh, I got it. Yeah. Wow. There you go. That's it. We should, we should 

Jeremy Roschelle: start a podcast. 

Betsy Corcoran: What a good idea. And you know, let's call it Future Fluent. What do you say? 

Alex Sarlin: I'm in! Exactly. I love that. And I mean, it's a really interesting insight. It's true. I mean, the people who are obsessed with AI and have been for decades, like the [00:37:00] Sam Altmans of the world, they have this dream of AGI, and you can imagine why they're so deep inside that navel gazing.

Like, what can the tech do? What can the model do? What can the model achieve? What can it solve? Can it act like a PhD student? Can it act like a professor? Can it act like a high school freshman and answer these questions? They're trying to elevate the AI on its own to sort of be on par with, and then surpass, human intelligence.

What they aren't thinking about nearly enough, and I agree with you, Betsy, is: why do we want an AI to just be the most brilliant thing on its own? Isn't it better if it's a partner to humanity, and that we collectively can do amazing things?

Betsy Corcoran: I do think that it's a somewhat wrongheaded approach at this stage because we've lived in a world where we build it because we can, as opposed to building it because we need it or want it or think it has a role, 

Jeremy Roschelle: or that we think it'll enable human flourishing.

That's right. People will be, you know, better [00:38:00] people or we'll have a better society, or we'll enjoy our neighbors more if we have this. 

Betsy Corcoran: That's exactly right. I mean, the whole point of this stuff is to help the people flourish, not the machines. I don't really care if the machines flourish. In fact, if anything, I'd rather hope they don't.

I hope the people flourish. And so let's try to drive the question around how it's supporting us. Why are we interested in these things? And it may be that we choose not to go down some paths and that's okay. 

Alex Sarlin: So, Betsy, let me throw this to you. I know that you both care a lot about this. When you think about co-design, you mentioned it before, but what does it really look like?

You've talked to Katie Boody Adorno, who thinks about this a lot. It's such a hot topic right now. I still, even though I've been conversing about it for a while, don't fully have my head around how the EdTech world should be adapting right now to really, like, authentically have all the voices in the room.

Betsy, I'm curious what that looks like to you and then Jeremy, I'd love to hear [00:39:00] your take on it as well. 

Betsy Corcoran: I mean, it's a super good question, and I'll offer you a couple of thoughts, but these are not meant to be definitive in any way, shape, or form. And I'll go back to PlayLab as an example, because I think that they're doing some really interesting things. Because they are trying to suggest to the educators that they should be asking the questions, they often do exactly that. So they went to New York City, to an educator who has been very involved in project-based learning, who was also involved in rolling out a very high quality math curriculum, Illustrative Mathematics.

And he said, hey, what's the app that you wanna build? And so she said, oh, well, the app that I wanna build is one where the student is doing their project, working away, working away, and they get stuck on a problem. Right now, they have to go outside of the [00:40:00] project, right, to, like, Khan Academy, right? To, say, understand trigonometry, right? And then they have to kind of come back into the project and say, okay, all right, so Khan sort of taught me something about sine and cosine, and now I gotta figure out how it's gonna work in this context. So her idea was: hey, we've got some really great curriculum in Illustrative Mathematics. I would like to be able to incorporate that really good curriculum so natively into every single kid's project that they don't have to go outside to do that. I love it. That's what I wanna do. And she found out she couldn't do it. She tried. She built it in PlayLab and it didn't work. It didn't work for a really interesting reason.

It didn't work because it turns out that a lot of very sophisticated curriculum is not actually machine readable in the AI world. The way the curriculum is structured just can't be [00:41:00] easily sucked into AI in a way that makes sense. So she co-designed: she went to the PlayLab guys, and they literally got a team of engineers, and they spent three months, and they rebuilt the curriculum so that the AI could make sense of it.

So this, to me, was a really interesting example of co-design: two people, or two groups, right, the educator and the engineers, bringing their expertise together on a problem that has been defined, that is a real-world problem. And so to me, that's the essence of co-design: when two different groups bring separate expertise and authentically try to build something new.

Jeremy Roschelle: I just wanna put a few words on that, 'cause I love that example. Mm-hmm. And one thing I love about it is it started with a real problem of practice coming from the educator. Too often we see technologists who have a fantasy of the problem they wanna solve, right? And it's so [00:42:00] great when co-design starts with a problem of practice.

And then the second thing is, how big is the aperture on that problem? Like, can we all see the whole problem? Can we all work on the whole problem? Or is the relationship, say between the technologist and the teacher, being structured through a survey? Through that survey you can give some feedback, but you can't really see the whole problem. You can't offer the full benefit of your brain. So I think it matters if it's a problem of practice and the aperture is wide for people's contributions.

And then the last bit is, for co-design to work, well, teachers are busy people, and so are technologists. There actually are a lot of logistics to pay attention to, to make it possible for people to enter into it in the fullest way. And if you don't pay attention to those logistics, it's too hard for people to fit it into their lives. So I think logistics are a big part of co-design.

Alex Sarlin: Yeah. And that example, I think, is a really good example of [00:43:00] how the logistics can work, in that you had an educator with a real problem and that sort of vision, right? Like, wouldn't this be incredible if we could make this happen? They didn't have to work for three months and sort of step away from the classroom, 'cause they would never be able to do that. But the engineers, the PlayLab folks who are trying to create amazing software, can say, you know what, it would be beneficial to not only this project but a lot of projects if Illustrative Math was machine readable, and they can decide from a product perspective to put resources there. It doesn't require an educator to be constantly on call, which is never a good idea.

Betsy Corcoran: We're gonna get the CEO of PlayLab, Yusuf Ahmad, to come on Future Fluent and explain that example in a lot more detail than I just did, because he's been living that every single day.

Alex Sarlin: Yeah. Well, I'd love to learn more about it too, because the HQIM materials, like Illustrative and EL, are incredibly high quality, and [00:44:00] sort of proven high quality, but they are in formats that are not only not machine readable, sometimes they're not as human readable as they'd like to be. And I wonder if there's something very intriguing about being able to translate that material into a corpus that can then be searched and surfaced through AI.

Betsy Corcoran: I mean, it's partially the great dream of open educational resources, right? And we just talked to David Wiley as well, of Lumen Learning, who's been, you know, a huge advocate, and we've been teasing him a little bit. We keep saying he's gonna kill OER. Well, he is gonna kill OER, because he wants to extend it with AI in a really interesting way. And he thinks that it may actually finally live up to his dream, which is: hey, you build one really great resource and millions of people benefit from it.

Yeah. 

Alex Sarlin: It's coming. I do think we're close. Each of the frontier model products started doing things like custom GPTs or projects. OpenAI did this app store [00:45:00] mentality very, very early. I don't think it quite caught on, but there is definitely a world coming where that educator who has this incredible idea and says, wow, imagine if you could bring Illustrative material in context, right when somebody needs it to solve a real problem, and then every other educator on earth says, yes, I want that too, and they don't have to rebuild it from scratch. I think that world is coming, and I'm very excited to see it.

Betsy Corcoran: Again, we have the potential. We just have to actually make sure we keep going down those paths. Yeah.

Alex Sarlin: So I think we've mentioned at least four of the five episodes of Future Fluent that are currently out, which is pretty amazing. You've had some amazing people on, and I'm so excited to see your podcasting journey. It is fun. And you mentioned early on that you wanted to learn, so you decided to do interviews. That is the big secret, I think, of podcasting. Every podcaster I've ever talked to says that's the big thing: you get more out of the interviews than the guests do. It's addictive, right? It's amazing. I mean, being able to talk to so [00:46:00] many people with so many different experiences, perspectives, capabilities, just understandings of the world, it's been an incredible pleasure. Betsy, you've probably done thousands of interviews outside of podcasting, but putting it in this context, I'm really excited to see where you go. And a little fun tip: you can put all your podcast episodes in a Google notebook or in a chatbot, and then use that to put everything together and remember episodes from a year ago, when you get a year in and say, oh yeah, that was a cool one. What did we talk about?

Betsy Corcoran: Yeah. Alex, you've been a great inspiration. We really, really love the work that you've been doing. It's really exciting to be on here. We hope that you and everybody else will subscribe. I'm supposed to do the little advertisement, right? Of course. It's futurefluent.net, and we're on Apple and Spotify and wherever 

Alex Sarlin: you get your podcasts. That's right. Trying, trying to work on the, 

Betsy Corcoran: wherever you get your podcasts. But anyway, it's a huge privilege. [00:47:00] And I think, again, it's a very important time to ask questions.

And to be open-minded and to explore. And I think one of the things that we are learning as we go is that there's a lot of nuance in all of these questions about how we stay fluent, become fluent, as this future kind of comes rushing at us. So thank you for having us here, and maybe we'll get you to come on Future Fluent.

Alex Sarlin: Anytime at all. And I just have to throw it back to you, because EdSurge was explicitly the inspiration for everything we do at EdTech Insiders, and I've told you that many times. You're the root of all of it, so I really appreciate that as well. And of course, Jeremy, your work at Digital Promise and all of the work you've done in education technology and education research, the whole field sits on the shoulders of your research. Really, it's really true. So it's a privilege to talk to both of you. And yes, everybody should be adding Future Fluent to your podcast listening schedule and get on early, right? There are only a few episodes [00:48:00] in. You can see the whole journey. It's gonna be a blast.

Jeremy Roschelle: And we're open to ideas. Yeah, 

Betsy Corcoran: absolutely. We'd like to hear from people. It really needs to be a two-way street. So drop us a note. We're putting links up on LinkedIn; there are emails there. Drop us a note. You can always say hello@futurefluent.net. And guess what? That comes straight to my email box, so we're listening.

Go. 

Alex Sarlin: You should probably expect to get some pitches.

Betsy Corcoran: I hope so. 

Alex Sarlin: Thank you both so much. 

Betsy Corcoran: Thank you. 

Alex Sarlin: Thank you. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
