Edtech Insiders

Why AI Literacy Alone Isn’t Enough and What We Can Do About It with Katy Knight of Siegel Family Endowment & Dr. Allison Scott of Kapor Foundation

Alex Sarlin Season 10


Katy Knight is President and Executive Director of Siegel Family Endowment, where she leads bold, systems-focused philanthropy at the intersection of technology and society. Since 2017, she has advanced equity, flexible funding, and inquiry-driven grantmaking grounded in the scientific method, and centered on reframing big questions and learning alongside grant investments. 

Dr. Allison Scott is CEO of the Kapor Foundation, leading work at the intersection of racial justice and technology. She drives research, programs, and advocacy to expand equity in tech and entrepreneurship. A national PI on major CS equity grants, she also co-leads CSforCA, advancing access to K-12 computer science across California.

💡 5 Things You’ll Learn in This Episode:

  1. Why AI literacy isn’t enough and what deeper skills students really need
  2. The difference between coding, computer science, and computational thinking
  3. How philanthropy can help reimagine outdated education structures
  4. What “unsexy tech” in schools reveals about the future of teaching
  5. How the private and public sectors can (finally) collaborate on education

Episode Highlights:
[00:04:04] Allison Scott on preventing AI from repeating tech's inequities
[00:06:03] Katy Knight explains why Siegel has been ready for AI for years
[00:10:30] The structural barriers still blocking equitable CS education
[00:13:39] Why real CS reform requires rethinking the whole school system
[00:18:44] Should we replace math with CS? A provocative idea
[00:24:28] Addressing fears of too much tech in kids’ lives
[00:29:34] Why philanthropy is investing in public interest technology
[00:36:39] How schools and companies can better share learning tools
[00:41:37] Building a shared vision for AI education across sectors
[00:50:31] Why “unsexy” backend tech matters for teachers
[00:51:28] The looming risk of workforce displacement in education 

😎 Stay updated with Edtech Insiders! 


This season of Edtech Insiders is brought to you by Starbridge. Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

[00:00:00] Katy Knight: We lament the way that parent engagement has maybe taken a turn for the worse in the post-COVID reality. But is there an opportunity there to harness some of that energy that parents have around wanting to know what's going on, and get together and try to fundamentally change what education, what the school day looks like?

Who's involved? There are organizations trying to do that work. Trying to get that to be a reality. But you need to build coalitions and have some strange bedfellows a little bit, right? Like communities have to come together to say what we have right now is not good enough, and we cannot wait until we fix the same old problems to think about how we integrate the new stuff.

We have to try to fix the problems by integrating the new stuff. 

[00:00:45] Allison Scott: There is still a lot of fear. So in terms of the utilization of technology, there's a lot of fear that we have overburdened our young people with technology, and that has negative impacts. And so now you introduce additional ways to spend even more time on devices and interacting even less with humans.

And I think that worries parents. That worries teachers. That worries a lot of people. That is: are we going in the right direction? And I think, so we, the royal we, should also articulate how that is not our goal. We still believe in the importance of human relationships, and these are the specific ways that technology or AI tools can help increase, like, maybe efficiencies, but we're not trying to remove human engagement in the process.

[00:01:35] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.

And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoy today's pod.

Katy Knight is President and Executive Director of Siegel Family Endowment, where she leads bold, systems-focused philanthropy at the intersection of technology and society. Since 2017, she has advanced equity, flexible funding, and inquiry-driven grantmaking grounded in the scientific method and centered on reframing big questions and learning alongside grant investments.

Her prior roles include leadership at Two Sigma and strategic communications at Google. Dr. Allison Scott is CEO of the Kapor Foundation, leading work at the intersection of racial justice and technology. She drives research, programs, and advocacy to expand equity in tech and entrepreneurship. A national PI on major CS equity grants, she also co-leads CSforCA, advancing access to K-12 computer science across California.

So great to have you both here, Katy Knight, Dr. Allison Scott. Welcome to EdTech Insiders. Thanks. Thanks. Glad to be here. Let's jump in. So we were gonna connect at ASU GSV. It's a madhouse at ASU GSV. I think we didn't get a chance to talk there, but you both do amazing work in technology, in philanthropy, and in equity.

First question is for you, Allison. The Kapor Foundation has been a leader in racial equity and tech for a long time, and we're at a particularly fraught moment for that. How are you thinking about your work right now? First off, give us an introduction to the Kapor Foundation, but how are you thinking about your work right now, and how is AI changing opportunity for underrepresented students?

[00:03:48] Allison Scott: Yes, so as you mentioned, the Kapor Foundation is focused on really re-imagining and rebuilding a more equitable technology sector, and we believe that the education space is critical to building a more equitable tech sector. And so when we think about AI in particular, we are squarely focused on ensuring that AI doesn't replicate the inequities that we have previously seen in the tech sector, and thinking about new ways to create a more equitable future utilizing the new advancements in AI.

So I think a couple key points in terms of how we're thinking about the impact of AI and access and opportunity in CS. One, we've been doing work around CS equity for the better part of a decade. And while we argued that all students need access to K-12 computer science education for a variety of reasons, as critical literacies regardless of future career or college path, we still don't have equitable access to computer science education.

So then when you layer on artificial intelligence and the impact of AI in education, what we wanna ensure is, one, that we don't lose sight of the critical importance of computer science education; two, that we don't over-index on just thinking about AI literacy. We believe critical AI literacy is important, but we don't wanna over-index on just becoming literate on how to use tools; we want to focus more on how we can create more educational pathways for all students into future computing and AI careers.

[00:05:12] Alex Sarlin: Yeah, that last part is huge. I think we talk a lot about AI literacy right now, but that literally means just basic access and knowing what it is at all. The future of really impactful and lucrative jobs, what CS is going to become, is a little different than that. So I think we'll dive into that soon.

That's a really interesting point. And Katy, Siegel Family Endowment focuses on the intersection of technology and society at large, including education. How are you thinking about it in your work right now? Give us a little bit of an overview of how Siegel thinks about its work, and then how are you thinking about AI?

[00:05:46] Katy Knight: Yeah, so at Siegel we've been focused on technology for our entire existence, which is a little more than a decade, and that's not very old in philanthropy years, but in tech years it's a very long time, and we've seen waves of many things. And so the first piece for me when I'm thinking about AI and how it's impacting our work is to say that we're not new to this, and it's not new to us. It's been around. And because our chairman is a computer scientist and roboticist who has a PhD from what is now the artificial intelligence lab at MIT, I think there's a lot of groundedness and sense of reality when we're talking about AI inside of our organization, because we've been talking about machine learning and AI impacts and the potential futures here for a long time.

And so I try to make a distinction: when we're talking about the advent of the AI era or the birth of AI, that's not right now. Right now we have a rise in the public consciousness around what AI is and what it can do, and we have impressive new generative AI tools that are consumer-facing for the first time.

But a lot of this stuff has been around and behind the scenes, and it has been behind why our mission on the educational front, much like Allison said for Kapor, has been to try to create a tech sector that represents the diversity of people around the world, because they need those inputs to build better products, and to create a world where people's access to technology and facility with technology looks different. Not just because it's a way to get them a good job, but because we're rapidly approaching a time in which, if you don't understand technology, you're not gonna be able to do the most basic things. And that's just not fair, right?

That's not the world we wanna live in. And so if you need to be able to understand technology to even register to vote at this point, it's imperative that we build that understanding. And you build that understanding by teaching the foundations of computational thinking in early grades, and computer science exposure and computer science principles in older grades.

And then ideally, a larger portion of students will go on to stick with CS or CS-adjacent fields and be builders of these things. So we've been trying to harness all this energy and excitement, and even the fear, around generative AI to remind people: this is computer science. This is something that we actually should be excited about, 'cause we've done so much work to figure out how to equip students with CS skills.

We're just waiting to deploy it. Allison's got all the research papers. And so if you're excited about AI, be excited about CS.

[00:08:31] Alex Sarlin: That's a great way to put it. Let's zoom out for a second and talk about the computer science education arc and movement, because this is something we mention in passing a lot on this show, but I don't think we really get into any of the details of it in a way that I think would be really valuable for this conversation. So let me just lay down some of the big rocks in my mind about how CS education has worked, and you can either debunk them, you both know a lot more about it than I do, debunk them or fill in some of the gaps.

Big rocks: it took a long time for people to truly realize that computer science education was going to be so meaningful and transformative for students. There was, of course, Seymour Papert, right? There were people trying to get CS principles into the classroom a long time ago, but it took a long time.

We were way behind, and it's only in the last, maybe I would say decade, maybe even the last five years, that we're really starting to see actually very good CS education in K-12 classrooms. And even then, it's still not equitably available. Some schools, some states are much more dedicated to this than others.

California and New York have been way ahead. I'm sure there are other states as well. So my take is that computer science education at the K-12 level happened a lot slower than it should have, and yes, we finally got there, but a lot slower than it should have. And meanwhile, we're getting to this next moment, and I think, Katy, this is your point here, where it feels almost like coding as the core skill of the future that everybody has been saying forever is probably not quite the right story anymore.

AI is computer science, but it looks very different than traditional coding. Okay, those are my big takes. Allison, am I totally off base, or do you feel like there's some truth to that narrative?

[00:10:13] Allison Scott: I think there's definitely some truth to it. I wanna come back to the topic about coding as a core skill, 'cause I think that's where we get tripped up a lot.

But definitely it has taken a long time to explain computer science and get computer science adopted in traditional K-12 education systems, for a variety of reasons. So I agree completely with that point. I think where we stand today, looking back: I believe the latest data is somewhere around 60% of all high schools in the country offer some form of computer science. And we have made progress; it was in the thirties at some point. So 60%, I guess, could sound good, but when you really peel back the layers.

And we have made progress. It was in the thirties at some point. I guess that sounds good, but when you really peel back the layers. That means like one high school offers one course, right? And that could be a course of 15 kids. So I'm here in Oakland. Oakland Unified has done a bunch of work, really fantastic leadership.

Some investments from lots of different philanthropic organizations as well as industry partners. And we still have high schools that have no computer science courses. So a student right now, graduating high school, has never taken computer science course at some of our comprehensive high schools.

That's the reality. And then the second piece is that even when we have the courses available, not all students are taking them. So that's a major issue that a lot of folks have been working on. Like, how do we inspire students? How do we talk to parents about what computer science is and why it's important?

How do we ensure that we create the structures and systems in schools so that every student is gonna touch computer science at some point, and it's not just an optional elective course that only a certain subset of students are taking? And then, I talk a lot about this. I bore people to death about it.

But truly, I don't think we've thought about the structural barriers that exist in education and then how we can incorporate computer science education. So you'll hear a lot of well-meaning district leaders and school leaders saying: I want computer science for my kids, but I can't. I don't have the resources, I don't have the time in the day, and I'm struggling with basic literacy, and so I just can't figure out how to prioritize this.

And then, to your point, now, I think we spent a lot of time talking about why CS was so important. What's getting lost, in my opinion, in the hype around AI is, and Katy, correct me if I'm wrong, I think there have been lots of quotes from industry leaders, with some truth, right, around: well, AI can now do entry-level coding.

Everybody doesn't need to learn how to code; that's like a bygone era. And we would argue that nowhere in the K-12 computer science standards does it say that it's just coding. It is a whole comprehensive set of skills, ways of thinking, and ways of breaking down problems, understanding algorithms, critical thinking, that should happen from K through 12.

So where does that go? Just because AI tools could maybe do entry-level coding, what about all of those other skills that are critically important for young people? Right.

[00:13:03] Alex Sarlin: Algorithmic thinking and the computational thinking you mentioned, Katy. That's a Digital Promise framework, I believe, or it's a framework in lots of different places.

Very helpful context. Yeah. And as you say, it's 60%, and in a lot of them it's just an elective that only a few kids are taking. So we really haven't cracked the code on making computer science part of K-12. Katy, what do you think?

[00:13:22] Katy Knight: I mean, I wanna double down on what Allison said about the broader systems change issue here. I think one of the greatest learnings for us as an organization in the computer science space, for all this time, is, I say we started over here, like many funders do, focused on the issue of wanting students to have tech fluency through computer science, through computational thinking. Where we've landed today is, I say, we're a systems change and education funder.

Because we're not going to get to where we wanna be in computer science without addressing the ways in which schools and traditional K-12 don't serve students as best they can right now, don't serve teachers to do the best work they can do right now, and don't allow the space in the day for the sort of courses and integration that computer science can bring, but also the space in the day for that same project-based, inquiry-driven learning to exist in other parts of the school day. We have a lot in common with the folks who are trying to push project-based learning more broadly, or who are trying to push other pieces of the STEM field.

So I think being in partnership and collaboration with more people across the education space has really served us incredibly well, and is something we need to continue to do if we're gonna build toward this. And then, yeah, I love that you brought up Seymour Papert, because that for me is a really appropriate starting point for thinking about computational thinking.

We're really fortunate at Siegel to have been early: really, our chairman was one of the first funders of Scratch, which was developed by Mitch Resnick at the MIT Media Lab, who is a legacy descendant of Papert. And I think a lot of what we know and what we think about computer science access and opportunity, and why it's important, is grounded in going all the way back to why computational thinking matters as a fluency, as another literacy that the littles should be learning too. And we've pushed on that. So when The Scratch Foundation, which supports Scratch and allows the platform to exist, was first launched, the official name was the Code to Learn Foundation. Not learn to code, but code to learn. Because the perspective that Mitch and the team had was: this is a creative, open play space, but much like Papert developed Logo, it's about the fact that this can teach you something else and empower you to do something else, and that something is much bigger than just the coding.

So yeah, vibe coding. It's a thing. It'll be a thing. I've done a little vibe coding myself. I think when the code is not very good, I'm not equipped to actually check it, 'cause I don't have the level of coding mastery to do that. So we should also remember that even if some of those entry-level jobs are going away, we're still gonna need senior engineers who can look at code and who can debug and all that.

But yeah, look, our chairman was joking the other day: when the companies need more investment, for more compute, more environmental resources, we're on the cusp of AGI and everything's going away; when they're satisfied with their resources, oh, you know, we're working on it, AGI could be five years from now.

So I think there's also just that marketing machine that is driving a lot of the AI narrative that we need to combat.

[00:16:41] Alex Sarlin: Yes, and I admit I sometimes fall prey to the marketing machine. I'm always trying to think critically about which of the promises and pieces of hype, or ways of thinking about it, are real, and which are not.

And this is part of what I'm trying to do: a sort of check with the two of you, who know this space really, really well. So something that I wrestle with, and I'd love to hear you both talk about this, is, you mentioned how basically it took quite a while to turn computer science into a literacy, right? Into something that's broken down into standards, broken down into entry-level concepts that can build up from there, and in that way resemble literacy and reading, resemble math, resemble some of the core subjects that are taught in school, that schools sort of know how to handle. I think there's a case to be made that we should just jettison math and put in computer science, but is that the craziest thing in the world?

Let me phrase this a different way. The way in which education, I think, has thought for a long time is: you have to teach basic core concepts so that everybody can then use them and build them into other useful careers or lives. You need to know basic math, you need to know fractions and multiplication, or you need to know calculus to some degree if you are going to go do X kind of science or all sorts of useful engineering.

I feel like that moment may be changing right now. I'm not sure that we actually need to know these core little concepts, certainly in math, to be able to succeed in life. I certainly never have, and I think many people don't feel like that promise comes to fruition.

So, am I crazy? What would happen if schools were to say: you know what, we made a decision. No more X. I won't say math, it could be anything. No more X; instead, computer science. Would you say, yes, you finally got it? Or: wait, wait, wait, wait, hold on, that's too much? Allison, let me start with you.

[00:18:28] Allison Scott: So I think what you're bringing up is a really important, provocative, interesting question for us to consider in the entire education space, right? I mean, I think we have to think about: where we are in 2025, how would we construct a learning arc for all young people? And a lot of us haven't had either the time, the energy, whatever, to deconstruct the entire K-12 system. I know that there's some work being done on this, on re-imagining the Carnegie unit: how did we even get here? So I think these are important topics to consider. I also heard a quote, when we were working on getting the computer science implementation plan in California, from one of the people on the team, that was like: I'm voting against it.

We were like: what? I thought it was a unanimous win. And he was like: I'm voting against it because it doesn't go far enough. It doesn't go far enough to say, yes, all students should take computer science, because it doesn't incorporate all of these other things that we know are our current realities.

So I think that there is a space. Maybe the advancement of AI in this current moment in public consciousness, as Katy said, gives us an opportunity to say: do we need to rethink education structures overall? And then maybe one more point: going back to AI literacy, there's a lot of interest and a lot of momentum around AI literacy, and I believe that that is critically important, but as just one of these things. As we're talking now, think about your average profile of a student.

Of all of the things that we would want them to learn throughout their educational journey, I wouldn't say AI literacy would rise to the top and be the only thing; that can't be the comprehensive skill set that we're giving our young people. Just referencing the World Economic Forum's top 10 fastest-growing skills: they talk about AI and big data, networks and cybersecurity, but also creative thinking, resilience, flexibility, agility, curiosity, lifelong learning.

So there are these other skills that are also critically important to integrate if we are going to use AI, as AI continues to evolve very rapidly. And there's a set of skills that also doesn't necessarily fit into our rigid assessment of a calculus course, a data science course, a computer science course. Yeah.

[00:20:34] Alex Sarlin: Katy, what did I miss?

Katy Knight: No, I think that's spot on. The only thing I would add is there's a lot of question within the computer science community about integration versus standalone, and whether we have to incorporate freestanding computer science courses, for example, into the school day in order to achieve the mission, or if we get further faster by thinking about how computer science or computational thinking can be integrated into other subjects. And that computational thinking integration piece, we should credit Jeanette Wing in the early aughts for bringing that back to the fore, is one of the things where, I think, if we were willing, like Allison says, to fundamentally rethink all of the skills that we say are important for young people, it'd be a lot easier for us to say: and here's where we can incorporate computational thinking, and here's where computer science will come up. But we're very precious about our Carnegie unit, about our reading, writing, and arithmetic, right? We have to move past that. Not that we have to blow it all up, but we have to move past that.

[00:21:39] Alex Sarlin: I agree. Maybe blowing it all up might not be the answer, but what would we be preserving? Some pretty abysmal results, for a long time, in literacy education, and we've seen a lot of problems in, I don't know, I mean, I'm just in a funny mood today, but I'm like: I agree with everything you're saying, and it just feels disheartening to say that, okay, we're in 2025.

We are well into the internet revolution, the information revolution, the cell phone revolution, and still we're wrestling to convince schools to try to put some computer science in an elective. And they're saying, as you said, Allison: no, we're too busy trying to get literacy right. It's like maybe there needs to be a little bit more of a table flip there, because if you feel like you have to get certain things right before you can expand to modern skills, well, we haven't in decades. So why do we expect that reading education is finally gonna be solved enough that we can get to these other things?

[00:22:39] Katy Knight: Or why don't we consider that reading education might be well served by some of the changes that we're talking about, and that maybe making some of these math pieces more relevant to modern life, an innovation-driven world, and the internet economy would lead us down a better path for students, right?

We have to be willing to open up. And let me not be vague when I say "we": I mean parents, administrators, teachers. But I think teachers are so close to the students in classrooms, and seeing: I need to make this stuff relevant. I think teachers are actually already fighting the uphill battle to create more relevant content within the boundaries of really, sometimes, arbitrary things, right?

Yes. So we lament the way that parent engagement has maybe taken a turn for the worse in the post-COVID reality. But is there an opportunity there to harness some of that energy that parents have around wanting to know what's going on, and get together and try to fundamentally change what education, what the school day looks like, who's involved?

There are organizations trying to do that work. Trying to get that to be a reality. But you need to build coalitions and have some strange bedfellows a little bit, right? Like communities have to come together to say what we have right now is not good enough, and we cannot wait until we fix the same old problems to think about how we integrate the new stuff.

We have to try to fix the problems by integrating the new stuff. 

[00:24:11] Allison Scott: To that point as well, one other narrative that I think is real, that we also have to contend with, is there is still a lot of fear. In terms of the utilization of technology, there's a lot of fear that we have overburdened our young people with technology, and that has negative impacts. And so now you introduce additional ways to spend even more time on devices and interacting even less with humans. And I think that worries parents. That worries teachers. That worries a lot of people. That is: are we going in the right direction?

And I think, so we, the royal we, should also articulate how that is not our goal. We still believe in the importance of human relationships, and these are the specific ways that technology or AI tools can help increase, like, maybe efficiencies, but we're not trying to remove human engagement in the process.

[00:25:06] Alex Sarlin: So Allison, let me build on that with a question for you.

You know, the Kapor Foundation focuses on racial justice and technology, and look at what we've seen in the technology age: we all have phones now, we all have computers, we're on them all the time. Kids are on their devices eight and a half hours a day on average. So in that world, there are lots of different ways to be engaged with technology.

Technological literacy, quote unquote, being able to use your device, is relatively ubiquitous, but being able to design these devices, to design the code behind them, to create your own companies, to work for these big tech companies, is this incredibly vaunted thing that everybody aspires to. And then there's this huge spectrum in between of working in computer science-adjacent fields or data science-adjacent fields.

There's a spectrum of what it means to be involved in tech in this age. We're now entering the AI age, and the same thing is gonna be true. Being AI literate, able to use AI tools, is going to be sort of basic literacy to get to live your life, literally to vote, as was mentioned. But being able to design complex AI systems is gonna be a very small number of very top technologists.

How do we learn from what we've done over the last 20 years in computer science education, where we have made some progress, but not enough? There is still not enough diversity in big tech, and there hasn't been in a long time. You still have people hitting a glass ceiling there. How do we not replicate that, or even make it worse, with AI, where it takes even longer for us to realize what it means to train a young kid to become the generator of the next AI system?

Like how do we get them ready for the AI age? 

[00:26:42] Allison Scott: I think my answer might weave together some points that we've made previously. One, we would argue that we still need comprehensive K-12 computer science education for all students. As Katy mentioned, it could be integrated into other courses, but we need some comprehensive, foundational computing education.

I think that's gonna be core to all students in the future. Two, remaining intentionally focused on building that pipeline. There was a lot of work, maybe 10, maybe eight or so years ago, on increasing access and equity in AP Computer Science, because taking AP Computer Science had like a 4x: you were four times more likely to go on and major in computer science in post-secondary.

So I think there are still efforts like that we need to focus on, and we shouldn't throw out our focus on the computing pathways in particular. I still believe that we need more diversity, across all aspects of diversity, geographical, gender, racial, socioeconomic, in pathways that lead not just to entry-level software engineers, but to some of the top and most senior AI and ML engineers working at these really forward-thinking companies.

That's only one path, and I think it's important to say that we're not solely focused on that alone to create a more equitable tech sector. Another piece that we spend a lot of time thinking about and investing in is civil society and, Katy will speak a lot more to this, public interest technology.

On the civil society part, we want more researchers, more scholars who understand AI and AI technologies but are asking critical questions: what is the impact, what are the trade-offs, what are the impacts on climate? Let's ask ourselves critical questions.

Should we be accelerating development at all costs, or should we think about it slightly differently? We wanna train future policymakers to make the types of policy that will both prevent harms and harness the potential of this technology. So I think we're looking at it in a much more robust way than just, we want more Black engineers at Google.

[00:28:57] Alex Sarlin: Right. So Katy, your name was invoked there as somebody who's thinking about civil society. Tell us more about how the philanthropic sector is trying to move us toward a more equitable world, one where education plays a really useful role in preparing students for the future, no matter what their pathway looks like.

[00:29:17] Katy Knight: Yeah, so at Siegel I often say our position on education is that we're trying to create productive citizens. That means they are able to live in society, they have access to the resources and opportunities they need to thrive, whatever thriving looks like for them, and we want them to be fluent in the technologies that are really defining their lives.

And in order to get there, we need really smart people, like Allison said, within the policy world, within the social sector, within all the spaces that touch technology, are tech-adjacent, or are incorporating technology, so that it's done in a way that doesn't just perpetuate the same old problems, the same inequities, the same biases.

And so, a couple of things. On the broader philanthropic front, we've been partnering with several peers in different ways over the last five-plus years on public interest tech, a banner that Ford and Darren Walker really unfurled, and from day one at Siegel we have been investing in trying to build a world where social sector organizations have tech capacity, where the government has tech knowledge and fluency inside its halls, but also where the tech companies, and the new tech companies to come, are staffed by builders trained not only in the hard skills of computer science and coding, but in ethics and in really understanding the impact their technologies are gonna have beyond what they seek to do.

Really training toward a "just because you can, doesn't mean you should" mindset that I think has been lacking in the tech industry in many ways. So that's one piece: how we're looking across not only education but other facets of the sectors that philanthropy touches, how we can influence them to be more tech-forward and influenced by tech, and how they in turn can influence tech to be better broadly.

And then within education, like I was saying earlier, we've gone from thinking about ourselves as just computer science funders or computational thinking funders to being systems change funders who are trying to work alongside our peers in philanthropy, because philanthropy over-indexes on education anyway; everyone wants to solve it.

To say, okay, how can we be more intellectually honest about what we've done that hasn't worked, and where we might learn something or do something differently by incorporating some of this more modern, more innovative thinking, thinking that is aimed toward what the future actually looks like.

And that's not easy, because you want to have humility when you talk to other funders. I have been in philanthropy for over a decade now, and I am part of the problem; I can't always talk about it as just a problem, because people don't always respond kindly to that. So we're trying to navigate: okay, we respect what we've done, we all are trying to do the right thing, but we're not always getting it right. So can we do better together if we integrate more of this work?

[00:32:38] Alex Sarlin: I think that collaborative approach, where you bring different people with different incentives and different structures together to solve the problem and contribute what they can, is a really rich one, and I think we're all still working on it as a society, as the philanthropic sector, as the private sector. One question I have for the two of you, let me start with you, Allison, is about a dynamic we've seen a lot in computer science education and now in AI education.

Schools, whether it's K-12 or universities, do their thing, where they have to maintain a lot of the status quo. They already have standards, they already have lots of policy in place, and things are very regulated and structured. Meanwhile, the tech companies are like, we know how fast things are changing; every month something's changing. So we're gonna put out, you know, OpenAI Academy, or Google's gonna put out their certification programs, which just reached a million people. The tech companies are basically saying, we can't count on the formal education system to actually train people to be ready to help us, so we're gonna create our own parallel informal education system, which is incredibly dynamic.

In my time at Coursera, I saw the top courses literally go from being Ivy League school courses to corporate courses coming from tech companies. It shifted very distinctly, because they were moving faster. Do you see there being a role, either in existing CS education, where we've also seen big companies create computer science education programs, or in the coming AI education, for the public sector and private sector, instead of moving in parallel, to work a little more together to get the most cutting-edge concepts into actual classrooms?

[00:34:14] Allison Scott: I think it's a great question. I think there's a lot that we've learned over the past decade in watching these movements, and if I were to synthesize what is maybe the most promising, I do think it's the collaboration, because you laid out the problem correctly.

The typical public education system is moving a little bit slower than the technology industry. So what can both learn from each other? How can they support each other? Because I would still argue that we need young people who are coming into industry to have a comprehensive education; I think that is not to be debated, right? And so, how can we think about layering on additional skills? This concept of lifelong learning, upskilling, reskilling, all of those concepts together make a lot of sense for tech companies to come in and support, right?

So how do you continue to upskill yourself in specific topics? I think that makes a lot of sense. The one example I was gonna give is a little bit less related to working as an engineer, but one of the challenges was that in the early computer science days, there was a lot of investment in professional learning and professional development for teachers.

There still is, and a couple of us were flagging: okay, but wait, we have entire schools of education that produce teachers who get degrees and go out and teach. Let's just integrate some of these concepts there, so we don't have to train teachers after the fact, right? So I just wanna make sure that there are bi-directional collaborations.

I look forward to seeing more of what is to come. One challenge, of course, we always have to raise the challenges, is that we can't overpromise what some of these courses can provide to individuals. We all know people in our families and our communities who would absolutely love to enter the tech sector.

Taking one of the Google courses is not gonna get them there, and that's just the reality, right? It's a great thing for many people, for many reasons, but we have to figure out how to make a comprehensive strategy that can actually get us to that goal.

[00:36:23] Alex Sarlin: And you say learn from each other. I think a strength of the formal education system is cohesion, putting it together into a curriculum with actual benchmarks, instead of it being shotgun, like, oh, a new tool came out, we'll put out a course on it. That's not actually gonna get anybody anywhere by itself; you have to put it all together, and companies are not always that good at that. I think that's a great point. How about you, Katy? How do you see public-private sector collaboration being able to hopefully move us faster with AI than we did with computer science?

[00:36:49] Katy Knight: I think Allison's spot on that we can learn from each other and try to partner together. Internally at Siegel a couple of years ago, and I very rarely make a unilateral pronouncement, I said we're gonna remove this sort of arbitrary siloed line between our quote-unquote education portfolio, which we call learning, and our workforce portfolio. Because workforce development training has been sidelined, almost treated as a less-than, this sort of dirtier thing, when in fact, like Allison said, lifelong learning matters. Workforce development training is not just about people who quote-unquote failed out of the traditional system and need some alternative.

It's about how we get people the skills they need to do the things they want to do, and there's nothing wrong or less-than or dirty about that. It's just not been treated that way, and it hasn't been integrated into formal education systems in a meaningful way. If we didn't put up that arbitrary barrier, maybe we could do a better job directing students and young people to the right place the first time, so it wouldn't have to feel like they failed at traditional education before they were able to get on a pathway to a different sort of learning experience that's gonna benefit them.

And so when we see the hunger for these courses coming out of companies, it should reflect back to us that simple fact: people want to build skills. People want to learn. People want to know useful things, and how they get there doesn't need to be placed in a bucket of, well, this is real learning, or this is real education and this is not. We have to consider that we're all learning; even when I Google something, it's a learning experience that I'm having. How is that reflected? We're learning all around us all the time, and we just don't give it enough credit. And I think there's a lot of potential in giving some credit to what companies are putting out into the world as resources and incorporating them into our formal systems.

If you understand pedagogy, and you understand how people are going to absorb this learning and can help facilitate it, that's fantastic. Take the course material that a company is providing and integrate it with everything that you know how to do, so that you don't have to then go off and figure out how to build the curriculum. Here's the course material: adapt it, use it, get it to people, instead of being precious about the fact that it doesn't come from our formal system.

[00:39:30] Alex Sarlin: There's been some real movement there in higher ed, because of all the churn and questioning about higher ed's value. A lot of higher ed institutions are starting to do things like accredit workforce learning or previous experience, or incorporate training and instructional material created by companies into their actual curricula and offerings. There's been this movement of higher ed starting to say, well, when you graduate, not only should you have a major and coursework, but maybe you should have some tech certifications that will actually carry specific weight in the job market and give you a leg up in getting your job.

I can see that same thinking coming down to K-12. I mean, there's no reason why that's only a higher ed thing; well, there is a reason, because higher ed is a little closer to workforce age. But that's what I'm hearing both of you say, or not recommend necessarily, but say there's potentially room for: the idea that if you're looking for a way to systematize education but also include cutting-edge materials, maybe you can incorporate some of the cutting-edge materials being made all the time by companies, alongside the systems, the accreditation, the benchmarking, and the teaching and support that go with the school system. That feels like what education should look like to me; this is my personal take, but let me throw it back to you, Allison. As AI becomes more embedded in all of our lives, in education, in philanthropy, we know this concept of AI literacy is not enough, but it's coming fast. AI literacy is definitely gonna be incorporated; we saw executive orders for that. How can we start to get ahead of this and really build an education system where these barriers and this obsession with the status quo start to become historical? So it doesn't feel like, oh wait, I can't do anything new, I have to deal with my literacy problem, because honestly, that is just not gonna cut it anymore.

[00:41:20] Allison Scott: I can talk about a couple of projects we have on the foundation side that are trying to tackle that very problem. One, I think it's very important that we listen to our communities to better understand the challenges that they're facing.

And then, with the amazing opportunity Katy and I have to be in these positions, we can say, okay, how can we solve that with not just resources? Because it's not always just about the resources; it's also about the thought partnership. Okay, great: I want computer science and AI education for all of my students.

Where do I start? How do I think about how it fits into the day? We have three cities that we focus on in particular, Oakland, Detroit, and Atlanta, where we've launched the CS for Oakland, CS for Atlanta, and CS for Detroit initiatives, working directly with the districts. And we've learned a lot.

There's a lot of really interesting, really good work happening. But to your point, one of the pieces we will be doing now and into the future is some of that deep strategic planning and thought partnership with them, sharing some of the ideas we're discussing here that don't always make it into conversations with district-level administrators, helping them envision a different way to do things.

And I do firmly believe they genuinely want the best for their students, and they can be very overwhelmed at times. So providing resources, providing thought leadership, providing networks and communities of practice is one of the strategies that we're excited about continuing to lean into.

[00:42:44] Alex Sarlin: That is exciting, and I think putting all the pieces together in a collaborative way is really key.

But a shared vision of where it's all going is also important, and I think philanthropy can be really supportive in field building around a shared vision of what we all want to happen. Katy, how about you?

[00:42:58] Katy Knight: I think that last point you made is the really key one: we have to think about what we want to happen, together.

One of the problems I see in education philanthropy, and in how we're approaching this change, is that we very rarely talk about the end result we're trying to accomplish and what our ideal state would be, or try to align on that. We're very focused on the pragmatic, we've-gotta-do-this-right-now, how-do-we-get-together, build-a-coalition, figure-it-out stuff. But we don't really give enough time and space to: if we succeed, what does that actually look like for a student? What does that actually look like for a person entering the workforce? What do we want it to be? What would a school look like if you walked into one tomorrow that you had created whole cloth out of your wildest dreams and imagination? What would it look like, and do we share the same vision for that? And if we don't, what do we have in common enough to work toward on a longer-term basis than just the short term?

I think particularly about what I've learned through these Computer Science for All efforts that I've been involved with over the course of my career. Even though I am not a computer scientist myself, I've been involved in tech and in the CS movement for over a decade, even when I worked at Google, and it's the long-term vision that really nails it when you're trying to bring someone who is not a computer scientist, and doesn't get it intuitively, into the space of understanding why these skills matter. Because you can share what it would look like for a student who has these skills to navigate the world.

[00:44:39] Alex Sarlin: Right? 

[00:44:40] Katy Knight: That's what opens people's eyes to why it matters, to put them through the pain of learning the skills.

[00:44:46] Alex Sarlin: Yes. That point really resonates with me, because I think it speaks to something we've been talking about throughout this conversation, which is maybe breaking down some of the artificial barriers between these different types of education. You mentioned how workforce learning, or lifelong learning, is considered a less-than, or the way that schools have all of these responsibilities to existing standards and policy that others don't, so you just have trouble putting the pieces together.

It feels like the different pieces of the education system have so much potential to work synergistically, but often, because there's no defined vision of where it's all going, of what we all want to happen, it can get muddy and a little bit lost. Philanthropy does a lot of amazing things to move things in the right direction.

I'm a huge fan of all the things that some of these big philanthropic foundations have done, including both of your organizations' work, of course. Yet I think there's still work to do, especially as we enter this AI era. And I don't wanna wax poetic, I feel like I've talked too much on this podcast already, but as we enter this AI era, there are so many learnings we could take from the computer science movement: in terms of equity, in terms of access, in terms of how fast it's gone, in terms of how things like the Scratch Foundation have worked. We've talked to Mitch on this podcast; we've talked to Margaret Honey, the new president of Scratch, just recently, and they've made incredible things happen. Yet Scratch is still not part of any curriculum, or often it's not baked into what schools do; it's still considered extraneous, even though it delivers incredible results and students love it. The formal systems and the informal systems move at such different speeds, and they just often don't work together.

I'm really excited about the work that we could be doing there, the royal we, everybody in this education technology space, but especially both of your foundations. So we're over time; we're getting to the end. I know we started a little late, but I wanna give each of you a chance. I feel like I've hijacked the conversation into some of the things that I care a lot about, but you both do incredibly powerful work.

What I'd love to ask both of you is: what is the most exciting thing that you see coming right now, the most exciting trend from your particular standpoints within Kapor, within Siegel Family Endowment? What's coming that people might really want to keep an eye on and might not see coming? Something that you feel like, I think this is gonna matter.

I'm not sure other people realize that yet. What would be your answer to that question? And Allison, let me start with you. 

[00:47:08] Allison Scott: I think coming out of ASU+GSV, which for the audience is a huge EdTech conference, one of the trends that gave me hope was that there was a lot more conversation about what tools we should be building and how we should be building them.

Much more than in past eras of EdTech development. And I think there's awareness on the part of entrepreneurs and some investors, and I'd like to build more awareness amongst the investor community, around: let's try to solve very specific problems facing our students in education, and let's ensure that we have lots of folks at the table who represent our students, our educators, and our broader education systems.

So I think that trend has been really promising and exciting to see, and also basing the technology development on learning science. I think you'll hear more about that, but as a person with an education background, I'm like, yes, we understand good pedagogy; let's make sure that gets incorporated.

The other piece, and I'm gonna shout out Katy here, that I'm really excited about is that it feels like we are at this interesting inflection point. We've talked about a lot of things: data science and the math wars, computer science and where we've been in the last decade.

We've talked about AI, AI literacy, digital literacy, lifelong learning. I think we are at this really interesting inflection point of defining what we want to see for the future. So the field-building point, I think, is a really strong one, and the more we've talked about it to folks all across the spectrum, the more we've gotten almost unanimous agreement that yes, we need to have a shared vision. We don't have that yet. So that's also an exciting piece to me.

[00:48:45] Alex Sarlin: At a time of very fast change, it's hard to know where things are going. But yeah, I love what you're saying there; I totally agree. Katy, how about you? What do you see coming that maybe others should be keeping an eye on?

[00:48:54] Katy Knight: Yeah, I think I'll go the other direction.

'Cause I love everything Allison said about the student-facing stuff, but what gets me really excited is what I call the unsexy tech, the backend stuff. I'm genuinely excited about the ways in which new tools are being designed and developed that serve schools as places that deserve competent, effective, and efficient technologies to solve their backend problems.

As much as everyone wants to put a new shiny thing in front of students and build something that goes in front of a kid, I'm excited about stuff that works for teachers, that treats teachers like the professionals they are. Educators are professionals; these are real jobs, not a volunteer opportunity. We can get into how we try to martyr teachers all the time, and that's not fair. They're professionals who deserve tools that help them do their jobs better. And just like we've been saying in every other sector, productivity, productivity, AI tools will free you up to do the best parts of your job.

That can be true for educators as well. If we're giving them tools that eliminate busywork, that make compliance issues easier to navigate, they can spend more time doing what they know how to do, which is teach and facilitate learning.

[00:50:14] Allison Scott: I had one more thing that Katy sparked my thinking on, because I had been thinking about it outside the realm of education, but it's absolutely relevant in education as well, and I don't think it's getting as much attention: potential displacement.

So, what do we want to see in terms of our overall employment sector? And let's just look at education: what do we want to see related to the teaching sector? Do we want to displace teachers in the interest of creating more efficient processes for education? Do we want to displace a whole variety of paraprofessionals, secretaries, administrators?

What do we actually want to do? Something we've been talking and thinking about is the trends and the conversations predicting how long it will take to displace workers, but also whether that is actually gonna happen. I think we should be paying attention to this in education as well, making some very clear stands about what we believe in and what we want to see happen.

[00:51:11] Alex Sarlin: It's an incredibly important point, and I really appreciate you bringing it up directly. It's something we nibble around the outside of a lot on this podcast. We talk to a lot of AI founders, and of course the standard line, which everybody says and believes, but which is also a little bit of a hedge, is: we want to give teachers more room to do the higher-order things, the relationship building. But if you start adding it up, there's no way there aren't gonna be some changes in the education workforce. I agree that having a vision of what that actually looks like, rather than all of us ignoring it until it suddenly happens, which is I think where we're headed, is incredibly important. I really appreciate you bringing that up directly, along with the professionalization of education. I think this could be a moment where we could truly professionalize the teaching profession.

Mm-hmm. By taking away some of the things that teachers do that should be able to be done by a machine, like attendance and putting things into learning management systems and grade books; those should not be what teachers spend their time on. If they spend time on other things, it could be a really professionalized career trajectory, which could be amazing.

And frankly, if we were to bring in training from tech companies, then, as you've both seen in the computer science movement, when teachers learn computer science principles, suddenly they have all of these additional, incredibly valuable and important skills as well. And that's also gonna be true in the AI era.

So anyway, there's a lot more to talk about. I'd love to have you both on another time. This has been so much fun. I've loved this conversation. 

[00:52:35] Allison Scott: No, this is a 

[00:52:35] Katy Knight: great, 

[00:52:36] Alex Sarlin: yeah, 

[00:52:37] Katy Knight: super. I love it. 

[00:52:39] Alex Sarlin: Well, thank you both so much for being here with us on EdTech Insiders. Katy Knight is President and Executive Director of the Siegel Family Endowment, and Dr.

Allison Scott, CEO of the Kapor Foundation, leading work at the intersection of racial justice and technology. Thank you both for being here with us on EdTech Insiders. 

[00:52:55] Allison Scott: Thank you. Thanks, Alex. 

[00:52:58] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
