Edtech Insiders
Week in Edtech 6/18/25 (Part 1): Professors Battle Cheating, Students Lose Sleep, Meta Buys Scale AI, AI Use Doubles, Gallup & McKinsey Reveal AI Trends, OpenAI Faces Backlash, Handwriting Returns, Parents Drive College Choices, and More!
Join hosts Alex Sarlin and Matt Tower as they explore the evolving edtech landscape—from rising AI use to old-school solutions for modern challenges.
✨ Episode Highlights:
[00:00:20] Banning AI boosts student engagement and brings back handwritten assessments
[00:03:23] Gallup finds AI use in the workplace has doubled in two years
[00:06:15] OpenAI’s college push sparks backlash over trust and cheating concerns
[00:09:30] Professors turn to blue books and flipped classrooms to fight AI plagiarism
[00:13:20] Meta’s $15B investment in Scale AI reshapes the AI training data market
[00:21:40] Poor sleep linked to tech use and lower student performance, says new data
[00:29:01] Study finds most students rely on parents for post-high school plans—and parents don’t know the options
➡️ Go to Part 2 to listen to the guest interviews this week
😎 Stay updated with Edtech Insiders!
- Follow our Podcast on:
- Sign up for the Edtech Insiders newsletter.
- Follow Edtech Insiders on LinkedIn!
🎉 Presenting Sponsor/s:
This season of Edtech Insiders is brought to you by Starbridge. Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.
This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.
[00:00:00] Matt Tower: It's sort of like, would you rather get great human feedback a week to three weeks after you submit the assignment, or would you rather get AI feedback in 15 seconds or five seconds? For me, and it seems I'm ahead of students here, I would rather get feedback in five seconds.
[00:00:20] Alex Sarlin: One thing I've noticed after banning basically ChatGPT for assignments is that engagement is back.
The wanting to ask questions, the wanting to learn, even coming up to me after class and having a discussion about what we talked about. They're basically saying it felt like the AI was almost cheapening the entire educational enterprise on both sides, and by removing it, they feel like there's starting to be more engagement.
Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders.
[00:01:05] Ben Kornell: Remember to subscribe to the pod, check out our newsletter, and also our event calendar.
And to go deeper, check out EdTech Insiders Plus, where you can get premium content, access to our WhatsApp channel, early access to events, and backchannel insights from Alex and Ben. Hope you enjoy today's pod.
[00:01:29] Alex Sarlin: Welcome to the Week in EdTech. We have one of our very favorite guest hosts for the podcast. He's been a friend of the pod since the very, very beginning: Matt Tower from Whiteboard Advisors. Welcome to the pod, Matt.
[00:01:44] Matt Tower: You're very kind, Alex. It's always a pleasure when y'all invite me on.
[00:01:48] Alex Sarlin: Oh, absolutely.
And we have two terrific guests at the end of this podcast as well. We talk to Ed Buckley and Marlee Strawn from Scholar Education, who are doing interesting dog-based AI bots for schools, for both teachers and students, and Yesenia Svia from Chalk Coaching, who is doing professional development and classroom observation for teachers of three- to six-year-olds.
But for now, let's jump into some of the news. So, Matt, as you know, for Ben and me, more and more of each week is about the big AI players. I do wanna talk a little bit about that today, but why don't we keep our big, high-level AI conversation a little briefer, because there's so much happening in actual ed tech and actual education this week.
Let's talk AI first. Yeah, let's do it. Okay, so in the last couple of weeks, a couple of big things have happened. One, there was an interesting Gallup report that came out that basically says AI adoption in workplace settings has doubled in the last two years, and that's across the board. You're seeing it in the daily use of AI.
Two years ago, that was 4% of people. Now we're at about 8% of people using it daily. You might think that's low if you're the type of person who uses it all the time yourself, like both of us, but considering that's the entire workforce, that's actually pretty fast adoption. We saw more frequent weekly use go from 11% to 19% of people using it a few times a week.
And we saw from 21% to 40% of people using it infrequently, but still using it. So basically, across the board, usage has doubled. Is that faster or slower than you might have expected over two years, Matt?
[00:03:23] Matt Tower: You make a good point that it's sort of hard to contextualize the entire workforce. Yes. Especially when folks like you and me are deep in the weeds and pushing ourselves to use it every day rather than being pushed to use it every day.
I think what's consistent to me is I keep seeing these numbers from AI companies, their revenue numbers, that honestly are sort of unfathomable to me as an education investor. Like, I saw it leaked that OpenAI's annual revenue grew from $5.5 billion to $10 billion in the past year, and Anthropic grew from about $1 billion to $3 billion in the past year.
And adding a casual $2 to $5 billion in annual revenue, for companies that are less than five years old, doesn't make sense to me from a commercial standpoint. But when you think about the adoption rate, again, we're talking about the entire workforce, I mean, I guess it sort of tracks.
Yeah. Which is bonkers.
[00:04:24] Alex Sarlin: It's the entire workforce. And there was an interesting McKinsey report that also gets at what the workforce is doing: they're using these commercial tools. I mean, that's the thing. Maybe they're using a Teams version or an enterprise version, but when people say, oh, I'm using AI, it's not like there are 5,000 tools they might be using.
They're using one of these two or three big providers. So I bet almost all of that increase would be attributed to either OpenAI or Anthropic or Google, with a much smaller fraction going to Perplexity or Copilot or all sorts of other things. So that's a lot of people literally doubling down on your product in just two years, and that's where the money's coming from.
And that's for sure. So speaking of OpenAI: Leah Belsky, who I've mentioned on the podcast before, was a colleague of mine at Coursera for years. She recently became the head of education go-to-market at OpenAI, and there was a big profile of her and OpenAI's push into colleges. It was a really interesting article.
This was a big Times article. She was sort of at the center of it, but they also talked to a number of professors who are using it in different ways and how everybody's wrestling with this. The thing that jumped out to me is that I started reading the comments on this one, which, yeah, exactly, you're always in trouble reading the comments on anything.
But I literally couldn't find one positive comment about AI. It was dozens and dozens of professors, students, people in college settings, just saying, they think we need this, and I hate it. It's killed my class. Everybody's cheating with it. And I was just like, my God. I mean, obviously that's not the only perspective out there, but it was a little splash of cold water to see the biggest startup in the space going into education, trying to do it.
I mean, I know these people, they're trying to do it right. They're trying to do it right, and the lack of trust is so deep. What do you make of it?
[00:06:15] Matt Tower: I think a running theme of today's episode, just knowing some of the other stuff we're gonna talk about, is going to be: well, what is the bar? And is the status quo that much better than what we're talking about here?
I think the useful example from the article is they talked about a case law book that had, I think, 250,000 cases in it, so large that it actually was bigger than the token limit that, at the time, OpenAI, Google, and Anthropic allowed in their products.
And you know, today they've expanded the context window to be bigger. And professors were frustrated that the tool made some mistakes. My response to that is, okay, if you gave the best law professor in the world a 250,000-case book to quiz them on, are they not gonna make any mistakes either?
Can we just establish that the status quo is not that great? And that's not a shot at the law professors, that's a shot at the fact that the human brain can only hold so much in its head at one time. So I go back to: what are we using the tool for, and what is the status quo for that use case? If it's better than the status quo,
I think that's pretty interesting. That doesn't make it perfect. But what frustrates me about this sort of doomism is that you're protecting a status quo that honestly isn't that great to begin with.
[00:07:37] Alex Sarlin: Yeah. No, I totally agree. Well, one thing that I think we should talk about as well is the integrity and cheating thing.
You know, anybody who listens to this podcast knows that I really think this is a tangent in the history of AI and education, this moment we're in, where the cheating just keeps coming back up, as if people's definition of AI is students using it to outsource their thinking and do work.
I get it. And I really don't want to be callous to the people who are teaching on a daily basis, whether in college or in K-12, and are saying, oh my God, this is so painful. And there were some examples in the comments here again, very specific examples. I don't wanna be callous about that, but I also don't wanna continuously go down this incredibly deep, and I personally find pretty ridiculous, rabbit hole of people defining AI as cheating.
How is it different from when we used to discuss Wikipedia as cheating 15 years ago? I just do not get it. What do you make of it? I feel like I must be missing something here, because it just drives me crazy.
[00:08:38] Matt Tower: Yeah. I mean, I have empathy for the folks having to navigate this on the ground. I genuinely believe it is hard for them. By the same token, there is a tool whose workforce acceptance has doubled in two years and probably is gonna double again. And so if your job as a university, and we can debate the job of a university, is to train students for the workforce,
this is a tool that is becoming necessary and therefore has to be a part of your plan. Again, that's if you consider your job training students for the workforce, and there are folks who would disagree with that too. So I think the two have to go part and parcel. I am more interested in the universities embracing it, the Cal State University systems and the Dukes of the world, than the folks sticking their heads in the sand.
[00:09:30] Alex Sarlin: I agree. There are gonna be these continued battles, even within those campuses that are very tech-progressive, where they're gonna say, like Cal State, we're giving it to everyone. And then there still will be instructors who say, now everyone has it, and wow, my class has fallen apart.
One of the things we saw this week was an interesting story about the rise of handwritten assessments and blue books and all of these old-school techniques to try to, you know, maintain integrity. We'll talk more about that. I wanna be more empathetic, because I can imagine it. One of these comments, again, was an instructor saying, oh, I gave my students an assignment to describe the Santa Fe school of art history,
and half the assignments came back with the same answers, with the same mistakes. Some of them were about the totally wrong thing. And, oh my God, this is a disaster. And I can't help but think: if half of your students do not care enough about this to even want to think about it, something is wrong here.
Right? And it's not the tech, it's the whole system. The tech is just exposing it. You're asking students, in the name of learning, to resist a technology that would make their whole lives that much easier. But there's obviously something missing in the actual pedagogy there, the actual learning piece, the actual why am I learning this?
Why should I look this up and actually learn about it myself, rather than just ask an AI? If you can't answer that question, then why should your students answer it any differently than you do?
[00:11:01] Matt Tower: And not to jump ahead, but I think the blue book example is a great one. Yeah, I'm fully in support of using blue books.
You do it with the understanding that if you assign, you know, a test or a problem set or whatever to be done in a blue book, not only are your students gonna look at you pretty hard, like, are you sure, Professor? You're also gonna have to grade all those blue books. So you have to be super careful.
Right? And I think when you do do that assignment, everybody involved will be bought in: yeah, this is probably pretty important if our professor is going to take not only our time but their time to engage on this topic. So I actually thought it was great. I endorse blue books.
[00:11:44] Alex Sarlin: I do too. And one of the pieces of that was bringing back the concept of a sort of flipped model, where you may be able to use any kind of tool, including AI, outside of the classroom to make sense of things, to explain things, to learn concepts, to practice even, to have AI write things for you.
But then when you get back into the classroom, if the goal is to assess the knowledge or skills that you've actually attained, it does make sense to have a sort of secure environment to do that. And a blue book is one way, a pretty old-school way, but one way to do it. That's fine. I'm okay with that too.
But you're right, I think that's a really great point. It shows buy-in from both sides. We've covered articles in the past about how students get annoyed because they're like, hey, I'm seeing my professor use AI to do all sorts of things that seem like they might be corner-cutting, and yet I'm not allowed to.
And that's an issue as well. Before we leave the AI beat, we don't have to spend a lot of time here, because this is really not specifically education-related, but I do think it's interesting just in terms of all the scuttlebutt among the big AI players. We saw just last week Meta put billions and billions of dollars, $15 billion, into bringing Scale AI into their orbit.
And one of the things that I found interesting today was an article basically about how Google was Scale's largest customer and is now ending the relationship. As you can imagine, all the other competitors used Scale for the different things they did to basically clean data and make it usable for AI.
I know you know a lot about this. I'm curious what you make of this move and how it's gonna shift the landscape for these big players.
[00:13:20] Matt Tower: Yeah, so I think there are two threads to pick at: one we can do quickly, and the other maybe we can engage on a little bit. So the first: it's probably a little under a $15 billion deal for 49% of Scale AI,
and the Scale AI CEO has to come over and work for Meta. Those are the broad terms of the deal. It's illustrative of the fact that you can't quite acquire things the way you used to be able to when you're as big as Meta, so you do these sort of hacky workarounds, and there were a few examples of this last year.
Microsoft bought Mustafa Suleyman's AI startup, and there have been a couple of these, what people call acqui-hires. So it's sort of interesting for the M&A environment, what you're allowed to do in that sense. I think more relevant for this conversation is that we're running up against the limits of data on the internet.
And so companies like Scale AI, and from what I understand they are the market leader, are basically vast armies of data labelers and collectors. I think they have 200,000 contractors on staff around the world. It's sort of funny to think that we need that type of service to undergird all these models.
What's interesting to me is that Meta has been very public about its stance as an open-source model provider. So to me there was less of a competitive concern, I guess. If Scale was providing you your secret-sauce data, there's some competitive anxiety. But it surprised me that all these providers are leaving now that Scale is part of Meta.
If it had been OpenAI or Anthropic or one of the proprietary model providers, it would've resonated a little more strongly. So I'm curious if maybe those announcements of, oh, we're leaving, are really just the first volley of renegotiation talks. Again, I don't know anything about anything, but that's my read, just given Meta's stance as an open-source provider.
[00:15:32] Alex Sarlin: I'm sure Scale AI also has competitors, and it's possible that's gonna be the move, this sort of downstream shift, if Scale AI is seen as contained within one of the ecosystems, even if it is one that has tried to be very dedicated to open source with their Llama model for a long time.
Some of the coverage of this has seemed to say that there may be a move away from that, and it's hard to know. I've been reading some interesting histories about all the folks involved in this and how they all built up their companies. Facebook was always working with Yann LeCun, one of the original AI masterminds, who from the beginning always kept a foot in academia.
He doesn't like LLMs. Exactly. Yeah, that too. And then Alexandr Wang, the founder of Scale AI, was Sam Altman's roommate. I mean, this is a small group of people.
[00:16:20] Matt Tower: Oh, I didn't know that. Yeah. Interesting.
[00:16:21] Alex Sarlin: Yeah. This is a small group of people who all sort of knew each other. A few of these folks, and this is also true of the Mustafa Suleyman acquisition you were talking about, were at the heart of this, and now they're all jockeying for position and trying to figure it out.
Elon Musk was an early investor in both OpenAI and DeepMind. There are just really, really weird, incestuous relationships here. Yeah. And I think it's gonna be interesting to see where this goes. But in terms of your comment about the human tagging, I think there's something that maybe people underappreciate about AI, which is that AIs are these mathematical models, very sophisticated mathematical models, but mathematical models that basically learn from people and
labeled data. That huge contractor base that Scale has put together, they're basically selling the ability to make data sets in a format that AI can learn from, right? To make training data, and training data is the fuel, the coin of the realm. I mixed metaphors there. And as you say, we're hitting the limits of the internet, which is a pretty crazy thing to say, and there's synthetic data, but it makes sense that data becomes a differentiator.
And this is true in education too, just to tie it back. This is totally true in education, and something that I think we underappreciate, even those of us who talk about education and AI all the time, is education data. There are some amazing large data sets out there, open data sets, and there are a lot of closed, proprietary data sets in education, tons.
And sometimes there's this sense that educational data feels icky, right? The idea of, oh, student data, how could we use it for this, how could we use it for that? But if we want educational AI to be as powerful as it can be, there needs to be some strategy for how to have incredibly good educational data.
And I don't know how this is all gonna come together, but it may shift the landscape a little bit in terms of the other folks.
[00:18:19] Matt Tower: I think maybe two points and then we can move on. So the first is: Google's Waymo unit also produced a report this week that talked about their self-driving cars in various metro areas around the US. They talked about how they have scaled their training data set.
I think they've driven, at this point, north of 10 billion miles, and they continue to see the effect of the scaling of data. A lot of people over the years have argued that at some point we're gonna start seeing diminishing returns on volume of data against a training model, and Waymo continues to see the effect of scaled data sets.
So I think that's interesting: maybe the largest real-world data set that's being translated into AI continues to see the effect of scaling. And I think that might be reflective of where the other models are going.
[00:19:08] Alex Sarlin: And given the complexity of the human brain and how big the differences are between brains,
I think data sets for therapy, data sets for education, data sets, not to be crass, for targeted advertising, for psychology, that is the last big frontier of giant data. I don't know which is bigger: the data set of cars driving everywhere and taking pictures of everything, or the data set of trying to mine how our brains work and what every different kind of person thinks about every different kind of thing.
It seems like the latter still might be bigger.
[00:19:43] Matt Tower: I think it is too. A little bit of a, I don't know what to call it, but to be consistent with Yann LeCun's stance, that it's physics-based models that will ultimately produce superintelligence, versus these probabilistic, digital, database-driven models:
there is a consistency there between that and the takeaways I found from Waymo's unit. And I also think there's sort of an irony in education: two years ago, a bunch of the big education players announced their own proprietary models, and you don't see anything related to proprietary models in education now, because we've recognized, I think, at this point, that the cost of building one is just beyond what any education company can comprehend.
[00:20:28] Alex Sarlin: I remember you saying that on this podcast two years ago, so you've been proven right there. Yeah.
There are charts about how much money it costs to do each new iteration of GPT-4 or Gemini 2.5. The money is massive.
[00:20:43] Matt Tower: Yeah. And so I think the takeaway is that education remains, unfortunately, downstream of all the big model providers, which makes it extremely relevant for folks like you and Ben to be tracking them.
But what the effect will be sort of remains to be seen, which is a little bit scary.
[00:21:00] Alex Sarlin: Yeah. But when you see things like AI going into colleges in a big way, or, you know, Anthropic announced Claude for Education just a couple months ago now, the fact that they're even addressing it in that direct way means that they have to begin to think about the made-for-education, tailored-for-education use case and what it actually would require.
And it requires a lot beyond general models. I think they're starting to realize that, and I think that's good for the space. You found a really interesting article this week, as a shift to K-12, about the effect of sleep and its relationship to tech in K-12. Jump in and tell us about it. I'd love to hear your take on this.
[00:21:40] Matt Tower: Yeah. So the core concept is something that I think people intellectually agree with but in practice forget, which is that students, and probably workers too, although I have less data there, perform better when they get more sleep. And there has been an unfortunate wave of students getting less sleep, often as a result of being on their devices deep into the night, or even just close to their bedtime.
So this author, Tim Daley, wrote about whether we can track the deleterious effects of lack of sleep among students or children who have screens against NAEP scores, and the answer is directional. There are a bunch of variables, and some smart statistics person could probably object to this, but I think from a first-principles point of view it holds: yeah, the increase in screen time, especially late at night,
affects our ability to perform when it affects our sleep. So I thought that was an interesting contrast to this world we're moving to, where, you know, devices are taking on more of the core learning function. And I think it highlights the need for humans, teachers, parents, et cetera, to be on top of their game on the holistic side, saying, hey, in order to be healthy humans, we need to put our screens down before bedtime, and we need to get eight hours of sleep, and we need to interact with each other throughout the school day, and not just have a screen in front of our face.
So to me it was a nice break from the AI and tech stuff, to say, yeah, there's a really important holistic side to the educational experience that is important to keep in mind.
[00:23:34] Alex Sarlin: I also like it as a fresh, additional take on the movement we've begun to see around the country over the last couple of years of schools being much more proactive in actually shutting down cell phone usage, or being really vocal with their communities and with their parents about limiting phone usage.
You know, the data has been there. Jonathan Haidt's book, which is cited in this article too, and Jean Twenge: there's been a growing voice saying, hey, it's not just that these phones that kids are glued to are annoying. It's actually really messing up their lives in really serious ways, including their psychology, including their educational outcomes.
And this, I think, ties it very directly. It basically says that across the board, NAEP scores have been down, with a few exceptions in the Deep South, Louisiana and Alabama. And they're even worse for students who are already scoring lower, right? The top-scoring students are doing all right, but the lowest students have gone down the drain.
You start to look at the timing, you start to look at this, and I think there's a pretty good case to be made that screen time and sleep are a big part of it. And it's obviously related to ed tech in that, you know, this is not an ed tech problem. It's sort of a tech problem, but it's an
[00:24:43] Matt Tower: education problem.
[00:24:44] Alex Sarlin: It's an education problem. Yeah. It's an education problem, not a
[00:24:47] Matt Tower: tech problem.
[00:24:49] Alex Sarlin: Right. So
[00:24:50] Matt Tower: To shout out Ben's former employer: Common Sense found that 40% of children under two have their own devices, which is sort of consistent with the socioeconomic problem you were citing. For lack of a better way to frame it:
people like you and me, who have a lot of privilege in life, have the time and the access and resources to be thoughtful about this and can limit our children's device time. And frankly, I still watch YouTube with my kid. It is what it is. There are a lot of parents who don't have that privilege, and it's really hard to reconcile that.
And again, it's a community problem, it's a parent problem, it's a teacher problem. It's very holistic. That has to be part and parcel of all of the tech discussion that we have with regard to schools.
[00:25:36] Alex Sarlin: One recommendation in this article that is directly relevant to ed tech is device management: schools could make devices have time limits.
They could shut off after 10:00 PM, basically, even if a student is in the middle of their homework. You say, you know what, we're gonna shut you off. We know you're in the middle of homework; the teacher will get a note about this. But sleep is the most important thing. There are some proactive moves that schools and ed tech companies could offer.
I remember when the texting-and-driving stuff started coming out, and it was like, oh, you're much more likely to crash if you're texting. And I remember being like, well, why don't the device providers just say, if the phone's moving over 20 miles per hour, you can't text on it? Can't you get involved in that?
And they did start to introduce things. I mean, we all remember the moment the iPhone first asked you to confirm, I'm not driving. It starts to notify you. We could have that in educational devices in all sorts of ways.
[00:26:30] Matt Tower: and I think there are providers working on that. Not to like name drop too heavily, but I know the folks at SECURELY are thinking about this problem.
And Tammy, their CEO made a good argument to me of like the past decade was this whole like bring your own device movement. And that was good for access, especially when we didn't have one-to-one devices, both in school and in the workforce, et cetera. But like the pendulum is now swinging back towards managed devices specifically because you can manage access to, to certain things.
And it's easier to do that on a school owned device rather than a family owned device.
[00:27:08] Alex Sarlin: You can imagine an ed tech company doing that for parents as well, for home devices, right? Because you're starting to install that kind of software that helps set limits. It exists, I mean, they're out there, but I think there's now a stronger case for them than ever.
So a couple of other things I wanted to talk about. In K-12, there was a report that came out this week that caught my eye. It was a report commissioned by JFF and the Walton Family Foundation and a bunch of different folks, basically trying to assess how students feel about their post-graduation options.
And there was a focus on, hey, college and a job are sort of the default post-graduation options that most people see, but there are increasingly more and more alternative pathways: credentials, internships, apprenticeships, short-form programs. There's a lot of different things there. And this study was to see how students were getting their information and what they think about this.
And there were some interesting findings in there for me. Among recent high school graduates, fewer than half said that their school prepared them for any option other than college or a job. That may not be so surprising, given that awareness of these alternatives is pretty low. Almost 10% said their high school didn't prepare them for any option, which is a little sad.
But the thing that stood out for me is that students really get their biggest source of information about post-graduation options from parents. 90% of students said they got it from parents, and meanwhile, many of the parents said they know almost nothing about anything other than college or jobs.
So basically there's an information pipeline issue here, because parents don't know there's anything other than college or a job, students don't know, and schools have less to do with it than you might think, which I'm surprised about. Only 54% said they relied on teachers or counselors; less than half said they relied on school counselors.
What did you make of this study? I thought it was an interesting insight into how information spreads in the high school community.
[00:29:01] Matt Tower: Yeah. To me it felt very consistent with research in a couple of different tracks. So I think it was helpful and additive to the overall corpus of knowledge,
but again, it was almost nice that it was consistent. So I think about the Raj Chetty work, which talks about how the best path to economic mobility is to be around people who are in different socioeconomic classes. I think this is consistent with that, right? Whatever your parents have been exposed to,
that's where you're gonna look. The more parents and the more socioeconomic diversity you are around, the more options you will understand to be out there. So I think it broadly holds on that. The other is, and this is the thing that has sort of baffled me forever: students are most likely to go to a college or post-secondary institution within 50 miles of their house.
And that holds true across the US. And again, for folks who have had a lot of privilege in their life, you're like, there are lots of good schools all over the US, there are tons of options, why would you do that? And it's because it's close to home. It's what your family knows.
It's what your friends know. And so I think, again, when you don't know where to look, you talk to the folks around you. So it's consistent with that as well. And then finally there's what we just talked about with sleep, too: the job of a school is not purely about education, right?
It's not just teaching you algebra; it's this much more holistic teaching you how to exist in society. And I think the folks who make it just about school are, to me, making a sort of disingenuous argument. That's not why we send our kids to school. It's to learn how to be effective members of society.
And that includes what you're gonna do for work, it includes sleep, it includes being a helpful member of your community, et cetera. So to me, this holds across all three of those tracks, in a positive way.
[00:31:06] Alex Sarlin: It's a really interesting point. I mean, where students get their information from, especially for incredibly important decisions, like what are you going to do with your life after you graduate from high school?
It's kind of a big decision. It really matters. And it's interesting, you know, when they talk about 90% of students relying on their parents. If you then break that down and ask, well, what percentage of their parents got a degree? Not that there's a one-to-one where parents who got a degree are gonna push their kids to get a degree, and parents who got a job are gonna push
their kids to get a job; that's not necessarily exactly how it works. There's a lot of political stuff in there. There's a lot of aspirational things in there, people wanting their children to be the first generation in college. So there's a lot of cross-cutting issues. But it is interesting to think about: if students are listening to parents, parents have their own experience, which does tend to be these more traditional pathways, and then, based on the numbers, you could maybe say somewhere between 30 and 40% of them went to college.
Then you're getting a very specific type of information from home. And then the question is, what information could and should be coming from the school, to your point about what the school's role is. Obviously the people who commissioned this study, the JFFs of the world, think it's a lot.
A lot of these foundations are saying, hey, look, there are suddenly all of these other interesting pathways. Some of them are short-form credentials; some of them go through apprenticeships. They asked the students in this study how many of them had internships or apprenticeships.
It was very, very low. And I think it just shows that nobody owns this, which is actually a little bit scary to think about, right? No one institution owns the guidance of students toward what they're gonna do in life. And then they come out and say they don't feel prepared.
And the hiring managers really don't feel they're prepared: more than half of hiring managers say that recent graduates are unprepared for the workforce, especially because of excessive phone use, to circle back there. So you're in a world where, if you're a teenager right now, who should you be listening to?
Your parents have a specific viewpoint. The world is changing around all of you. The colleges are gonna try to recruit you because they need you, especially if you're a domestic student. So who do you trust?
[00:33:18] Matt Tower: And it's a serious question. Yeah. Not to be crass about it, but this is a place where investors have spent a healthy amount of money in the past couple of years,
I think recognizing this sort of hot-potato problem: high schools are like, well, why do I care about your life after high school? I'm just trying to get you to graduate; I'm incentivized to get you to graduate high school. And then colleges are like, I don't know, I'm incentivized to get you in the front door, but what's my role in the workforce?
I dunno. Right. So I think, for better or for worse, there are companies that are starting to step up and meet that need. Handshake has been around for a while; they're trying to fill this gap at the higher ed level. SchooLinks is trying to do the same at the high school level and introduce more pathways.
You have
[00:34:05] Alex Sarlin: Pearson just bought, and nonprofits
[00:34:07] Matt Tower: like JFF.
[00:34:09] Alex Sarlin: Yeah, sorry, Pearson, a company that people know, exactly, as
[00:34:11] Matt Tower: more content than it is pathways. And Pearson, Pearson people would argue: content? We're establishing pathways. So I think it is an increasing focus of investment, and my hope is that if we do this again in two or three years, there will be some more brand recognition in this space, and students will feel a little bit less overwhelmed by the potential options.
[00:34:35] Alex Sarlin: Yeah, that's a great point. There are definitely ed tech companies, pretty large ones in many cases, that are trying to inject some of this guidance, these career pathways, and this career exploration into the high school environment or the college environment, to your point. But
I would argue, and nothing against any of these companies, that if teens are not even trusting their school counselors, right, if less than half are trusting their school counselors, how many are going to make their life decisions based on career pathway software? It's hard to know if that's really gonna translate to real outcomes. I think it could, but it probably needs some surrounding supports. I don't know that students are these balloons blowing in every direction, where if you happen to click on something that says, hey, be a data scientist, you say, great,
now my life is set. I'm not sure that is exactly how it
[00:35:24] Matt Tower: really works. Yeah, I think that's the correct rebuttal. I will say, you talk to high schoolers and you get some pretty bizarre stories about what inspired them to go to, you know, place X.
That is true. But I think that's the correct rebuttal, and the proof will be in the pudding. So I'm not gonna sit here and say, oh, but Provider X does this.
[00:35:52] Alex Sarlin: I think we'll see. You do get weird stories, but I would argue that a lot of those weird stories often involve another human being:
my cousin said this thing, right? Or whatever. And you're like, oh, that's a weird story, but it still involves a person. But look, nothing against any of these companies. I think they're doing incredibly important work, and they're picking up a problem that is obviously massive, this total lack of guidance.
I just wonder what the combination could be of human supports, parents, for example, the people who we know students trust, or other family members, or teachers to a lesser extent, combined with information about career pathways that actually might be useful. One other study that stood out, and we don't have to cover this much, but I do recommend our listeners check it out, because it's really interesting.
WGU Labs, that's Western Governors University Labs, which is sort of an accelerator within WGU, they do really interesting work, put out a study they call How Students Really Feel About AI. It basically asked students how they feel about AI from within the WGU landscape. They're a college, so this is a little bit of a higher ed transition, but
they found some interesting things. They found that more than 60% of students are comfortable with AI using their data to personalize their learning. Interesting stat. I don't know what number I would've predicted for that, but I think that's higher than I would've predicted. Just a third
are interested at this moment in AI tools for social or emotional guidance. That potentially flies in the face of some other data we've seen, that emotional support is becoming an increasingly common use. It sort of depends
[00:37:18] Matt Tower: on the topic and their comfort level. Talking to humans versus talking to a bot, I think, is all over the map.
[00:37:24] Alex Sarlin: Exactly. That's a great point. And then one that I think is very relevant to a lot of the AI startups right now. They asked about how students feel about AI-powered feedback and how they feel about AI-powered evaluation: do you want feedback from an AI, or do you want an AI to actually be able to grade you?
And they saw a big gap there, which is interesting. It's sort of logical, but it's interesting. It said almost 60% of students were open to AI feedback. That's actually lower than I would've thought, because AI feedback doesn't hurt you at all: 58%. And only 35%, about one in three, trusted AI to actually give them a grade.
That was interesting to me. How do those numbers square with where you might've thought the student body was right now?
[00:38:08] Matt Tower: I think the delta between feedback and grading makes sense. Yeah. I would've thought they were both higher, but with a similar delta. You know, grades are high-stakes, right?
And I think the higher the stakes, the more conservative people tend to be. So it tracks that people would be more nervous about getting a grade from AI than feedback. The thing I will say until I'm blue in the face, at least with the current crop of AI models, is that they are good at repetition, responsiveness, and facts.
And facts, I think, really trips people up, but I think it's true. I go back to our case book example: AI can hold more in its head than a human can hold in theirs, even if it does make mistakes. But I think the repetition, and particularly the responsiveness, is relevant here. It's sort of, would you rather get great human feedback a week to three weeks after you submit the assignment, or would you rather get AI feedback in 15 seconds or five seconds?
And for me, and it seems like I'm ahead of students here, I would rather get feedback in five seconds.
[00:39:19] Alex Sarlin: We had Jeff Maggioncalda from Coursera on here, and he said that students vastly preferred the instant AI feedback for their assignments in courses.
[00:39:27] Matt Tower: Right. So I think it might just be that
folks haven't experienced it as much. And I got to see some of this even going back before the LLM wave, with the work we did at SEI: we were designing bots to provide feedback at two in the morning, when the student was doing their homework, rather than waiting for the teacher to respond.
So to me, I take that ten times out of ten. But I also fully understand that when it's giving me a grade, I'm gonna have a lot more anxiety about it potentially getting it wrong.
[00:39:57] Alex Sarlin: I think that's really well said. The delta of high stakes versus feedback makes sense. But I agree that AI feedback is cheap, it's instant, it's free,
and it tends to be extremely high quality. But I think maybe a lot of people don't realize that last part. Yeah.
[00:40:14] Matt Tower: I think you have to experience it.
[00:40:15] Alex Sarlin: Yeah. I mean, to be fair, though, you also have what we were talking about earlier, this blue book article where professors are complaining that students are using AI to do everything for them.
So if students are using AI to do everything for them, then presumably they may be getting feedback as one of the things. But maybe not. Maybe they think of it as, I asked AI to do the assignment and I did it, and now I'm handing it in, versus, I'm asking it to help me think about the assignment and give me feedback on my draft and all that stuff.
Maybe that's really the schism here: students might have a sort of odd mental model of what AI is for, and they're not thinking about it as a partner or a feedback provider. They may be thinking about it more as a solver, like a PhotoMath, you know?
[00:41:00] Matt Tower: Yeah. And I think that gets back to your earlier point about establishing guardrails within the education space specifically, and understanding that the models probably do need to behave
somewhat differently for students than they would in other contexts.
[00:41:15] Alex Sarlin: Totally differently. I mean, that's what Khanmigo has been working on. That's exactly what a lot of these AI tutoring and companion companies are working on there. So yeah. A couple of other higher ed stories I think we should cover.
I know we're coming up on the hour now. Oh my God, this has gone fast. This is a super fast one, but I'm curious about your take on it. We saw this week basically 11 members of the 12-person Fulbright board resign, out of political protest, basically because the administration was overriding their choices for Fulbright scholars. Politics has been sneaking into this podcast over the last couple of months because it's sneaking into everything, especially higher ed, in every way.
Yeah. What do you make of that? Now that it's an empty board, which means it can be refilled by the administration, I believe, is this just yet another domino to fall in this sort of aggressive, reactionary take-back of higher ed? What do you make of it?
[00:42:11] Matt Tower: Yeah, I mean, it bums me out. I think the Fulbright program has been sort of a shining example of helping Americans get international experience in a variety of different contexts for doing research, for teaching, et cetera.
I know a lot of folks from my class at Amherst did them. It gives people a perspective that is really hard to get from any other type of program as a post-grad. So it bums me out. I think it's an unfortunate choice, too, in that the choice to resign means that those folks will not
have any say in the go-forward plan, right? And so that means something too: they sort of took their ball and went home, which I can get in some contexts. The constraints may have just been such that they felt like they could no longer make an impact, and that's fine. But it does make me sad for the program,
and it seems more likely to wither on the vine now than it did five days ago.
[00:43:11] Alex Sarlin: Yes. And there's sort of a xenophobia injected into this. There were like 1,200 foreign Fulbright recipients who had an additional review process. This is not really ed tech, so I don't think we have to spend a whole lot of time here, but it just jumped out to me as another canary in the coal mine of what the next few years might feel like in higher ed, which is just quite intense.
[00:43:36] Matt Tower: It's reflective of the problem with international students. Right, exactly. Which is a massive, massive business. Again, to be a little bit crass about it, there are companies all over the world that make millions of dollars helping get foreign students into the US, and they're also, you know, facing a complicated reality right now.
So I think the two are related, even if Fulbright was not making money on placing students.
[00:44:00] Alex Sarlin: Yeah, that's a really good point. And we're already seeing some fallout for a number of different types of education and ed tech organizations, and of course colleges and graduate programs, that are trying to adjust to a new reality where their student base is just totally different than it was.
You know, we talked a little bit about this blue book idea. There was an interesting article in Inside Higher Ed this week, basically about this return to in-person handwritten work and complaints about ChatGPT and the AI tools. One thing stood out to me in this article that
I don't quite know how to feel about, because it's essentially almost a Luddite proposal. So my instinct is to say, hmm, but I think there was something interesting to it. Some of the professors cited in this article who are going back to handwritten work are saying that they're seeing these sort of add-on effects.
You mentioned this idea of, hey, it shows that the professor is dedicated to the work, that they're gonna actually read my work, they're actually gonna go through it in a much more dedicated way. That's an interesting byproduct. But they were also saying, and here's a quote:
one thing I've noticed after banning basically ChatGPT for assignments is that engagement is back. The wanting to ask questions, the wanting to learn, even coming up to me after class and having a discussion about what we talked about. They're basically saying it felt like the AI was almost cheapening the entire educational enterprise
on both sides, and by removing it, they feel like there's starting to be more engagement. And that's interesting to me. I wonder if that's something that is gonna become somewhat of a trend, just like we're starting to move cell phones out of high schools. What do you make of that?
[00:45:39] Matt Tower: Well, I think there are sort of two sides of the same coin, right? And this is another soapbox I've been on: most of the stuff that I have AI do is tasks that I don't think are particularly fun or important. I think the best example is, oh no, AI is gonna get rid of paralegals.
And it's like, cool. I watch Suits; being a paralegal looks like it sucks. It just looks like a terrible job. I work long hours, I get treated terribly by my superiors, and it doesn't really help me. I guess it sort of helps me become a lawyer, like I see more stuff, but really I'm working so hard that I'm skimming most of the stuff I'm doing anyway.
It's not a fun job. I'm okay if that job doesn't exist in 10 years. I don't know how controversial that is to say. And so, similarly, the homework assignments that are rote enough for AI to do are probably not that fun. They might be important, they might be skill-building, in which case, bring on the blue books.
But I just want the professor or the instructor to have thought about whether this is important or not. That's what I care about. And, you know, if 90% of your course has to be done via blue books, I'm okay with that. I just want you to have thought about why.
[00:46:58] Alex Sarlin: It reminds me a little bit of the British education system, the higher ed system, which I've always thought had a really interesting way of looking at it.
[00:47:05] Matt Tower: Yeah, yeah, yeah. This is a really good example.
[00:47:06] Alex Sarlin: Yeah, it's a lot like that, right? It's like, we're gonna let you go and educate yourself with some really serious guidance, and then you come back and you show us that you've done the reading, you did something. And I feel like that's sort of where this is
heading, in some ways. And I mean, there's this quote: it's kind of reintroducing them to what the classroom was a decade ago. It was a space for learning, but also a space for connecting. I definitely agree with that. We definitely don't want AI to start making distance between students and professors, or between students and each other. But at the same time, maybe the role for AI, like you're saying, is making it more interesting and relevant.
It's helping with the conceptual teaching that students are doing on their own, or the practice and review and feedback. Maybe by splitting the assessment and the learning a little more, like the British system does, that's actually a place where AI can really, you know, sing, because it can be on the learning side.
Because it does, or at least can, bastardize the assessment side pretty quickly.
[00:48:09] Matt Tower: I think that's a really wonderful example, and I'd love to borrow it. To me, that does feel like a pretty bedrock use case for how things could look as we move into the future.
[00:48:23] Alex Sarlin: Yeah. And then, you know, on the flip side, you're also seeing certain universities, you mentioned Duke before, and Cal State, which are adopting AI in big ways. There was an interesting report about certain universities: Penn State is launching a big agentic AI assistant. There's basically an interesting report about how some universities are using agents to sort of combine pieces.
Georgia State and U Michigan, which we've talked about on the podcast, are doing a lot with AI, basically building virtual teaching assistants; there are agents doing admissions. I think that's really interesting, the idea of universities using agents to put the very many complicated pieces of a university together for students.
That sounds very promising to me. What do you make of that?
[00:49:04] Matt Tower: Yeah, I mean, I hope they are finding the boring-but-important tasks and just getting rid of them. Again, a very personal example: I go through about 200 emails a week to source links and funding announcements.
Honestly, it's a pretty dumb task. It's pretty boring, it's finding a needle in a haystack, and I will be so excited when I can outsource it to AI reliably. I am counting down the days. I work on it once a quarter to see if I can outsource it, and we haven't gotten there quite yet, but I cannot wait for that day.
And my hope is the colleges are finding examples like that all over the place.
[00:49:47] Alex Sarlin: That's a great example. And what I love about it, whether it's the paralegal example or the go-through-all-the-emails-and-pull-out-the-key-stuff example, is what it ideally does. The paralegal stuff is separate,
because there's an actual job function there as an entry-level thing, and that's a whole other can of worms. But the idea of being able to outsource the initial compiling of information, some of the structuring of information: all that should do, in an ideal world, is allow the human to enter at a higher floor of the elevator, right?
You don't have to spend all this time combing through stuff. You already have amazing material with you, and then you can actually synthesize it, make sense of it, use it, apply it to something you actually wanna accomplish. That's the goal. And I think that's what Sam Altman and Dario Amodei are picturing, in a lot of ways, when they're picturing AI productivity.
And I feel like it sometimes gets lost in the mix, this idea of outsourcing the boring things, the things that you don't want to do, that people don't enjoy doing, that humans are not actually that good at, like being detail-oriented for an infinitely long time. Exactly.
[00:50:51] Matt Tower: Yeah, like I said, I'm counting down the days.
There's a bunch of small things like that that I'm sure you have, I have, and students and faculty and administrators have.
[00:51:02] Alex Sarlin: No question. Two more quick things, and we're over time, so these are big topics, but let's do 'em in a lightning round. One is, you and I have a history here, and you've always been really ahead of the game on talking about Byju's.
We have not covered Byju's in quite a while on this podcast, for obvious reasons, but this week we saw two headlines, as sad as you might expect, about Byju's. One is that the Byju's app was removed from the Google Play Store over some legal issues; Byju's has multiple apps, so that was the main Byju's app.
The other is about a US court approving the sale of some of the Byju's assets. And this is your territory: you're an M&A guy and a longtime Byju's observer. So, you know, we saw CodeHS buy Tynker. We saw Epic being sold to China's TAL Education Group. I dunno what to make of that.
Tell us about these stories. Catch us up on all things Byju's.
[00:51:53] Matt Tower: Yeah, I mean, I think the long and short of it is Byju's has moved into a new phase of the company, which is trying to recover value from the various assets so that it can sort of go softly into the night. And there was actually some value.
Epic sold for almost a hundred million dollars. I think that shows there were some interesting assets in there; they were just mismanaged pretty horribly once they came under the Byju's tent. So I don't think there's a ton of meat on the bone other than, you know, we went through the sort of ugly public divorce of the founder from the company, and now they've got new leadership in place that is trying to sell off the assets in a way that satisfies investors.
I think this was always the inevitable outcome.
[00:52:42] Alex Sarlin: Yeah. I mean, you see numbers like, you know, Byju's bought Tynker in 2021. That's not that long ago, it was pandemic days, for $200 million, and it sold for $2.2 million. You're talking about 99% of the value disappearing. But you know, you've always been ahead of this; you've always been sort of the ultimate Byju's observer.
[00:53:07] Matt Tower: Yeah, I think it's complicated, right, once you go into this asset-sale phase. My guess is Tynker was starved for resources almost as soon as it was acquired, because we know that things started going wrong in sort of late '21, early '22. So my guess is they got in the tent, didn't have much to work with, and sort of declined from there.
My understanding is most of that $2 million is physical goods inventory. So I was happy that Epic got some money, and hopefully enough money that TAL will keep it going. TAL buying them is sort of interesting: as a formerly high-flying company that was hamstrung by its own regulatory regime in China, they have, from what I understand,
recovered decently. I think they're sort of figuring out who they wanna be at this point. They have a bunch of users in China, which means something, I don't know exactly what. They're still restricted from doing much on the tutoring side, again due to regulatory concerns. Last I heard, they were selling literal steaks on the internet as a way of sort of pseudo-teaching English.
But yeah, I think it's sad to see a company that was such a high flyer being sold for parts. But my hope is everybody involved can then move on with their lives.
[00:54:29] Alex Sarlin: It's also, I think, on the Epic side, a little bit sad to see. I mean, Epic is an amazing product and always was an amazing product.
It's been fantastic, and the idea that it is now going to be contained entirely within sort of English language learning in China, inside a Chinese edtech company. I mean, it's fine, I get it, but what a weird space we are in where that can happen. I think it's a little bit of a dismal outcome.
[00:54:56] Matt Tower: It's hard to know what's gonna happen.
[00:54:57] Alex Sarlin: It's hard to know. Maybe they'll do something different, maybe there's something international they'll do with it, I dunno. It's just interesting. The last thing I wanted to talk to you about is a sort of murmur out there that has always confused the heck outta me.
It's something a little bit in the air right now. We've been in the middle of a higher education bill going through the two houses of Congress, and it now looks like there's going to be some movement on this concept of workforce Pell, which basically allows Pell Grants, federal financial aid, to go to short-form programs that are workforce oriented.
And this is something that both parties have been wrestling with for a long time, but it looks like it's in both versions of the bill, so it seems likely it'll pass, and when it does, we can revisit it. But you follow this space really closely.
I follow it pretty closely too, and I think it's an incredibly important space. It relates to what we talked about earlier about credentials and different postgraduation pathways. These Pell grants can go to for-profit providers, to unaccredited providers, to providers that have only been around a year.
They have to have been around at least a year, but only a year. But they do have some limitations and qualifiers, which are around job placement and graduation rates and tuition fees. There are basically these sort of economic qualifiers that say, hey, you have to prove you are showing some ROI before you can accept Pell money.
But you know, this could be either a disastrous and really problematic policy that drains a lot of money out of the government, or something that actually opens things up and takes away that sort of monopoly on mobility that two- and four-year universities have had. What do you make of it?
[00:56:45] Matt Tower: I think, going back to our earlier conversation about what the status quo is, I would posit that the status quo of post-secondary funding today is not great. Forty percent of students who enter the post-secondary system don't graduate. We've got 43-and-change million people who've done some post-secondary and gotten nothing from it.
So my perspective is we should be experimenting a lot to figure out a better system. And yes, there will be some waste and probably some fraud associated with that experimentation, but fundamentally, the system does not work for the majority of people. Sixty percent of the people who enter the system do not end up with anything of value, across whatever metric you want to use to assign value.
Right. And that, to me, is a failure. I would also say workforce Pell has enjoyed broad bipartisan support for years and years and years, so I'm so happy that we're finally gonna experiment with it. Importantly, and this is where the headlines drive me nuts, we're not just handing money out like candy, right?
There are pretty strict requirements. We can debate the requirements, but I think it's gonna be way easier to debate what works and what doesn't in the wild. We've made a requirement around income, and we've made a requirement that you and I can't just start a school and get Pell; we have to actually be out in the wild and produce outcomes before we're eligible. And accreditation,
I would debate what the word accreditation even means. To me, it means that you have access to capital and access to time, and you're able to conform to a set of rules we established about a hundred years ago. I don't know that that's actually that meaningful today. So again, I will be the first to tell you there will be some waste and some fraud associated with this
expansion of Pell eligibility. But I would harken back to the fact that we have waste and fraud today. We had all those bots enrolling in California community colleges; there was something like $30 million in Pell money that went to bots this year, before any of this happened. So the status quo is not great. I just want to be working towards something that is better than today.
[00:59:03] Alex Sarlin: Yes, and maybe it goes without saying, but let's say it: the edtech industry has a really interesting role to play in whether this goes well or badly. We're talking about job placement and graduation rates of at least 70%. Edtech companies could be creating their own programs, and they could also be fueling or providing support for a whole variety of new providers that are gonna be out there. Those providers need communication platforms,
they need assessment platforms, and they certainly need job placement platforms. There's a lot of opportunity here. And as with anything where huge amounts of money flow from the government into an industry, there will be problems. We lived through some very strange things with for-profit universities in the past, which are now mostly in the rear-view mirror, but I think there's been a fear for a long time of going back to a bit of a free-for-all Wild West, and that could be negative.
Hopefully these eligibility requirements are meaningful, and hopefully they're continuous; it can't be that somebody starts a program, hits these 70% job placement and graduation rates for a year, and then suddenly opens everything up and loses all quality. There has to be some real enforcement, which is not something this administration is necessarily gonna be on top of, but we'll see.
We'll see. I'm cautiously optimistic, but quite cautiously.
[01:00:23] Matt Tower: Yeah. And I think you're right, the for-profits of the late aughts took advantage of a very open regulatory regime. I think at this point we've generally established a set of rules that incentivizes better behavior.
Again, I think there's waste and fraud in the system today, but I think the vast majority of actors are trying to do the right thing, and we should be incentivizing experimentation with an eye towards a better system, saying, yeah, the rules will change over time. They will probably get more complex: not hard to follow, but important to follow.
Yep. And I think that's okay. That's part of learning. That is part of the process.
[01:01:06] Alex Sarlin: It's also gonna be interesting because, if these requirements work the way they're intended to, you're trying to hit job placement rates of 70% year over year as the entry-level workforce continues to change and disappear rapidly.
Yeah. If the paralegal world disappears, if a lot of entry-level work changes quickly or disappears, and you still need to be placing people at 70%, you're gonna see a lot of innovation there. You're gonna have to work really darn hard. Yeah, exactly. I mean, this is what bootcamps wrestled with: how do you get lots of people placed and get them the outcomes they want
as the world keeps changing all around you? And it's gonna change really quickly. Anyway, this has been a ton of fun. We also saw Pearson acquire eDynamic Learning; we didn't mention that directly, but it came up before. It's in the sort of early career pathway space and already serves over 885,000 K-12 students.
Really interesting acquisition. This has been a sort of bonus, extra-long episode, but I have had a ton of fun. We may end up having to split it into a part one and part two this week, but it's been a blast. It is always great to talk to you; you always bring incredibly deep and very informed insights to so many different topics across the higher ed, K-12, and workforce space.
I appreciate you being here with us on EdTech Insiders.
[01:02:21] Matt Tower: Thanks, Alex. Really grateful that y'all invited me on, and happy to do this anytime. This was fun.
[01:02:26] Alex Sarlin: Oh yeah, absolutely. And our guest segments will be next, or they may be in a slightly different order, but they were really interesting. I definitely recommend sticking around and hearing Yesenia Svia, Ed Buckley, and Marley Straw, the founders of Scholar Education.
Thanks for being here. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the edtech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.