Edtech Insiders

Week in Edtech 6/11/25: OpenAI’s $10B Milestone, Meta’s AI Superteam, Grammarly’s $1B Raise, Ohio State’s AI Mandate, IXL & CodeHS Acquisitions, and More! Feat. Rod Danan of Prentus & Lars-Petter Kjos of We Are Learning

Alex Sarlin and Ben Kornell Season 10


Join hosts Alex Sarlin and Ben Kornell as they explore the latest developments in education technology, from AI breakthroughs to high-stakes funding rounds and institutional shifts in AI strategy.

✨ Episode Highlights:

[00:02:45] OpenAI’s $10B Annual Run Rate: ChatGPT drives unprecedented growth
[00:05:12] Anthropic CEO criticizes proposed 10-year ban on state AI regulation
[00:08:04] Google.org Accelerator: New cohort tackling generative AI for good
[00:10:17] News Sites Struggle as Google AI Summarizes Content
[00:13:33] Zuckerberg’s Meta Bets Big: $14B stake in Scale AI and ‘Superintelligence’ team
[00:17:02] Microsoft’s Plan to Rank AI Models by Safety
[00:19:20] Apple Research Paper Questions AI’s Reasoning Power
[00:21:46] Harvard Gets Backing in DEI Lawsuit from Ivies, Alumni
[00:24:09] Education Secretary Suggests Harvard May Regain Federal Grants
[00:26:48] Ohio State Requires AI Fluency Across All Students
[00:30:20] IXL Learning Acquires MyTutor to Expand Global Tutoring Reach
[00:32:55] CodeHS Acquires Tynker to Bolster K-12 CS Content
[00:35:30] Grammarly Secures $1B in Non-Dilutive Funding for M&A

Plus, special guests:

[00:38:12] Rod Danan, Founder of Prentus, on bridging bootcamps to careers with community and coaching
[00:46:10] Lars-Petter Kjos, Co-founder and CPO of We Are Learning, on building generative AI tools for educators to create custom video content at scale

😎 Stay updated with Edtech Insiders! 

🎉 Presenting Sponsor/s:

This season of Edtech Insiders is brought to you by Starbridge. Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

[00:00:00] Ben Kornell: How interesting is it that we're using puzzles and these logical models to test AI, and yet our own tests for human beings rely on memorization and regurgitation? What would it look like if we tested human beings the same way we test these AI models?

[00:00:25] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and

[00:00:38] Ben Kornell: work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.

And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoyed today's pod.

Hello EdTech Insiders, friends and family. We are back with another Week in EdTech. I'm Ben Kornell alongside my trusty co-founder and co-host, Alex Sarlin. Man, we're heading into the long summer sprint, and things just keep getting more and more on fire. Alex, are you the fire itself, or are you the dog saying everything is fine?

[00:01:28] Alex Sarlin: Those are the two options. I think I am the dog, although I'm feeling the heat a little bit, I'll tell you that, Ben. I feel like there's just so much. In some ways it's incredibly exciting. It just feels like things are heating up in every direction. AI literacy is heating up. I think capital is starting to come back.

I think there's really interesting things happening with the big frontier models. There's just stuff happening in every direction, and we're just trying to stay on top of it as it grows and changes. I'm feeling the heat, but it's a lot of fun too. And we have not gotten to do our Week in EdTech in a couple weeks, so I feel like we have a bunch of things to catch up on.

Um, looking forward to jumping into that. How about you? How are you feeling, especially with your...

[00:02:03] Ben Kornell: Oh, I'm definitely the fire. I'm definitely the fire. I am not the dog. Let's go, let's go. Part of that is just, when you're operating, running a company, you gotta really burn it. And then I think the other side of things is, as we look out into the landscape, the things we've been complaining about for 20, 30 years.

We've been chronicling the cracks in some of these institutional dinosaurs, and there's a way in which the combination of AI, economic pressure, uncertainty, and regulatory change is really accelerating a landscape change that I think presents real risk, of course, but also real opportunity in the ed tech space.

And so it does feel like, yeah, I'm on fire. But in a good way.

[00:02:53] Alex Sarlin: Yeah. 

[00:02:54] Ben Kornell: That's good. In a good way. Speaking of on fire, man, on the podcast, we have had so many incredible guests this past couple weeks. Who's been on? And then also, can you share a little bit about our new partnership with Stanford?

[00:03:06] Alex Sarlin: Sure. So we've been talking to so many people on the podcast.

We have so many incredibly interesting and deep interviews coming out in the next couple of weeks. We have an interview with Jerome Pesenti of Sizzle AI. We've been doing a lot of great interviews for our Week in EdTech sessions. I talked to Julia Stiglitz last week, who is ex-GSV, ex-Coursera. Love Julia. She's amazing.

And she is doing all sorts of incredible things with AI agents with her company Uplimit. Every time you turn around, something more and more exciting is happening. We just talked to a company called We Are Learning out of Oslo that's basically changing L&D by creating really Pixar-like animation.

So instead of it even being hyper-real, it's extremely playful and fun, but you can do what would've been $200,000 worth of instructional design by yourself. We just talked to Scholar Education, which is going state-down, trying to break into the AI space. There are just so many points of light happening if you want to put all the pieces together, and it has been really thrilling.

We also have, by the time this is out, we'll have just finished a co-design webinar this week with Sesame Workshop, with PlayLab, with Lean Lab, with UX of EdTech. We hit the limit on that Zoom; we have over 500 people registered. It's gonna be amazing. And then, yes, let's talk Stanford. So we've been working on this for a while, and we finally got it out this week.

So Stanford has been doing this. SCALE, the Stanford Accelerator for Learning, has been doing this incredibly interesting project to basically gather all the research that's out there about EdTech and AI. It's a generative AI EdTech research repository inside the Stanford Accelerator for Learning.

And why that's so great is that we, as everybody here knows, have been doing this incredible market map and trying to keep it updated as things change so much about who's doing what in education. They're chronicling all the research about what works. They have over, I think, 500 papers at this point.

So what we've done is created a partnership and now when you go to the ed tech market map, you can actually take particular categories like what's happening with teacher training, what's happening with student support, what's happening with assessment, and go and find exactly the research up to date with their takeaways from the Stanford experts about what we know in research about this exact topic.

So it's a really exciting partnership and I think there's a lot we're gonna continue to do with it over the next couple months. So please check that out. 

[00:05:27] Ben Kornell: Yeah, I mean, I think this is the time where we've been in such an expansive and pun intended generative mode where it's like, okay, what else can we use AI for?

And what else? And now we actually have the research community catching up to that speed and starting to point to practices. Kind of where things have started is: what are the evidence-based practices that we already have a good evidence base for? And now let's look at them with an AI component engaged.

And the key breakthrough I've been seeing in the research space is that AI as instrumentation of these practices can be really, really valuable. The idea being that you're not replacing the teacher, you're not even replacing the content, but what you're doing is making a subscale practice much more scalable by instrumenting it for both students and teachers.

And that goes on the assessment side. It can also go on the peer-to-peer engagement and inquiry-based practices, all these things that are just really hard for an individual teacher or professor to do in real time with every individual student. So there's a lot of excitement there. And meanwhile, I think it's important to bring in the news on that landscape.

The paper from Apple really raises a lot of questions about what the potential for AI is, and I think it's probably worth calling out that on our podcast we have a little bit of a bias towards optimism, a bias towards being pretty excited about AI use cases. And so I thought Ethan Mollick's response was quite good, which is: if you imagine that AI never evolves at all from where it is today, there are still so many breakthrough applications that we haven't fully exploited yet, due to the fact that human systems and institutions and structures take time to change. There's already so much on the table that allows us to leverage AI to its fullest, even if you believe that we're getting to some asymptotic point.

I think also, because you and I cover this drumbeat, I feel like a year ago it was, oh, the new GPT isn't any better. But we have enough evidence from the last couple rounds that the march of AI is still in early innings, and one of the ways that's been exposed is that initially there was this wave of more and more compute, bigger and bigger data sets.

But then we saw the ability for specialized, smaller data sets to add value. And now we're seeing agentic AI, where you're even narrowing things down and daisy-chaining AI agents together. So there's a way in which I feel like the AI revolution right now is like water. It may not be one rushing stream straight to the ocean.

It may get dispersed and come back again, but it's going to the ocean and it's very hard to get in the way of it. 

[00:08:25] Alex Sarlin: I have an even more extreme take on this, which is that this Apple paper is making a lot of rounds on the internet. People are using it as a little bit of an aha moment. I read Gary Marcus's piece on it.

I think it's very, very silly. I mean, really, the nature of the paper is basically: they're saying, oh, these frontier labs are putting out these large reasoning models, LRMs, and they're saying that they're better at reasoning, better at sort of thinking through problems step by step, than regular LLMs.

So they said, well, if they're so good at reasoning, let's give them some reasoning tasks. And they gave them things like the Tower of Hanoi, or the classic puzzle about bringing the fox and the goose back and forth. They're like, well, these are so good at reasoning, they should be able to do amazingly at this.

And they didn't do great. They didn't do great. In fact, they would, quote unquote, collapse if it got too complicated; they wouldn't follow the prompts. Even if the researchers cracked the code for them and said, here's how you do it, here's how you solve the Tower of Hanoi, they would still not be able to follow it all the way.

I see this as the silliest little speed bump you could imagine. For one thing, Apple is not necessarily the most reliable narrator in this space, because Apple is way behind in AI. I don't mean to impugn the researchers; who knows, it's research.

They're obviously trying to be totally neutral. But really, what it is, is a takedown of this very particular slice, one of the newest slices of AI, which is reasoning, like visible reasoning, trying to solve problems in this step-by-step way. It's so stupid. It's so stupid.

I think we will have all forgotten about this in two weeks. It will not even make a footnote in the history of AI. I just think it's ridiculous, and I think the people who are crowing that, oh, AI isn't actually reasoning, it's not actually smart, AGI is further out, whatever the heck they're saying, have no idea what they're talking about.

None. I really do. I looked through a lot of the stuff around this, and it's not even really about large language models. In a lot of cases, they compared the regular large language models to the large reasoning models, and the large language models did better. That's what they're saying. They're saying, hey, the large reasoning models aren't actually that good at reasoning, but that's a nascent branch of AI that nobody is talking about.

Have we ever said large reasoning models on this podcast? No. It's so dumb. So anyway, I don't wanna talk about it anymore. Yeah, because I think it's incredibly dumb. Let's talk about some of the other things happening. Yeah. 

[00:10:41] Ben Kornell: One thing, please, before we go on. I love Apple products. Me too.

Their hardware is so great, but they're behind in the AI race. And we're gonna talk a little bit about Meta, who, I think it's fair to say, is also behind. We've got OpenAI, Google, and Anthropic as kind of the leads. And Microsoft has a relationship with OpenAI, so I would kind of couple them up.

It remains to be seen where Amazon fits in this, but the people who are chasing the bus here are Meta and Apple. The EdTech point here is: how interesting is it that we're using puzzles and these logical models to test AI, and yet our own tests for human beings rely on memorization and regurgitation?

What would it look like if we tested human beings the same way we test these AI models? The reason I think that that is a great channel to explore is, one, the AI models can't solve it straight up, so it's more immune to cheating. But two, if we say that this is the apex of logic and reasoning, why aren't we assessing human beings on the same thing?

And I just get excited about it. You know, I went to one of those escape rooms and I was seeing my kids using all these logical tools to get out of the escape room, and I'm like, what a great performance task. And one of the things we do at Art of Problem Solving is kids design their own escape room as a project, and they put tons of energy into it because they know their peers or their parents are gonna have to escape the room.

And so another wave of articles this last month has been that all of our assessments are totally broken, all of our assignments totally suck. I'm excited about more games and puzzles being in schools, and I think kids would be excited about that. And if this is what we're assessing large language models or large reasoning models on,

let's start doing that with our students.

[00:12:50] Alex Sarlin: I think the only places where that's usually been done in the past are through things like oral exams, or I think of the famous management consultant interviews where they'd say, tell me how many school buses there are in the state of Texas.

Yeah. And work it through. Tell me your reasoning. Tell me how you're coming up with these numbers and what you're doing. That is very similar to how these reasoning models are trying to work. And you're right, that's never been done at scale. It's only been done when you can do these really complex one-on-ones, when you can have an expert interviewer, in both those cases, right?

An expert interviewer asking really detailed questions and using their own internal rubrics for whether the answers are good, and it's never been done like that at scale. Instead, we've done all these much more standardized, much more multiple-choice, surface-level assessments for many years. I can't wait for us to get into puzzles, games, reasoning tests, just a totally different way of assessing what knowledge looks like.

And frankly, I think LLMs are gonna be absolutely core to that. You don't need the language model to be the best reasoner in the world to be able to assess whether someone else's reasoning is sound and makes sense and shows a deep level of understanding. So I think we're heading there, and hopefully pretty soon.

Let's talk about some of the other things that happened just around the horn in the AI space, and then let's jump into some of the EdTech stuff. So a couple things jumped out to me; I'd love to hear your thoughts on them. We saw OpenAI this week say that it hit $10 billion in ARR because ChatGPT is continuing to grow like crazy.

That's, you know, amazing for them. And frankly, I think the product is getting better very rapidly. I've been really impressed with what they've been doing, even in the regular pro version; you've really noticed the difference. They're putting their money back into product development.

It's been exciting to see. We saw Anthropic's CEO talk about the 10-year ban on state AI regulation that's in the bill being debated in Congress right now, basically saying he thinks it's a bad idea, which is very on brand: it's too blunt an instrument to basically keep states from regulating.

I'm curious what you think of that. We also saw some really interesting news about Google's AI Overviews starting to have downstream effects on news sites, because a lot of clicks to news sites are driven by regular Google web search. And if Google is actually answering the question rather than sending you to the news articles, it's breaking the model that we've all been relying on for many years.

A lot of SEO people certainly are relying on it; that's a huge deal. Not necessarily for education, although I think it does have education ramifications. So do any of those stand out to you? Which one of those makes you most say, oh, I wanna dig down on that one?

[00:15:19] Ben Kornell: I mean, to a degree, all of them do. First, on the regulatory side, we are in a zone where we do not have the functioning processes to do precision regulation.

And so we are in a spot where, pragmatically, it's either no regulation or brute force regulation, and that is gonna be tough. I wish the US would just try to do more fast following of the European Union. I do feel like they might be more conservative than we are on AI, so responding to an existing framework is sometimes better than just trying to nascently come up with your own framework.

And if I know anything about Europe, they really know how to regulate shit. And we can do that without hampering free enterprise. I think one of the big challenges, too, is our privacy regulation and other laws that protect kids; we're not necessarily even enforcing the laws that we already do have.

So a big point in that space is not only do we probably need some new standards and laws and approaches, but also we just need to enforce the ones that we already have. To me, the most interesting news of this week is Meta basically getting in the game. And I think this idea that there's this superintelligence AI team at Meta reporting directly to Zuckerberg gives me a great image.

He's like in the bunker with an Oculus on, being like, okay, team, we're gonna create superintelligence. And very quickly you could imagine him having a cat on his lap and petting it, having some Dr. Evil vibes: the superintelligence AI team. The question really with Meta is they went all in on this open-source strategy, and I don't think that it has accrued value to the company.

And in some ways, pairing that with the OpenAI news is like, oh wait, you can make a lot of money from this. OpenAI is making it largely in a consumer motion, and Anthropic is largely making it in a B2B motion. Meta can't do either of those because it's got this open-source play; I think they were trying to make the developer play, which is, we'll be the ecosystem underpinning developers, and that just hasn't come to pass.

The developers are willing to pay for OpenAI directly out of a credit card, or for Anthropic. And then, meanwhile, Google's doing AI everywhere 'cause they're already everywhere on all these surfaces, which we think is a totally winning strategy too. So I think, after all the money that they invested in the metaverse, you have to wonder how many more big bets Zuck has.

And if it were not for Instagram, basically had they not made the acquisitions that they made, this would be a company on its way out.

[00:18:24] Alex Sarlin: That's a really interesting insight. So I have a take on this. I'm curious what you think of it, but I've been reading a lot about sort of the recent history of AI and how this technology sort of got into all these companies, how they bet on it.

And one of the things that's particularly interesting about the Facebook, and now Meta, AI strategy is that it was really focused very much on Yann LeCun, the head of Facebook's AI research for many years, who is a professor and continued to be an NYU professor while he was also the head of Meta AI.

And I think what we've been seeing, and you've seen this in every way, especially with OpenAI, is that over the last few years AI went from being this almost totally academic field, where it was these top professors, postdocs, and graduate students who were the only people who could really make this work.

And suddenly when they realized that they could make tools that were more powerful, that were better than Google's translation tools, that were better than Siri, that were better than all these things, suddenly it was this mad rush for talent where all the AI companies started to try to hire everybody they could.

And one of the things that happened with Meta is that they went out of their way from the beginning to have open source and to have somewhat of an academic approach to AI, which includes all of the openness. And then you saw Sam Altman and OpenAI come along, and yes, they also came from an academic standpoint.

Originally, Ilya Sutskever and a lot of the people behind OpenAI came from the same labs; they're the same academics. But they commercialized it so quickly, and they've wrestled, as you know, back and forth with: what are we? Do we have an ethics board? How fast do we have to go? How do we do this? You know, Facebook tried to buy DeepMind, and one of the reasons it didn't happen, and this is ironic because it feels like the opposite story, but it isn't.

Right. Facebook didn't agree at the time to have an ethics board. DeepMind was like, if we're gonna get bought by one of these huge tech companies, we want to have these guardrails in place to keep them from making us work on Defense Department projects, and Facebook wouldn't do it for them at the time. And that was one of the reasons they wouldn't go there; there were other reasons as well.

It was one of the reasons they went to Google. So there's been this push and pull of how open AI should be, how much we care about AGI and other things like that, how much we want ethics baked into what we're doing. That was why Dario Amodei left OpenAI and started Anthropic. And now you're seeing Zuckerberg.

I know what you mean about the supervillain take, the Dr. Claw image. I don't think that's the goal, but it is really interesting that they are now working with Scale AI. They are working in defense. They are literally working in AI and defense as of right now. And I think all these big companies have been trying to navigate the right combination of hiring academics and figuring out what their ethical story is. Is it a story, or is it actually baked in?

Is it a story or is it actually baked in? How much do they care about a GI versus. Commercial value of AI right now. Google famously made Google Brain, they got some of the best people in the world, many of the best people in the world, and put them on, make the Google search algorithm, you know this much better.

And a lot of the Google experts went their own way. They basically carved their own path within the company and found ways to do other things; they sort of pushed Google to let them do their own thing. Anyway, the point is, Facebook purposefully went with a strategy of openness, and I'm agreeing with you that they now feel like they have not won commercially by doing that.

And they're trying to go in a different way. And it feels like this Alexandr Wang hire is the exact opposite move, on purpose.

[00:21:53] Ben Kornell: Yeah, it feels like a pivot. Yeah. The idea that they might end up trying to be the govtech player is so ironic, just given the history of Facebook and Meta with the US government. But

that's really the last big marketplace to go to where they could be number one, and all of the OpenAI, Anthropic, and Google people have a lot of moral questions about whether to go there. I'm not clear that Zuckerberg and Meta have many moral questions, period, full stop. One, I do want our government to be able to have AI tools to be more efficient, but also,

I think that there is an arms race worldwide in global AI, and as much as there are ethical dilemmas here, you've also gotta be a realist and say, well, if we don't do it, other countries will, and we need to keep up. It is just a very, very odd marriage to me to imagine Meta going really hard in that space.

It's also very hard for me to imagine them winning share from OpenAI or Anthropic or Google at this point. And maybe it's in the developer space, where developers are willing to switch to things that are still open source; there are definitely defenders of open source. When you write your chapter of the book on this last generation of AI in education, it is interesting to imagine what would've happened had Dario not left OpenAI and there were no Anthropic. Would that have left the door open for another player or not?

And so you told a story of research to commercial. Another story here is the economics of startups. If you are the world's preeminent AI person and you're at Google, it's a publicly traded stock; you're gonna see a certain percentage improvement over time, and you work at a big company that has some halting starts and stops on things because they have to worry about all the other business lines they have.

Whereas if you go and work at OpenAI or at Anthropic, you've created life-changing material wealth for you and your family. It's all on one product and on one set of rails. And I think that's where Meta has also had challenges: they're a big company now, and it's hard for them to just absorb an initiative.

And this is why I have always been so impressed with Microsoft's moves over this time. I think because they had the antitrust stuff in the nineties, they understand: okay, LinkedIn, that's awesome, but if we bring it into Microsoft, it's not gonna reach its full potential. Let's find ways to add strategic value.

It's the same thing they've been doing with OpenAI, and the way in which Microsoft has managed to hold enterprise but then have these adjacencies that are growing its net footprint. I mean, it's a brilliant strategy. And I think it's their 50th anniversary this week. Wow. Oh my god. It's pretty incredible.

That is crazy. You know, on the EdTech beat, we're seeing all of this happen in the AI space, and meanwhile the higher ed machinations continue to evolve. I do feel like, with the Elon-Trump spat in the news, Harvard has kind of ratcheted down on the list of priorities. But it was interesting to see 24 universities, including five Ivy League schools, and more than 12,000 alumni take legal action to back Harvard in its legal battle with the Trump administration. Princeton, Yale, Dartmouth, Brown, and the University of Pennsylvania

were some of the schools. I think there was a feeling for a period of time that Harvard was kind of out on its own and everybody else was happy to keep their heads down. But now you're seeing a group rally, because they believe the law is on their side, and also, as they game this stuff out, I think they realize that if Harvard loses on any of these fronts, it creates pretty fundamental challenges for their own long-term future.

And those have to do with things like the funding from the government. It also has to do with things like, you know, student visas and the role of the government in that. So I feel like there's something about the political news cycle that these headlines hit hard and then every single day the Harvard people are gonna wake up saying, let's try to do what's best for Harvard.

And the news cycle moves on. It seems to me like we're back on a more positive trajectory from a higher ed perspective. What's your read of the situation? 

[00:26:41] Alex Sarlin: I agree. I think we've seen over the last few weeks a constant ratcheting up of the war on Harvard, specifically from the federal government. But there's been a lot of countersuing, and I think at this point it's starting to head towards more of a, the word is a détente.

I mean, I had said a few weeks ago on the pod that Columbia is now acknowledged to have probably made the wrong move by basically trying to acquiesce and then immediately getting rolled over anyway, with the president getting let go anyway. And I think Harvard, by saying, we're not gonna roll over,

we're gonna stand up, has galvanized the higher ed community, especially the Ivy Leagues, to say, okay, yeah, you have to punch the bully in the nose. I used that metaphor a few weeks ago, and I think they did. And now it's like that scene in all the teen movies where the kid punches the bully in the nose and all the other kids rally around them and say, yeah, go away.

That's what we're seeing happen, with all these Ivy League schools and alumni and organizations starting to say, okay, they may back down, but whatever's gonna happen, the whole higher education ecosystem needs to ride this out. They can't give in at all. And I think that's what's happening. Whether or not the administration will be able to truly pull billions and billions of funding, which they're still threatening to do, is one thing.

You're also seeing Trump start to move his attention to the University of California system, because he's starting to declare, you know, literally war on California. And so California might be an even more convenient target, even within the education system. So I think we're gonna see more there. Let's talk about some other ed tech stuff that happened.

So Handshake made some news this week, which is interesting. You know, when we were compiling all our AI tools, there was an initiative a few years ago at Handshake; they launched something called Coco, a little AI assistant. As you may know, Handshake is one of the major higher ed job platforms.

It's like the leading job listing site and job board for higher ed, which is a very important role in the EdTech ecosystem. But they hadn't done much with AI. They just released, I think it was yesterday or a few days ago, this Handshake AI.

[00:28:43] Ben Kornell: Handshake AI is, I think, a really important use case for AI, which is really about the matching process between students and employers.

Oh, okay. And you know, Handshake's origin story was really that they were kind of going out of the back of a car, driving around to mid-tier universities across the country where there were no career fairs, and actually bringing the jobs to the students and bringing the students to the jobs. And I think this idea of career assessment, career placement, career connections is actually super ripe for AI.

But the challenge has been distribution, and how do you get enough people on your platform with this double-sided marketplace of jobs and job seekers? What is really great about Handshake stepping into this zone is that they've got the distribution platform, right? They've got a lot of universities on board.

The challenge is that this space has gotten so freaking crowded: is it enough for Handshake to kind of reassert its control in the space? I feel like they kind of reached an apex pre-pandemic, and then, when everything went online during the pandemic, the value proposition of a career fair that's online really dissipated.

[00:30:02] Alex Sarlin: So it looks like what they're doing is actually trying to bring AI labs in as the other side of the marketplace, using, as you say, their distribution network: 1,500 schools and university partners in the US and Europe, over 18 million students. I think what they're trying to do is create a pipeline for college graduates to go work at AI labs by helping them

matchmake between the labs and people who are actually ready for cutting-edge AI work. They mentioned 500,000 PhDs inside the Handshake network, and 50,000 in STEM fields like quantum mechanics. That's an interesting move. In some ways it seems smart; in some ways it seems very specific, because if they're truly trying to have only the AI labs as the other side of the marketplace, it's not a very big other side of a marketplace.

So we'll see. You think it'll work, Ben? 

[00:30:58] Ben Kornell: I think it's a necessity that they do this. And you know, there's a way in which some of these announcements today feel more like requirements to keep up with things rather than some sort of new breakthrough.

[00:31:12] Alex Sarlin: Yes. I think that's really well put. That's what it feels like.

It feels like Handshake is trying to dive into the AI space in a very specific way because there is a need there, but I think they're gonna have to expand their scope over time. We also saw Ohio State University this week come out and say that they're requiring every student to use AI in class and become AI fluent.

I'm probably exaggerating that a bit, but basically it was a very public stance saying, we are gonna be forward-leaning on AI and actually make it part of our philosophy. That's really interesting. We saw IXL Learning acquire MyTutor. We interviewed the founder of MyTutor. Yeah, Bertie.

Yeah, he's great. That's a really interesting acquisition. IXL has made a lot of really good acquisitions; they're a very smart acquirer, and I'm curious how they're gonna wrap that into the IXL suite, which is getting increasingly powerful. We also saw CodeHS acquire Tynker, which is in the coding space, and we saw Grammarly get a pretty significant billion dollars in non-dilutive funding, all to drive strategic acquisitions.

Very interesting. Grammarly is, on the down low, one of the more successful, yeah, you could call it EdTech, I'm not sure I would call it pure ed tech, but certainly writing assistance platforms. We've followed them for a long time, but a billion dollars for acquisitions. Wow.

[00:32:32] Ben Kornell: Yeah. When I hear non-dilutive, that tells me debt and then I get worried.

You know, non-dilutive is, means it's either a grant or it's a loan. 

[00:32:44] Alex Sarlin: Mm-hmm. 

[00:32:46] Ben Kornell: And a billion dollar loan in this market? That's a pretty expensive loan.

[00:32:51] Alex Sarlin: That's true. I think Grammarly was way ahead of the curve on using AI to do some very, very practical things and has spent many years growing its product footprint until it's very, very big right now, it feels like.

Yeah, I mean, without knowing a lot of details about what they're thinking of using this money for or the format it's taking, it seems risky. But I also think that there are so many directions they could go if they really have the option to buy a company. So I don't know.

[00:33:20] Ben Kornell: I don't know. When I hear this move, I wonder: what's Grammarly's differentiation in an AI world where I can use AI to write my emails anyway?

And are they losing their competitive advantage, and therefore need to go on an acquisition spree to try to regain market hold or something like that? I'm a big fan of theirs, and over the last decade they've been at the forefront of real-world applications of AI. Exactly. And my wife being an English language learner, Grammarly is an awesome tool for English language learners who are trying to communicate in a professional setting, but all the other tools now are really good at those things.

Second, when you get debt, you're either becoming a private equity firm yourself, or you can't raise the money through some sort of equity fundraise; and if people aren't willing to take your equity, that is a negative signal about your company. Now, if debt was cheap, or if they were getting taken over by a private equity firm that already had debt on the cap table, that would be different.

So, you know, whenever I read these tea leaves, non-dilutive, this is great, billion dollar capital raise, or sometimes people have these crazy valuations, what often ends up happening is it's a very, very small amount of equity and a lot of debt. That is a red flag move, and they're gonna have to justify carrying that debt load: if this is indeed debt at 7% on a billion dollars, that's a lot of cash flow.

So coming back to the roundup, one thing in the M&A space: with this MyTutor acquisition, we're seeing that the tutoring companies really represent a channel of delivery that is adjacent to the school delivery channel. And for those that have real footprint, it's hard to make those individually an awesome business.

But if you think about it as another delivery channel beyond schools, it can be really attractive. Well, we're excited to have some great guests here. Thank you so much for joining us. We're gonna see you around the summer, and if it happens in EdTech, you'll hear about it here on EdTech Insiders. Thanks so much.

[00:35:50] Alex Sarlin: For our deep dive this week, we're talking to Rod Danan. He's the founder and CEO of Prentus, a career services platform that's reimagining how the next generation discovers and lands their dream jobs. Rod is a first-generation immigrant who sucked at job searching. I know how he feels. Rod previously built AI models at American Express and advised students on careers at a bootcamp before taking all his experience and founding Prentus in 2021. With Prentus, his mission is to build a job search experience that is fair, fast, and fun for everyone.

Not always the most fun thing, but it could be. Today, his platform uses AI, community, and gamification to help schools like DeVry University and Miami-Dade College, big schools, get students hired 54% faster. Let's unpack it. Rod Danan, welcome to EdTech Insiders.

[00:36:37] Rod Danan: Thanks for having me, Alex. Excited to dive into everything fun.

Everything fun. We'll do it. Fair, fast, and fun for all of us. Yeah.

[00:36:47] Alex Sarlin: So I wanna give you a chance to tell us everything about what Prentus is, but before we go right in, there was a quote this week that stuck out to me, and we talked about it a little bit on our news show: Dario Amodei, the head of Anthropic, basically put out a prediction saying that within five years a large percentage, I think he said half, of entry-level white collar jobs might disappear.

They might get taken over by ai. And there's this big open question about what this means for the next generation of workers. You are right in the middle of this. You are helping current college students and bootcamp graduates get hired. What do you make of this crazy job market, and how do you think AI could actually help and not just always be something making it more difficult?

[00:37:27] Rod Danan: Yeah, well, one, we hear a lot of these kinds of bold statements about jobs disappearing, and a lot of times it's to get a headline to boost the company. But at the same time, there is gonna be some sort of shift, and we're seeing things change so quickly that I think everyone's just kind of throwing stuff at the wall and seeing what sticks.

But what I'm seeing so far is, yes, the numbers show that entry-level roles are down in terms of being posted. Is it gonna continue? Is it gonna get worse? Time will tell on that one. I think as AI gets better, it's gonna be able to do a lot of this menial work. And in contrast to previous tech revolutions or the Industrial Revolution, things are happening so quickly that markets are delayed in terms of how they can react to the shifts in labor.

So as a result, what we're seeing is, I think, that there is gonna be an initial dip, and then people are gonna figure out, okay, what do we need people for? What is the next level, the next generation of work? And overall, I think it's gonna come back down to what really makes people human.

If we're looking at a job like data entry, just putting numbers into a box in a spreadsheet, is that the best use? Is that the most human thing to do? I don't think so. So other jobs will evolve, and it's like, okay, now that we can automate this menial task, people can use their humanity, their creativity, and from there do new things.

What are those things? Honestly, I don't know. I'm not smart enough to predict the future yet. But yeah, I think there's gonna be short-term pain, but it will stabilize after that.

[00:39:06] Alex Sarlin: Yeah. So you're up close and personal with this. When you talk about jobs changing, it reminds me of the original computers, when "computer" meant a person doing computation at a company.

And obviously computers took the computing role from people because, as you said, it's not the most human thing to do rapid computation, and it changed what we do. And I think we're entering another phase of that, and nobody knows what it's gonna look like. But you are up close with it with Prentus, because you are working with colleges, you're working with bootcamps, you're working with employers.

Tell us a little bit about what Prentus is and how you got it started.

[00:39:39] Rod Danan: Yeah. So with Prentus, before doing this I was at a different startup, but scrapped that. At the time I also worked with a lot of bootcamp grads and saw how these people that had skills, were hard workers and everything, they were great, but they didn't know how to navigate the job search.

And the reason is most people don't have an Uncle Bob that tells 'em all these kind of unwritten rules. And unless you know what to do in a job search, it's really just figuring it out until you finally crack the code, and most people just never do. So as a result, I was like, okay, we have this big issue, all these talented people; how do we get them into roles?

And I needed to figure out how we can get people from zero to hired. I started experimenting with a bunch of different things. I actually had a podcast myself, I built a newsletter, I had a community of job seekers during COVID, and it was to help people out. And I really narrowed it down to: people need, one, direction.

You need kind of a Google Maps for what to do in the job search. Two, you need motivation, because you're gonna get no after no after no, no matter who you are, even senior people. So how do you keep that momentum going? And then three, you need to make sure you have people around you while you're job searching, 'cause you might think, oh, woe is me.

I'm the only one struggling. I don't know anything. But other people struggle too. And having people that are on the journey with you, whether it's an advisor or an alumni or a friend, that's gonna be what helps you get hired faster. 

[00:41:00] Alex Sarlin: Yeah. So let's talk about the first of those three, the guidance piece, because I think this is really an interesting role that education sometimes takes on, you know, traditional educators sometimes take on and sometimes don't.

Universities have been increasingly under pressure for, I think all the right reasons, to provide career guidance, to provide a return on investment for all of the tuition that people are spending. You work with some big universities that have people very explicitly looking for careers as one of the main outcomes of their education.

When you work with these universities, how do they approach that role of guiding students? What do they see as their responsibility, once a student graduates, in making sure they get the outcome they want, and how are you supporting them?

[00:41:40] Rod Danan: So we talked about how AI is changing everything, but in general, the markets for schools are changing too. In higher ed, previously it was just: the government's gonna give out more loans,

people are just gonna pay you, and the prices keep going up. That age is over. Now people are asking the real questions, which are: if I go and dedicate four years of my life and a hundred thousand dollars to this, what is the outcome on the other side? And schools that don't have a good answer are not gonna be the ones that survive.

And we've seen tons of school closures, record closures, in the past few years. So if we look at some of the schools that we work with, the bootcamps, we started in the bootcamp space. For them, they have income sharing agreements and tuition refunds if you don't get a job. So for them, it's literally on the bottom line.

If we don't get someone hired, we're losing our money. And when we look at higher ed, it's interesting. We work with DeVry, for example, and I remember, before I started working with them, back when they got sued and all that information came out. But in general, what we're seeing is that for-profit universities are actually

the most innovative and most career-driven, because they have to be; it's on the bottom line. If they don't get people jobs, they're not gonna get new students and enrollments. So as a result, they're taking kind of the first steps, and I've seen with DeVry and some others that they're really innovating. And then additionally, look at Miami-Dade College, again, very innovative in general.

I know when we talk to investors, they're like, yeah, don't use Miami-Dade as a model for modeling out the market. But in general, there are schools that are doing a lot of interesting things and saying, hey, we made a promise to students, we have to deliver, and integrating AI and workforce pieces and all these other things to make sure that people have the best shot.

So if I'm a student or a parent, I'm looking for one of those schools.

[00:43:25] Alex Sarlin: Yeah, it makes a lot of sense. It's interesting that you say that. You know, the for-profit sector for a long time got a really bad rap for chasing veterans who were using public funding, or for various kinds of tactics. And what's funny is, I agree, in some ways we're coming full circle in this strange way.

There were articles just this week about how for-profit colleges, partially because of the government right now, are getting attention again. They've been focusing on career outcomes, they've been focusing on fast graduation, they've been focusing on, basically, the customer experience for students for a lot longer than traditional higher ed.

And now that's starting to be really interesting to students again. We'll see how it plays out. But you know, we mentioned in the intro that you are helping these schools get students hired 54% faster. So faster hiring speed is absolutely core; it's an aligned incentive, right? It's a place where students certainly wanna be hired faster, but schools also really wanna be able to say, our students are hired quickly when they graduate.

What does that look like? How have you been helping the colleges you work with showcase that success and say, look, with AI-based career tooling, we can accelerate your success even more? What does that look like?

[00:44:33] Rod Danan: Yeah, well, one, Prentus is great, and I'll tell you why in a second. But you know, I think it starts with framing from the actual organization.

If we go to an organization that's just like, oh, hey, here's some extra tool we got, use it if you want to, then you're not really gonna see great results. So it has to be something that comes from the organization top-down, like, hey, career outcomes, this is something that we're actually focused on.

Weave it into either curriculum or events or something else that makes sure that students understand this is a focus. And then at the same time, you also have events and you're promoting it and you're making sure that this is a new way of doing things. So then, once we bring Prentus to students at a school, we help them from the bottom up.

Here's how to frame it, here's the marketing materials, here's how we get people in. And aside from just connecting to their systems or uploading the CSV, we do everything else from there. We will follow up by SMS and phone call and make sure that people actually get signed up. And once they're in the system,

we set a baseline with kind of our AI flow in our onboarding and figure out where you're starting from. Are we starting from zero, you haven't done anything, or do you have a resume and a LinkedIn already? And then from there we guide you, where each week you have your goals.

I. Here's your summary report. Here's what you gotta do to improve, here's your job track, or your AI resume tools. Let's tailor this. Let's prep for that. So it's always something to do with an application. And as a result of kind of these weekly timelines and daily updates, it accelerates people to go quicker and quicker and then eventually get hired faster.

So there's that structure 

[00:46:06] Alex Sarlin: and guidance and confidence building. You know, you mentioned the motivation, feeling like you have a support system. You're not just gonna send out a pile of resumes and then shrug and say nobody seems to want you. It's such a morale-destabilizing moment, especially for people who just graduated from either a formal or informal education.

Say they just finished a bootcamp, they just finished a college, whether it's a two-year college, a four-year college, wherever it is, and then suddenly they're out in the world and say, I don't even know where to start. I don't have the guidance. I feel lonely. And as you mentioned, one of the things that Prentus does that I think is really interesting is it's fun for everyone.

You use gamification; you have this concept of career points. Why do you do this fun gamification? I can sort of imagine it's always great if something's more fun than not, but I'm curious how you came up with this structure of trying to gamify the job hunt. It certainly sounds like an interesting way to do it.

[00:46:59] Rod Danan: Yeah, so there were a bunch of different early influences when I was trying to think through this system. You know, when I was entrepreneurial, in the early days, there were a few different communities. There was

But for some reason I was still motivated to keep running them. And then there was another platform, I forget what it was called, but basically also gamified like your, your Journey while building a startup. And then of course, the biggest gamification example is Duolingo. You know, I know people log in for three years, still can't speak a lick of Spanish, but they keep logging in, you know, so there's something special there.

So from the beginning I was like, we need to figure out a way to keep people going, because that's what's gonna help 'em succeed. So gamification was built in, and career points was the metric that we named, because we thought about this as kind of a new score. In some ways it's a little Black Mirror, where it's like, okay, you have this score for someone that reflects how they might get hired.

But at the same time, we wanna figure out a new way to evaluate talent because. Jen, early days, I've seen so many talented people that just didn't get a shot. They didn't look good on a resume or something like that. But they're a diamond in the rough. So with this, our goal is, okay, we can track our people hitting weekly goals.

Do they complete tasks? Are they active in the community? And a few other different psychometrics. But then eventually we wanna add in the skills component through different project-based contests and figure out how we can build this profile to be like, hey, this person, yeah, they don't have any experience in this field yet, but everything says that they're gonna be an amazing employee.

And as a result, we can give more opportunity to the people that are putting in the work, but may not have the right connections. 

[00:48:42] Alex Sarlin: You mentioned connections just there, and earlier on, the idea that not everybody has an Uncle Bob, right? Not everybody has somebody to guide them or to make intros or to do informational interviews, or to explain some of the, as you mentioned, unwritten rules of the job hunt.

And I think the job search has been strange for a while now, between applicant tracking systems and just machines hitting other machines, and that's how everybody gets noticed. And then of course people put out these job listings and get hundreds and hundreds of applications now, because it is easier and easier for people to apply.

So it's already been a strange, shifting time for hiring. I'm really curious. AI is, I think, responsible for some of those changes. It just makes it easier for both sides to try to game the system, for both employers and aspiring employees. But your take with Prentus is that AI can also make sense of this complicated, opaque system.

It can quantify things, your Black Mirror example, it can quantify things that aren't always quantified. It can highlight experiences or skills or behaviors that might not be obvious from a resume and maybe could help someone stand out. Tell us about how you see the job hunt changing and what you'd like Prentus's role to be.

To even the playing field, and not make it about who had the internal connections or who happened to have somebody who told them the trick about how to connect with the hiring manager on LinkedIn, rather than whatever the heck it is. You know, this stuff is so complicated. What do you see as Prentus's role, and AI's role in general, but start with Prentus, in evening the playing field and not making it just more and more chaotic?

[00:50:13] Rod Danan: Yeah, well, you know, it's not just, okay, I'm building Prentus and I'm seeing the perspective of the candidate; we also hire ourselves, so I see it as an employer. I post a job and I get hundreds of these applications, many of them irrelevant, many of them fake candidates in general, and it's just a huge waste of time.

And that's because all these kind of AI-to-AI systems are going on. In terms of philosophy, I don't like the auto-apply AI, where you pay a fee and it just applies to a hundred jobs for you each week, because usually it's irrelevant. But what I'm seeing is, one, a lot of companies in general saying, okay, no, we don't wanna post this job, because we know we're just gonna get flooded with DMs, emails, and applications that will waste our time.

So what if we just look into a talent pool somewhere? So they're looking at LinkedIn Recruiter, obviously the biggest one, but there's other tools I've used, like PeopleGPT. So employers are looking at talent themselves and doing more outreach. And then in general, I would say applications I think will go down and down, which is why at Prentus, you know, we have things like networking trackers and community. We're trying to help people build social capital, and with that, you know, it's the people that will help people get jobs, but it's kind of a gradual change.

You know, we still have the infrastructure in place for resumes, cover letters, and everything that's been done in the past. It'll just take time and be more people-oriented moving forward, I think. So how do we increase social capital for people that don't have it? That's the new problem.

[00:51:46] Alex Sarlin: We've been talking, especially in education technology, for a number of years about how, you know, the resume is gonna disappear or how the job search is going to change.

I love the way you're mentioning the proactive approach from employers, rather than just putting up a sign and then letting the whole world flood you. How might you proactively jump into a talent pool? These changes are so funny because we predict them and predict them, but there's a lot of inertia, right?

This is a huge system; people are hiring all the time, there are millions of jobs changing all the time, so it's just hard to change a system like that. At the same time, I have a feeling, and I'm optimistic on this, that the change has been slow, but it's gonna speed up, especially as we figure out collectively how to use AI to make sense of the complexity here, rather than just to add more complexity.

Like applicant tracking systems that are running some mysterious algorithm nobody understands to recommend people; it's getting messier. But I think there's a light on the other side, which I think is what I'm hearing you say as well. Let's talk about the bootcamps.

I worked in bootcamps. It was a huge trend, and I think very useful for a lot of people looking to change their career outcomes for a few years, and now it's taken a little bit of a backseat. Enrollments have gone down a little bit, and people are trying to figure out what is the right alternative pathway that can truly help them stand out in a crowded job field, truly help them prove that they have very up-to-date technical skills. That used to be coding.

Now it's AI. You work with bootcamp providers, you work with a lot of students who are trying to figure out this calculus of, you know, what is the right set of things that I should invest in to make myself stand out? Do you think bootcamps are gonna come back and pivot to AI or other technical skills?

Do you think they're holding on and trying to maintain their status as a meaningful alternative to college? What do you see as the future of the bootcamp space?

[00:53:38] Rod Danan: Yeah, it's funny 'cause, you know, we've worked with bootcamps for a few years, so we saw the industry change, and one, we saw a lot of customers close.

And then with the customers that didn't, and kind of our new partners, we're seeing a lot of expansion actually. So where's that coming from? One is there's a lot of money in workforce. So these workforce boards, you know, they're kind of doling out money to these organizations, and they're getting a lot of students.

So that is one big source. And then what we're seeing on the university side is, for most of our bootcamp customers, well, it used to be, I think Trilogy was the first one to partner with universities. Really what we're seeing is basically all the bootcamps do that now, and they're doing it successfully.

They're growing. And as a result, also what we're seeing for students is, you know, they might go and do an accelerated bachelor's for like three years or something like that, stack on a bootcamp, and then they come out with both the bachelor's and the job training, and now they're more ready than other candidates for a role.

So it's an advantage. It's something that can be used, something you kind of keep in your pocket.

[00:54:42] Alex Sarlin: Yeah, so bootcamps for a while stood out as a clear parallel alternative to school, and now they're intertwined with formal education. You have a lot more university partnership models, you have fewer bootcamp providers, but the ones that are there are bigger and still have the serious clout and ability to bring students in.

And there's this idea that bootcamps are maybe not considered an alternative as much as part of a strategy that students might use to accelerate themselves into the job market.

[00:55:09] Rod Danan: Exactly, as part of the strategy. And you're seeing, you know, bootcamps, which might be a three-month program, but you're also seeing even shorter credentials where people are stacking on these AI courses, and you have a lot more maneuverability in terms of the course material for a bootcamp versus a university that might need a couple of years to build that curriculum.

But AI's changing by the week, so they can get outdated really quick.

[00:55:35] Alex Sarlin: Yeah. One more question. This is bootcamp related, but it's a little bit of an oddball question, and I'm really curious about your take on it. You know, one of the things that I found very strange about the bootcamp space in some ways was that it was very clear that the reason people were going to bootcamps was that they wanted to change careers or launch themselves into a new career.

And bootcamps would always have parts of their curriculum dedicated specifically to resume building, listing your skills, doing your LinkedIn profile, you know, job searching and keeping your motivation up and finding a support system, many of the things that you've mentioned. Students weren't always very excited about those sections, in my experience, which is so strange.

You know, they would get very excited about the actual skills, about the coding, about the design, about the design thinking or the UX, whatever they were interested in. But the job part of it was different, and they would actually avoid it. I saw this really close up. They would avoid the sections of, now build your profile because it's gonna help you get a job, or now start applying.

And I always found that a real weird paradox, and I attributed it to anxiety, basically. It's like, if you're focusing on the material, you're in control. If you are like, okay, now I'm out in the world, out in the cold, trying to get somebody to get me an interview.

Not a lot of control there. But again, you're right in the middle of this, you see it very close up, and you see it in 2025. What do you make of that? It was always so strange to me.

[00:56:57] Rod Danan: Yeah. So one, job searching is tough. People don't wanna focus on it. And it's funny 'cause, I forget, actually it was probably multiple times, I'd talk to people and be like, oh yeah, we have to motivate people to job search.

That's why you have the application, all these things. And they're like, what, you have to motivate people to go get a job? It didn't make sense for some people, but when you're in the job search, it's just like, this is tough, this is tougher than I thought. Yeah, I got the bootcamp and I got this degree and everything was kind of laid out for me, but job searching is more kind of wild west.

So that's one reason, you know, we wanna provide a plan for them, be like, hey, yes, you know, it is a little bit wild, there are no deterministic outcomes at the end, but we're gonna give you kind of a path, and if you do these things, then you're gonna get hired faster. So as a result, we can make all these programs more successful, handle the job search and career part, and get people hired.

That makes sense. 

[00:57:47] Alex Sarlin: So it's structure and guidance. Try to take some of the demoralization and the feeling of, now you're out in the Wild West alone. Instead you're in a community, you have AI support, you have platform support, you have points and, you know, very specific tasks laid out, checking in weekly, all of the pieces you mentioned. That hopefully makes it, you know, fast and fun and changes the psychology of transferring from,

I'm just paying attention to my homework, where if I work harder, I'll get it done, to, I have no idea how many applications I'm gonna have to send out or what I'm gonna have to do to get a job. It could be a week, it could be a month, it could be a year. That's such an uncomfortable feeling, and it feels like you're putting structure around it and using AI to help that structure.

[00:58:27] Rod Danan: Truly, you know, a lot of the time. Yep, exactly. You just need more structure, more time, and, you know, I tell everyone, be patient, put in the work, and things will work out.

[00:58:35] Alex Sarlin: Yeah. Well, that's great. Well, thanks so much. It's really interesting, and I'd love to follow up and see how you are expanding with Prentus. This is such a

weird space. It's a space that is changing in really odd ways, where the educational side is changing, the hiring managers are trying to figure out what to do, like all elements are in flux. So having a structured system to help them connect to each other is really, really important. This is Rod Danan.

He is the founder and CEO of Prentus, a career services platform using AI to help the next generation discover and land their dream jobs. Thanks so much for being here with us on EdTech Insiders. Thanks for having me, Alex. Great. For the Week in EdTech this week, we are talking to Lars-Petter Kjos, a seasoned EdTech entrepreneur with over 25 years of experience.

He has founded five successful companies, including Motivate, which was acquired by Kahoot, and now he's the CPO and co-founder of a really interesting company called We Are Learning, where he shapes product strategy and vision for We Are, a course authoring tool that uses all sorts of really interesting, animated, game-like characters to create immersive, high-quality learning experiences and scenarios.

Welcome Lars to the podcast. 

[00:59:50] Lars-Petter Kjos: Thank you. So nice to be on the show. I'm looking forward to this chat.

[00:59:55] Alex Sarlin: Me too. You have such an interesting background in EdTech. You've been in it for a while, and you've been combining education and learning with gaming, with communication. Tell us a little bit about your background and what brought you to this idea of We Are Learning.

[01:00:09] Lars-Petter Kjos: Well, I've always been shifting between being a pedagogue and working in communication and design. So I did a bit of both when I was studying to become whatever. After finishing university as a teacher, I never entered the classroom. I just stumbled upon some computers and started my own company back in those days.

So I've played around with EdTech since, I guess, '98 or something. Wow. Yeah. And what you're doing with We Are Learning really

[01:00:37] Alex Sarlin: feels like an enterprise EdTech solution. Tell us about what it is and how it uses both game elements and, obviously, artificial intelligence to make really interesting EdTech come alive.

[01:00:49] Lars-Petter Kjos: So, you know, I worked a lot of years with an agency. We were a hundred people doing projects for companies, for enterprises, and that could be any kind of project, right? We could do a game for one company, like serious gaming. We could do VR projects. We did a lot of apps. So we did a lot of those things.

Of course it was super expensive, because it took like 10 or 20 people on a team working 6, 9, 12 months to create one single project. And then after that we started Motivate, which was like a regular LMS company, where we started to say, okay, let's make anyone able to create beautiful-looking learning content and make it fun and easy, as we said.

And with We Are, I mean, after we sold that to Kahoot, I started thinking about, wouldn't it be nice if an instructional designer could actually do all those cool things that we did as an agency, but now without that team of 20 people? And then there's AI, plus experience from the gaming industry; we do also run a game studio using Unity technology.

And then of course the metaverse was there, talking about the new 3D things, and all of these kinds of things mixed together. And also, an obvious missing feature in almost all LMSs is that they're very similar. You know, they have video, text, quizzes, whatnot, and that's kind of it. And if you want more, suddenly you are sent into this agency setup, which is super expensive, right?

So we are trying to fill that gap, actually. You can do whatever you like with your LMS, make all the content you like, but you can just plug in We Are, and then suddenly you can have all those kinds of gamified 3D animations. You can be, if you like, a Pixar animator yourself within seconds. It's super easy.

So that's kind of what we built. 

[01:02:32] Alex Sarlin: It's really amazing. And I mean, I'm looking at it right now. It really does look like Pixar; it authentically looks like very, very professional, high-quality animated characters that you can make and adapt and dress up, and they can speak any language, of course, because of AI. They can deliver any content.

Tell us about some of the capabilities that you had as an agency that you've tucked into this program so that now any individual instructional designer or company can do it themselves. 

[01:02:56] Lars-Petter Kjos: Yeah, if you go back, yeah, I mean, you don't have to go back that many years, or even today, I guess you start up with a project.

Let's say you have a big company. They want to have a safety course or some ethical guidelines course or whatever it is. And then you say, well, we need to engage the audience, meaning we want to make them immersed in the content. And then you start setting up an idea for this. Let's say it's more like a gamified 3D kind of environment where we interact with characters.

And those characters should of course be in your work uniforms, in your actual environments, et cetera. Then you would need to have 3D designers, you need to have scriptwriters, you would need to have sound studios, you need to have animators, yeah, programmers, whatever. So you can think of it like that.

But now you don't. You can do it all by yourself as an instructional designer. You don't have to be skilled in all these different, separate areas. You can put them all together yourself, and with the help of AI, you can animate yourself. It's super easy, just drag and drop. Or you can just click a button, say animate for me, and it analyzes the script.

And it just pulls out the correct animations to put it all together. And you can of course have the script coming out and all that. You know, everyone knows how you can use AI and you can build very fast. But of course there is a flip side to this fast AI building, and I guess you want to dig into that as well because even though we equip people with all the tools to do this very fast, I don't think necessarily that means that anyone can be an awesome instructional designer with a snap.

It's more like, if you are a seasoned instructional designer, you can create amazing, cool stuff very quickly, and you can maybe spend a lot more of your budget on actually creating a good script or working with, you know, the learning objectives or whatever it is. You don't have to spend everything, or all your money, on designers or 3D animators or whatever.

Yeah, yeah. 
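
As a concrete aside, the "animate for me" step described above, where the tool reads a script and pulls out matching animations, can be pictured as a cue-to-animation lookup over the script lines. The sketch below is a hypothetical illustration of that general idea; the cue words and animation tags are invented, and this is not We Are Learning's actual pipeline.

```python
# Hypothetical sketch of mapping script lines to character animations.
# Cue words and animation tags are invented; this is not We Are Learning's
# actual pipeline.
import re

# Cues mapped to animation clips a 3D character could play.
CUE_TO_ANIMATION = {
    r"\b(welcome|hello|hi)\b": "wave",
    r"\b(warning|danger|careful)\b": "raise_hand_alert",
    r"\b(great|well done|congratulations)\b": "thumbs_up",
    r"\?$": "tilt_head_question",
}
DEFAULT_ANIMATION = "talk_neutral"

def animate_script(script: str) -> list[tuple[str, str]]:
    """Return (line, animation) pairs by matching cue patterns per line."""
    timeline = []
    for line in filter(None, (raw.strip() for raw in script.splitlines())):
        animation = DEFAULT_ANIMATION
        for pattern, clip in CUE_TO_ANIMATION.items():
            if re.search(pattern, line, flags=re.IGNORECASE):
                animation = clip
                break
        timeline.append((line, animation))
    return timeline

demo = """Welcome to the safety course!
Careful: never enter the area without a helmet.
Any questions so far?"""

for line, clip in animate_script(demo):
    print(f"{clip:>20}  <-  {line}")
```

In a real product the matching would presumably be done by an AI model rather than hand-written rules, but the output shape, a per-line timeline of animation clips, is the same idea.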

[01:04:45] Alex Sarlin: One thing that strikes me: you know, you mentioned how classic, especially enterprise, instructional design works. You spend a lot of money on videos, you write scripts, hopefully you have a good instructional design team that puts together, as you say, the learning design, the learning objectives, the backwards design.

You really know where you're going, but if you're thinking really big, you do immersive or you do scenarios or you do branching or you do something that goes beyond, you know, classic videos, quizzes, and text. And this really flips it on its head. I mean, immersive becomes the go-to and branching scenarios are baked in, and it just feels like a totally different

type of learning than you expect to see in classic enterprise learning. So as you say, it's agency quality that you can just get in your hands in a desktop tool. So tell us about that decision to go with immersive, scenario-based, and character-based as the lead, as the main method of delivering instruction.

[01:05:43] Lars-Petter Kjos: It's a good question, because my co-founder, Rolf, we've kind of been together since '98, so we have done all this together, but he has told me that what we are trying to do with We Are is actually solving the hard part first, because this is the hard part, right? Just imagine: anyone, I mean anyone, should be able to animate 3D characters within a second.

How could you do that, right? And then, of course, as you say, you have all the branching, you have all those scenario-based learnings. You can set up multiple characters. You can place them in any type of scenario. They can lie in a hospital bed, if that's your case, or sit behind a counter in the back or whatever.

And we wanted it to be super easy for anyone to do that, right? And the reason for this is that when we were working with a more traditional kind of LMS, we got the question, I don't know as a CPO how many thousand times I got the question: can we do branching? Right? And of course, no, we can't. Okay, can we do anything more than, you know, just having this click-next, click-next thing with a quiz at the end?

Exactly. And it's, no, but we have a very cool quiz. Yeah, I get that, but still, you know. And then we say, okay, then you have to do a custom production thing and squeeze it in somewhere. And of course, with that you can get it exactly how you want it, right? But you still need to pay a lot of money. I mean, I think the average immersive

course in the US is something like 150 to 200,000 dollars for one project. And, you know, most companies don't have that money and don't have the time to create those. So that was kind of what we set out to do: well, we want to make this tool that can actually enable anyone to do this, but we don't wanna be the system.

We don't wanna be the platform. We don't wanna build another LMS, because we've done that, and we know it's a feature race like any other, but we say the world doesn't need more LMSs, that's covered. So we said, well, what if we make the tool so that it kind of fits into any LMS? And it does. You can just take that link you get from We Are, paste it in, and voilà, it's set up, it's integrated, and it's just plug and play for any LMS.

So that means that as a customer, you don't have to swap out your LMS in order to get this kind of learning content. You can still continue to use it for compliance training or, you know, tests or whatever it is, fine, and then you can mix it in with We Are content. And I think that's kind of the smart thing that we did from the get-go.

Yeah. 
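
Since branching comes up repeatedly here, it may help to see what a branching scenario boils down to as a data structure: each node holds a line of dialogue plus the choices that lead to other nodes. This is a generic sketch for illustration, not how We Are Learning represents scenarios internally; the node names and dialogue are invented.

```python
# Generic sketch of a branching scenario as a graph of nodes and choices.
# Node ids, dialogue, and choices are invented for illustration; this is not
# We Are Learning's internal format.
from dataclasses import dataclass, field

@dataclass
class Node:
    speaker: str
    line: str
    choices: dict[str, str] = field(default_factory=dict)  # answer text -> next node id

SCENARIO = {
    "start": Node("Nurse", "The patient says the pain got worse overnight. What do you do?",
                  {"Check the chart first": "chart", "Ask the patient directly": "ask"}),
    "chart": Node("Narrator", "The chart shows the last dose was six hours ago."),
    "ask": Node("Patient", "It hurts most when I breathe in deeply."),
}

def play(node_id: str = "start") -> None:
    """Walk the scenario from a node, prompting the learner at each branch."""
    node = SCENARIO[node_id]
    print(f"{node.speaker}: {node.line}")
    if not node.choices:
        return
    for i, answer in enumerate(node.choices, 1):
        print(f"  {i}. {answer}")
    picked = int(input("Choose: ")) - 1
    play(list(node.choices.values())[picked])

if __name__ == "__main__":
    play()
```

The authoring challenge Lars-Petter describes is making a structure like this, plus the character animation on top of it, editable by a single instructional designer instead of an agency team.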

[01:08:02] Alex Sarlin: Yeah. The integration with LMSs makes a ton of sense, and the ability to create these really immersive, scenario-based or 3D-animated educational experiences and then combine them with anything else that might already be on an LMS is very powerful. You know, you have customers like 7-Eleven or Volvo who are using this, and I'd love to hear, you know, obviously

not in detail about any particular customer, but when you have corporate customers like that come in, they're probably not used to having this power to do things that look like Pixar movies, which is literally what this looks like. How do you help them combine that amazing new AI capability with, as you say, the instructional design hat that they're used to wearing: getting those objectives right, making it engaging, making it, you know, effective, and assessing properly?

How do you put the pieces together so that they can use it without just getting sort of carried away and starting to build the next, you know, Cars?

[01:08:56] Lars-Petter Kjos: You know, if you're used to only creating those click-next courses, that's where you go, right? That's what you know how to do, and it can be okay, and it can be good, of course.

And then we come in and say, well, you can create scenario-based learning, gamified. That means you're not only getting a video explaining something; you can actually set up the scenario. You can be in the scenario, you can act, you can interact with characters, you can even speak freely.

Now, with AI, you can talk to the characters, a whole new way of interacting, and there are a lot of different ways to do that. So we do try to train our customers' instructional designers by having a lot of webinars. We do like 3, 4, 5 every month. We have had the top 10, 15 EdTech influencers from the US joining our learning expert, Stina, doing webinars.

We have a community, so we really try to train the trainer, in a way. But there's a difference between customers. Some have kind of done this before, but the expensive version, and they get a jaw drop when they see the potential in our tool and the cost. It goes like, okay, this is out of this world, because we're used to spending like $200,000 on every project, and here we can create projects for, you know, one tenth of the price, one hundredth of the price, whatever.

So that's one group. The other one is the ones that have not even gone that route before at all, because of the cost. So at some point they were kind of interested in it but never went that route, because it's way too expensive, right? So in that case there are different approaches. But it's a very good question.

I think it's a very relevant topic for the learning industry, yes, to start talking about, not just tools, because there are so many tools out there and there's this debate going on. I mean, are AI avatars the new thing? Yeah, of course, it's an amazingly cool thing, but the output is very old school, and no offense.

I mean, it's great, like us having these talking heads right now, right? But it's still a talking-head video. It's very good for converting text into something for people that are struggling with reading and writing. That's awesome, you can do that with a click, but it's just a tool. It doesn't revolutionize learning in any way.

It's just another tool for the instructional designer: use it well, use it smart, right? And I guess the same goes with We Are as well, of course. So it's, you know, finding those perfect tools and putting them together. Yeah. And I think that's what it's all about. I know we

[01:11:22] Alex Sarlin: only have a couple of minutes left, but I'm so intrigued by what you're doing here, and I wanna follow up with two quick questions related to what you're saying there.

One, it is very clear as I look at, you know, what you're doing with We Are, that in this space you have HeyGen and Synthesia, which I think are amazing. They can do extremely realistic, you know, hyperrealistic avatars that look and feel like a person. This is a more playful, fun way.

And you could take, you know, an employee within a company or a mascot of a company and have them deliver the content, or do it in a really fun way. I'm curious, when you go to companies, do they have the instinct to say, oh, we're gonna take our main trainer that we usually video and make a digital version of them?

Or do they wanna create new characters, or do they wanna do, you know, their CEO as a digital version? Where do they go when they have the capability to make something that could be anybody?

[01:12:10] Lars-Petter Kjos: It's a lot of CEOs. Yeah, makes sense. Ones that are creative, or like experts or whatever. And that's cool, because they have the avatar and it can represent them.

That's cool. But I think mainly it's about things like work uniforms, so that the learners can identify with the characters. And I think for our purpose, when we started, I didn't want to end up in the uncanny valley, right? I mean, there's a difference from the Synthesia-type versions, because those are rendered videos.

Ours is like game technology, so there are these kinds of 3D game characters, and there can be a lot of uncanniness going on with that when you get so close to a human look. We wanted to have something that stands out as cool, but where we can also express emotions in a much more exaggerated way without becoming uncanny, because of this character style.

We can use mobile and have, like, eye contact, because we can exaggerate the eyes a bit. Yep. Et cetera.

[01:13:00] Alex Sarlin: That makes a lot of sense. And then building on what you're saying about, you know, going from the old school way of doing corporate training, click through, click here, read this, click here, read this, click here.

If you're lucky, you watch a video, click here, read this. Yeah. We've all been through training like that. The way you have built this technology, it's character-based, it's responsive. As you say, you can ask it anything and it will actually respond, because it has AI under the hood. It potentially creates this world where you've totally flipped that type of training on its head.

And it could basically be there for you when you need help. If you're like, I need to know what's going on at Volvo, and, yeah, I hear we just started with this new technology, you could go ask questions that it could respond to. You can have a conversation, much like conversational AI. Do you see people using it

both open-ended and structured?

[01:13:45] Lars-Petter Kjos: We've gotten the question a couple of times, but I think we decided not to go in that direction, to optimize it for like this internal chatbot, because there are so many focused solutions out there that actually do that. So I think, well, the agent, like the visual agent, makes sense.

Kind of makes sense. So I think this is for the training scenarios, mainly. So that's kind of where we are.

[01:14:07] Alex Sarlin: What you're doing is really exciting, and I think the design decisions to say we don't want to go hyperreal, we want it to be fun, we want it to be playful, we want it to be interesting and engaging and feel like you're watching something that you would watch for entertainment purposes, are really gonna pay off.

I'm really excited to see how the EdTech and enterprise L&D world starts to embrace this type of learning. I wish I'd had learning like this when I was, you know, training at various companies. It just feels like a totally different level of excitement and engagement. Cool. I wish we had more time, but we'll have you back on.

This is Lars-Petter Kjos, and I'm doing my best with the pronunciation, from Oslo, with We Are Learning. He's the CPO and has founded all these amazing companies, including Motivate, which was acquired by Kahoot. What they're doing in corporate training and L&D is really worth looking at. It's called We Are, or We Are Learning.

Thanks for being here with us on EdTech Insiders. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
