Edtech Insiders

Week in Edtech 11/27/2024 Thanksgiving Edition: Edtech Insiders GenAI Market Map, Google AI Summit Recap, OpenAI’s Sora Leak, AllHere Scandal, Amazon Bets on Anthropic, TikTok Study Trends, Free College Tuition and More! Feat. Brett Waikart of Skillfully

Alex Sarlin and Ben Kornell Season 9

Send us a text

Join hosts Alex Sarlin and Ben Kornell in this Thanksgiving edition of "Week in Edtech" as they explore the latest developments in education technology, from groundbreaking AI innovations to edtech controversies and funding milestones.

Episode Highlights:

[00:03:16] 🧠 Edtech Insiders publishes AI tool market map with 300+ tools across 60 use cases
[00:05:39] 🌐 Google AI Summit recap featuring LearnLM and Gemini’s memory feature
[00:14:57] 🎥 OpenAI’s leaked Sora video generator sparks ethical debates
[00:18:08] 📚 TikTok’s study tools trend raises concerns about gamified learning
[00:27:38] 💰 Amazon invests $4 billion in Anthropic to deepen AI ties
[00:32:10] ⚖️ AllHere founder charged with fraud impacts trust in edtech startups
[00:40:53] 🏫 MIT, UT, and others announce free tuition for low-income families

Plus Special Guest:
[00:46:02]
🎙️ Brett Waikart, Co-founder & CEO of Skillfully, discusses skills-based hiring and the future of work.

😎 Stay updated with Edtech Insiders! 

🎉 Presenting Sponsor:

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

[00:00:00] Ben Kornell: I also saw some concerning stuff where, you know, on TikTok, people are now generating video that goes along with their lessons, you know, so basically they have an audio stream of their lesson with a video feed of people playing Minecraft. And I do also feel like we're getting to a place where, you know, if we go with what the user's going to engage with most, we might end up leading ourselves to a dumbing down or watering down of all content and curriculum to kind of feed the ADHD-like vibe of, you know, clickable short-form video.

[00:00:39] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments across early childhood, K-12, higher ed, and work. You'll find 

[00:00:53] Ben Kornell: it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.

And to go deeper, check out EdTech Insiders Plus, where you can get access to our WhatsApp channel, early access to events, and back-channel insights from Alex and Ben. Hope you enjoy today's pod.

Hello, EdTech Insider listeners. We have a Thanksgiving edition of the world's largest edtech podcast, newsletter, and community. I think I can say that with a straight face now, Alex. I mean, it's been a month, but we're so glad to have you join us here today, and we're so grateful for all of you as well. Before we get started, let's jump into what's going on with the podcast.

[00:01:44] Alex Sarlin: Yeah, we have some really exciting episodes coming up next week. We have Laura Ipsen, who is the CEO of Ellucian, which is basically a student information system with a wide, wide variety of services, and it basically has a third or more of the higher education market in the U.S. She had fascinating ideas about AI and the future of higher ed. Ben, you talked to Trish Sparks, the CEO of Clever. That one is coming out shortly after. And then one that is also you: Gavin Cooney from Learnosity, who has been doing fascinating work in assessment. Do you want to tell our audience? I mean, you know, friend 

[00:02:18] Ben Kornell: of the pod, Gavin Cooney, and also Trish was great.

Um, former teacher, former classroom teacher. Great to see a big company like Clever led by a former educator. 

[00:02:29] Alex Sarlin: Absolutely. And you know, just everybody is just getting their head around this amazing time and finding partnerships and finding strategies. And it's, it's really, really exciting to talk to them, especially people who have these huge footprints are already in so many classrooms and so many institutions.

So the big news for us as well is we just published our first market map of AI tools. It's built on a generative AI use case database, in collaboration with the creators of that database: Laurence Holt, formerly of Amplify, now, you know, man about town who does a million important things in edtech, and Jacob Klein, who is formerly of Amplify and now at TeachFX. Ben, do you want to give a little overview of the AI tool database and how that came about and what's sort of going on there? 

[00:03:16] Ben Kornell: Yeah. I mean, I think just like our listeners, we've been trying to make sense of this Cambrian explosion of ed tech tools and apps that leverage AI, some natively leveraging AI and some adding AI on top of an existing product.

So there are many ways to kind of cut that space. What we've realized is that starting with pedagogic use cases is actually the most productive way to understand how tools fit in. It's also really challenging, because many AI tools are category-defying, and that creates really great insights into what the pedagogical purposes of these AI tools are and how people might use them in the space.

So it's both a way to make sense of the landscape, but it's also a starting point for a conversation of what can AI tools do today and what will they be doing in the future. 

[00:04:08] Alex Sarlin: Yeah, it's a great point. And you can find that at edtechinsiders.ai. What's also really fun about it is there's a market map of about 160 companies across 60 different use cases.

That's one single downloadable graphic. It's sort of a classic, you know, Andreessen-style or Reach Capital or Owl Ventures-style market map. But if you go further, there's actually a database of over 300 companies, and we have at least a hundred more that we're, you know, adding now, where you can go really deep and see all sorts of companies.

And as you said, Ben, a lot of them are category-defying. So each one sort of has a primary use case and often many additional use cases. We are also taking suggestions very, very actively, so please check it out. And if you are at a company, or if you are in some aspect of the edtech world, especially the K-12 world, which is sort of the focus here, and you find yourself not on the map or not in the database yet, let us know; there are forms on the site to submit it. And we are constantly looking for new ideas and doing frequent updates. So please let us know; we're trying to sort of help the whole field understand this space. And of course, Laurence and Jacob's pedagogy-first approach has been an incredible way to make sense of it.

Rather than looking at what's out there, you look at what do we need in education and who's trying to fix the problems and make things better. So let's talk AI. There's a bunch of interesting AI news. Let's do a sort of around the world with AI, and then we can go deep into some of the EdTech aspects of it.

Ben, what stood out to you most in the AI world in the last week or two? 

[00:05:39] Ben Kornell: Well, let's start with Google. We were both at the Google LearnLM Summit, and it was really fascinating to see such firepower across the organization. As I've often said, they are the largest edtech company in the world, and yet sometimes it feels like a real rounding error in terms of their overall business.

But they are going really deep on learning and AI. And note that it's learning and not education. There's a sense of learning happening in a lot of different contexts and on a lot of different surfaces. So there were conversations about YouTube and around search and around Google Classroom as well, but also NotebookLM and a number of new tools and surfaces and projects that, frankly, you know, Google is spinning up and shipping out like a startup might.

It's really such a sea change from where they were just two or three years ago. When ChatGPT first came on the scene, Google was lumbering. It was slow. It was very hesitant around AI. And so you have two factors coming together. One, learning is a very, very powerful use case across their tools. And then second, this kind of diving into AI and shipping.

I will say one thing that we uncovered, too, is that the API for LearnLM is now available to developers. So there's the ability to build on top of LearnLM, which is tuned to not just give the answer. It's basically been trained for the purposes of creating a learning LLM rather than just a general-purpose LLM. And so I think that's a really exciting opportunity for developers specifically in the education space. Before we go to the rest of the around-the-world, what were some of your takeaways from the Google event? 

[00:07:29] Alex Sarlin: Oh, boy. I mean, I think I can say with a straight face, as you say, you know, without exaggerating here, that this was one of the most exciting and just sort of eye-opening sets of interviews.

We had this amazing backstage access at the event, this learn-with-AI summit. And there are sort of eight learning leads across Google. Just like you mentioned, there's, you know, the learning lead for YouTube, a learning lead for Google Classroom, which is Shantanu, who we've interviewed in the past. There are learning leads for Grow with Google.

There are learning leads for Google Research, all sorts of things. And we got a chance to sit down with each of them and really dive deep into how Google is thinking about AI and where it's going to go. And I agree with you. I mean, you know, I think we've talked for a while about how OpenAI really caught Google flat-footed way back, you know, I guess it's two and a half years ago now, and just, you know, created this product that just went viral and, I think, you know, just really caught the DeepMind team and the Google team a little bit off guard.

We talked to the DeepMind learning lead too. But I think they not only realized that; they've gotten this incredible A team around learning within Google. Almost everybody we talked to has been at Google for like 10 to 20 years. Many of them came from the search team, which, you know, if you know Google's model, the search team is the team that brings in all the revenue. It's like the super, super-duper A team for Google, right?

It's like the super, super duper a team for Google, right? So they're bringing in these incredible people and they're really thinking holistically about All the surfaces, you've talked about this a lot, but all the surfaces that Google touches all the mobile surfaces, the mail maps, Gemini being in every aspect of what Google can do in Docs and Sheets, thinking about YouTube, I mean, and they've shipped a lot of product, just like you said, really quickly, especially over the last year, Google learn about is this product, it's live right now in Google Labs, which basically combines search and learning where you can, you know, search for any topic, and it will give you a Pure, you know, basically a learning plan and all the videos and all the things you need to learn it incredible There's a Gemini gem that's built to be a learning coach So within Gemini you can ask it I want to learn something and it'll sort of just totally envelop you in a learning mindset and then As you said, Learn LLM, we talked to the head of Google Research, they developed Learn LLM in many ways modeled after MedPALM, which is their medical specific learning model.

And they had made this medically specific model, and it had proven to be incredibly effective as a medical AI model. It was just doing incredibly well, and they said, you know what, we think we can do this for learning. They put this big team together to make LearnLM, and the news at the summit is that LearnLM is now in Google's AI Studio, where you can literally get hands-on with almost any of their models, but also actually call the APIs. You can call the APIs and have them talking to your product. If you're a developer, LearnLM is now there. So that is big news, I think, for the edtech community, because it means that anybody doing edtech development can bring the power of Google's learning model into their product. It's really powerful, and they're really looking for feedback. I mean, they know that this space moves quickly. They're not trying to say, hey, we're done. They're trying to say, hey, let's figure out how to make this really work together.
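
For developers who want to try that, here is a minimal sketch of what calling LearnLM through the Gemini API could look like, assuming the google-generativeai Python SDK and an experimental LearnLM model identifier of the kind AI Studio lists. The exact model name below is an assumption; check AI Studio for the current one and use your own API key.

    import google.generativeai as genai

    # Configure the client with an API key generated in Google AI Studio.
    genai.configure(api_key="YOUR_API_KEY")

    # The model id is an assumption; use whichever LearnLM model AI Studio lists.
    model = genai.GenerativeModel(
        model_name="learnlm-1.5-pro-experimental",
        system_instruction=(
            "You are a patient tutor. Guide the student toward the answer "
            "with questions and hints rather than giving it away."
        ),
    )

    # A multi-turn chat session, just like with any other Gemini model.
    chat = model.start_chat()
    reply = chat.send_message("Why is dividing by zero undefined?")
    print(reply.text)

From there it behaves like any other Gemini model call, so an edtech product can, in principle, swap it in wherever it already calls a general-purpose model.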

So that's exciting. Other things, you know, other news out of Google over the last week: Google's Gemini chatbot introduced a memory feature. That's a big deal, of course, if you care about, you know, the concept of personalized, differentiated, or, as I sometimes call it, precision learning. The fact that the chatbot can remember user preferences, can remember life details, can remember what you've talked about is a big step in the direction of personalized learning. And of course, you know, Google's also in the middle of this antitrust suit with the U.S. Justice Department that's trying to spin out the Chrome browser. And just this week, they started talking about Google's investment in Anthropic. They're starting to question that and starting to say, wait a second, you know, Google has all of this AI power; if they're also putting bets and all this money into other AI products, maybe there's a monopolistic issue there as well. So even while they're doing these incredible things, there's also some, you know, business scrutiny.

[00:11:39] Ben Kornell: Meanwhile, you also have the antitrust investigation going on, where, you know, they're talking about breaking up the Chrome browser and, you know, their search deal with Apple.

So, you know, I think this is one of those moments where we're going to have to really pay close attention to how these big players all pan out, not just because they're the largest learning models and AI models, but also because it creates downstream opportunities as well as downstream friction for anyone else working in this space.

I will say too, for the K-12 education space, if you were building on top of LearnLM, there's a perception of safety that is going to be higher than with some of the other ones. Whether that's earned or not, I'm not technically sophisticated enough to say one way or the other, but this is also a really important thing to think about when you're thinking about your multi-agentic workflows and which, you know, which APIs you are using.

It does give me real hope that the cost is just going to continue to go down, because even a specialized LLM like LearnLM, they're putting out there at very, very low cost. And so, you know, it's a race to the bottom in terms of the cost profile. 

[00:13:01] Alex Sarlin: Totally. So by the way, all these interviews that we're talking about are coming out as EdTech Insiders podcast episodes in just the next week or two. We're doing postcards from the event as well as interviews.

There was a moment, talking to Steven Johnson, incredible author, who is sort of the editorial director of NotebookLM, where they talked about NotebookLM. It really, in many ways, comes out of some of his thinking about research and journalistic research. But he was saying, hey, you know, sometimes I step back and think maybe we're building a sort of AI-first operating system with this.

And I thought that was such an interesting way to look at it. And it just feels like, I mean, Google is going through lots of changes. They have this monopoly, you know, antitrust suit. And also, you know, search itself, the core business, is changing, right? There are new entrants entering search for the first time in a decade, right?

Well, Perplexity and OpenAI's search and things like that. But I think they are looking at it in such an interesting way, and I really look forward to everybody hearing some of these interviews, 'cause I really feel like it's an upleveling of the entire conversation about AI, which is, I think, what you're sort of saying as well with LearnLM, Ben.

So, but Google's not the only player in town. Let's talk about some of the other things going on around the AI space. One of the big stories this week was that OpenAI's video generator, Sora, which was announced six months ago but has not been released, was leaked this week by basically a sort of somewhat renegade group of artists.

That's really interesting. A little bit of this sort of feels like we're entering a, I don't know, Matrix-style, like, insurgent Luddite moment. I don't know what it looks like, but basically they released something that OpenAI did not want released as an act of rebellion, as sort of a little mini insurgency. Their claim is that OpenAI has been exploiting testers and sort of using artists' work without enough attribution and permission and that kind of thing.

What did you make of that? 

[00:14:57] Ben Kornell: You know, big picture, what I made of it is that the controversies and chaos continue to swirl around OpenAI. And in startup land, you know, that sometimes is the cost of doing business. But when you get to the size and scale that they are at now, and the potential influence, it starts feeling like a meaningful business risk to them as well as potentially a risk to humanity.

How ironic would it be if, at the end of the day, whatever AGI model comes to rule us all is leaked because of, uh, you know, a group of people who are upset about, you know, attribution. Like, the idea that one could release source code or the video generator due to a concern of this nature, which I don't want to belittle, but that shows that there's a lack of control and security in OpenAI that should be concerning for everyone. And if these folks can leak it, how do you imagine, in a world of espionage, that things are going to stay locked down?

To the learning point, I think that we've been speculating about video generation as an incredible modality unlock for education. And what I'm seeing is a number of people are kind of figuring out this balance between crafted, really, really human-centered curriculum and content creation, and then AI generation as a layer of icing on top of that cake, to wit, your article on the AI layers. But the idea is that we still need this, like, core content, but that AI creates these generative enablements to engage with that content in all of these new and creative ways.

So it could be a podcast like NotebookLM, or it could be a video, or it could be generated. And so I think we're starting to see how the world of high-quality content is going to interplay with AI. And really, video is the Holy Grail; kids are spending, you know, two to three hours a day on YouTube and TikTok.

So if we can unlock video-generated learning enablement, that could be really, really powerful. I also saw some concerning stuff where, you know, on TikTok people are now generating video that goes along with their lessons, you know, so basically they have an audio stream of their lesson with a video feed of people playing Minecraft.

And I do also feel like we're getting to a place where, you know, if we go with what the user's going to engage with most, we might end up leading ourselves to a dumbing down or watering down of all content and curriculum to kind of feed the ADHD-like vibe of, you know, clickable short-form video. And that is also a concern with Sora out in the wild.

Like, are we going to actually lose our edge with deep thinking and critical thought and spending time wrestling with content in a way that creates more cognitive lift? 

[00:18:08] Alex Sarlin: Such a great connection. And that stood out to me too. This was an interesting news item, actually, uh, flagged in Claire's amazing, you know, AI and education newsletter.

And just to back up and explain what this thing is: it's this concept they're calling PDF to brain rot. And what they mean by that is you take a PDF of something you actually want to study, right? A study material, a textbook chapter, a reading from a class. You upload it into something which turns it into audio; maybe it's like Google's NotebookLM, which turns it into a podcast, or it could just be something reading it aloud.

But instead of a matching video, you pair it with a video that simulates something much more TikTok-y, just the kind of thing you'd sort of zone out on with TikTok, like Minecraft gameplay, or they talk about ASMR. Like, I don't even know what that stands for, but that little, like, sound stuff. I don't see this as a bad thing.

Maybe this makes me crazy. But when we talk about that, I totally agree with you: video is the Holy Grail. Video is where young people are. I think we've gone with this assumption that, if video is going to be educational, it has to be in certain types of formats. I mean, we've seen Prof Jim be able to create, you know, educational video, quote unquote, which is an avatar talking, sort of newsroom-style, in front of a slideshow, or, you know, that kind of thing.

I think that this is actually a canary in the coal mine for a very different kind of learning, which I actually don't think is necessarily bad. Hey, maybe, I mean, the question is what's more efficacious. You mentioned cognitive load. You're right: it's possible that this might actually create more cognitive load and make it harder to retain the material or make sense of it or make meaning out of it.

On the other hand, it might be that this is already how students study. They study with other things on, and maybe the Minecraft gameplay actually is not really adding cognitive load. It's so familiar to them. Maybe it is actually a calming influence, and it makes them more able to focus. We don't know yet.

I mean, most of the research on multitasking says it doesn't really help, so I'm being a little facetious here. But at the same time, I think it's very interesting that students are already starting to put together, basically, education and entertainment and say, how can we patchwork and sort of Frankenstein together these different things that we're doing on a daily basis and see what works?

And I'm intrigued, personally. I don't see this as a, you know, going off the cliff. I see this as a new era of, wow, these new things are possible. And to your point about Sora, by the way, Sora is just one of the games in town, right? Runway's video generator is amazing. Meta has a video generator now. You know, everybody's sort of interested in this.

There's a race for good video generation. When you can make AI generated video, to your point, Ben, nobody knows what the best AI generated video will look like. I mean, maybe it's movies, maybe it's commercials, maybe it's this, maybe it's incredibly calming, you know, gameplay of something really casual and relaxing so that you can actually pay attention to the listening without getting distracted, nobody knows.

So it's really interesting that the fact that we can create, generate content in any way, um, opens up this world of what content works. And I think there's a whole lot of research to be done here. I just don't want to assume an answer. I think it's pretty interesting that people are trying to put together these weird types of video.

It also opens up the language piece, right? I mean, I always come back to this because I think it's so interesting. I mean, Microsoft announced something this week that stood out to me, which is they're trying to do a real-time AI language interpreter inside Microsoft Teams. Let's think about what that actually means. It's across nine languages, and I assume that's probably the nine biggest languages.

That means that if you get on a Microsoft Teams call, just like a zoom call or anything else, you can speak different languages, and it will translate in real time while mimicking your actual voice. It will basically sound like you are talking in the language of the recipient, and you could have many people speaking different languages.

All sounding like themselves, hopefully keeping the same nuance and everything, but they can all understand each other and be on a meeting together. That is crazy. And I feel like it's related to video in that it's the ability to have this sort of hot media, like, much more immersive experiences with others. I don't know.

I mean, it's obviously more human than video, but it feels similar to me in that it's just a way for AI to make the world feel smaller and more connected. So what did you make of that? I mean, Microsoft had a bunch of news this week, but what did you make of that language interpreter idea?

[00:22:41] Ben Kornell: I mean, I think that it is inevitable. It's coming. This is where, a year from now, it's going to look like, of course, everybody has that feature. You know, I've been working with Synthesia, which is a video app that allows you to take a video of yourself, and not only does it translate it into other languages, but it makes your lips move like you're speaking the other language.

So it looks super fluent. What could this mean? One, it could mean that acquisition of the English language becomes less important for accessing career and job opportunities in a remote world. And I'm a little bit skeptical of that, because I do think that for human interaction, an AI filter or interpreter layer still creates some sort of barrier. But you know, if the cost difference around a globalized workforce is just so incredible, you know, people are going to work through that.

I think the second implication of this is that, you know, our ability to localize new edtech products is really profound and powerful. And so your workforce might be global, but also your total addressable market might be far more global with this kind of insta-translation. And that could include everything from the voice components to, you know, immediately translating your entire website and your products and so on. So I'm pretty excited about that. When we start talking about what's the new portal to the internet, this is where, ultimately, I think this is all going to live in the browser, which makes the Chrome versus, you know, Firefox versus Safari debate really interesting.

Because ultimately, if all of this natively lives in your browser and allows for instant translation, you know, that's going to be a layer on top of the entire internet that doesn't even need to be innate to any product or platform. So that's my sense of where it's going to migrate. For, like, kids, I'm just excited for my kids to grow up in a world where they can be, you know, having chats with people from 15 different countries in multiple languages.

And I'm excited for, you know, this is the techno optimist in me, a more connected world. 

[00:25:02] Alex Sarlin: Me too. I mean, I totally agree. And I agree with you that, at least in the short term, there may be this sort of AI layer that is imperfect in between. But I think it's pretty, I mean, think about how, like, Colombia, how Medellín or India or, you know, all these places in the world have basically raised their economic profile by basically having coders.

And coding is a common language, right? I mean, if you're in Medellín and you know how to write Python, you can get a job at an American tech company and make way more money. This is the same thing, but for almost any job, right? If you are a social media marketer and you need to get on a call every week, and you live in Tanzania and you don't speak English, who cares, if you can do a great job?

Who cares? That's really weird and really exciting, because I think that, you know, these language barriers are a big deal. I mean, we've talked on this podcast for a long time about how English is the number one thing learned in the world right now, to your point before. Last time I checked, that's what the British Council has put out.

They said like 2 billion people plus are learning English at any given time, right? So, Why is that? Well, we know why that is, right? Because the English language is a gateway to a lot of different things right now, but if it weren't, a lot of things start changing and it's very, it changes social dynamics.

I mean, you'll have people dating, maybe even marrying each other, without speaking the same language, using AI to translate until, if ever, they learn the same language. You'll have international teams where people never learn the same language as each other, and it never matters. And you can have people absolutely everywhere. That lowers costs, but it also creates opportunities for lots of different people.

It evens the playing field for a lot of people. I don't know, this is the tech optimist in me, and maybe it's overzealous, but I'm excited about that world. And yes, it might live at the browser level. It might live in the Zooms and Teams of the world. It might live in a device you wear, you know, on your throat, or in your ear, I guess, like in The Hitchhiker's Guide to the Galaxy, right, the Babel fish. But it'll happen.

And it's going to happen a lot sooner than we think. It's crazy. I mean, even the Google folks are talking about how YouTube creators can now get their videos translated into dozens of languages, which just immediately, wildly expands their audience, and they love it, of course, right? If you, uh, do unboxing videos in Poland and they can be watched all over the world in any language, that just changes your economics enormously.

So it's really exciting. One more big AI thing we should talk about, which is Amazon and Anthropic. What's the news there? 

[00:27:38] Ben Kornell: Yeah. So we've been talking a little bit about Anthropic in the news with Google, but Amazon is pouring cash, full stop, into Anthropic: $4 billion. And they are basically picking their horse in the race, which I think is a super smart move from Amazon, because if you look at how Anthropic has positioned itself, it is the B2B AI. It is not trying hard to be the consumer door to AI. It's trying hard to be that friend of the leaders of a company. And the features that they've rolled out, which allow for collaboration and, you know, building sketch products and all of this, are very, very powerful on the kind of capability front.

One other thing that I think is relevant to Anthropic is, you know, their market cap and their kind of trajectory have been very different from OpenAI's. And so you could actually see a way in which the ROI on this investment works out, just from a pure investor standpoint. Because of ChatGPT's growth, there's a multiple placed on OpenAI that's a very, very high multiple, and it's not clear to me that it really translates into the total valuation of the business.

That's a very, very high multiple. That it's not clear to me that it really translates into total valuation of the business. Anthropic has a much more reasonable valuation because they're selling. In a B to B fashion, that's very similar to sass, very well known and, you know, really has a sense of lock in and lifetime value where the consumer elements of opening.

I make it really risky for switching costs. So I think, you know, both from a strategic standpoint and from an investment standpoint, Amazon saying we want to be the B to B partner. Kind of ironic given that they have the world's largest store for consumers, but this is where AWS has led them. And I think this makes it a really powerful combo for ed tech providers 

[00:29:46] Alex Sarlin: That's a great point. I'm sort of noticing that Amazon's strategy here might be a lot more in the infrastructure, like the AWS, B2B side. And, you know, one aspect of this that I think goes along with your point is that part of this deal, as I understand it, is that Anthropic is going to use Amazon's chips to do the training.

And, you know, NVIDIA, which, you know, goes in and out of being the most valuable company in the world right now because of its, you know, near monopoly on graphical processing units, the chips that power AI. Amazon wants to be in that business very badly, similar to what you're saying about AWS.

They know that if they have the chips that the whole world's AI runs on, that's, you know, endless profit, just as it has been for NVIDIA. So I think there's also an infrastructure play at the chip level of trying to say, okay, we're going to take one of the, you know, not very many leading AI foundation model creators and sort of pull it onto our platform, onto our chip platform.

So there's definitely a lot of strategy there, and we'll just see how it comes out. I mean, the combination of the Department of Justice looking into Google's investment in Anthropic, um, and maybe challenging it, combined with Amazon putting in another four billion, I think bringing them to around 8 billion in total, like some huge numbers, right?

So, you know, Amazon's just going all in; it's clearly their horse in the race in a very big way. Okay, finally, let's get to some very specific edtech things. The first is a very sad one. We've reported on this podcast for a while now on this sad situation where LAUSD had been working with AllHere, an AI education startup, to create this Ed platform for parents and teachers and learners in LA.

It sort of fell apart spectacularly, and it just got even worse this week, where the AllHere founder Joanna Smith-Griffin, who we've interviewed and talked to, and who, from everything I've seen, has been great in my experience, but as of right now, she's basically under arrest for a whole bunch of different things, basically about misrepresenting revenue, and it sounds like misrepresenting information as a way to get the big LA contract, in terms of who they already had deals with and all sorts of things. And the claim is, you know, defrauding investors of $10 million by misrepresenting AllHere's experience. What do you make of this, Ben? 

[00:32:10] Ben Kornell: Yeah, this one is a tough one. And let's be clear, the EdTech Insiders team, we were excited about Ed in LA Unified.

We also flagged some of the risks. Never in a million years would we have thought of these as the risks. We would have thought that it's an implementation risk or collaborating with LA Unified. And we also have to go with innocent until proven guilty. So the charges and the claims are really beyond the fake it till you make it mentality that people have kind of glossed over on this one.

The charges, or the accusation, is really around fraud. And, you know, this is a message to all founders out there: it doesn't pay to lie. You know, it doesn't pay to falsify things, no matter how good that contract might seem or no matter how big of a deal it might feel. If your startup is going to shut down or fail, this is way worse than any of those outcomes.

And ultimately, you know, this is somebody who had a very, very high standing in the education community, and I think people are shocked that this could happen. And so the repercussions here are kind of threefold. One is more investor diligence. There's going to be, you know, even more focus on, you know, finding out what's going on, verifying all of your contracts, all of that stuff.

So if you are raising a round, expect more friction rather than less. And I know it's been hard just because, given that the investment cycles have slowed, there's already a lot on that plate anyways. Then number two, there's going to be buyer hesitation around, can we really go with this? Shouldn't we wait for somebody else to be a first mover?

And so I think it'll slow down market adoption of really innovative products. And then I think the third one is around the overall perception of edtech. And this is where it hurts the most. You know, we have Byju's, we have Paper, we have these companies that had spectacular valuations and then have, like, plummeted.

But there's a story around, like, ZIRP and, you know, valuations that is a little bit overshadowed by the investigations into, you know, when they kind of came into Byju's home and took papers, and then this criminal case against Joanna Smith-Griffin at AllHere. So that really creates this negative halo around edtech that we're going to have to navigate and deal with.

And these types of things, they don't go away for a long, long time. And so rather than be defensive, I think it's important to own it and figure out, okay, how do we make sure this doesn't happen again? 

[00:34:56] Alex Sarlin: I think two things are also worth pointing out here. LAUSD was the tip of the spear here, but some of these claims actually precede a lot of this. They're basically about investment from 2020.

They're basically investment from 2020. The claims are that, you know, all here was telling investors it was generating almost 4 million in revenue and working with this big set of education institutions, including, you know, Boston, New York, Baltimore, you know, all these big school districts and areas. And then.

Actually, they said the truth was they only were with two of those contractual relationships, and they were only making 11, 000 that year. That's the claim. So, you know, 11, 000 to 3. 7 million. So, to your point, Ben, about the sort of fake it till you make it, there's sort of, you know, there's certain Claims that are sort of maybe a little squishy the founders sometimes make they say oh this this deal is about to close we could expect this kind of, you know, revenue at X, but this is beyond that and they're saying she used some of the money for her own wedding and her like it's a pretty scandalous sort of scandal.

The other thing I think is worth pointing out, and I don't want to, you know, dwell on this, but it is interesting. You mentioned, you know, Paper; you mentioned Byju's. There's also, you know, this is a leader of color, this is a female leader of color, and it still blows my mind. Sometimes it feels like there's still something wacky happening behind the scenes in this tech world, where you have, you know, some of the female founders become these, like, personal scapegoats, right?

I mean, Elizabeth Holmes, or the person who sold the company to JPMorgan, right, the edtech leader. And yet, you know, we don't go around saying, oh, Philip Cutler really screwed up, like, the head of Paper. I mean, he may have some business reputation to deal with, but he's not, like, made into, you know, enemy number one.

We don't say that about Byju. I mean, Byju's, arguably, that was, you know, a $22 billion valuation that's now less than a billion. That's a whole lot more money than the $10 million that we're even talking about here. Yet he doesn't get villainized in quite the same way.

[00:36:53] Ben Kornell: So I think, well, the thing I'd be careful about is this.

These are really apples and oranges situations, because we've got criminal charges filed on the fraud. You know, this is more akin to an Elizabeth Holmes one or a Charlie one, where there are actual made-up numbers. If you look at, you know, Paper, it's a great example of, hey, you know, we're in a moment, should we really go for it?

And they just got out ahead of their skis, but there's zero evidence of fraud or wrongdoing there. And, you know, they used aggressive sales tactics, but, you know, ultimately, to what we know, all the numbers were accurately reported. But I do think you're right on your bigger point, which is people are going to extrapolate from this.

And it's not only going to be about edtech. It's also going to be about founders and who's successful and stereotypes and so on. And, you know, ultimately this is also a failure of those investors to really, really get in and understand. You and I actually have a common friend who looked at AllHere in the early days, and they said, I'm going to pass.

Something seems off here, and I was like, Really? This seems like a really big rocket ship. And they're like, No, I'm staying away. And that was a big signal to me. Like, Okay, I don't get that. But now we're seeing some of those things come to roost. You know, the other thing I would just say here is when doing big contracts with districts, the procurement process is fundamentally broken.

And if you need a case study, this is a great one. There was clearly no evidence that AllHere could deliver on these things, even if you thought the revenue was what it was. You know, we had reports coming in very, very early around Ed from internal people at AllHere saying this is not gonna work, and concerns about data privacy, concerns about the build.

So, you know, even if there hadn't been fraud, I think it glosses over what's a core challenge in our space: procurement. 

[00:39:02] Alex Sarlin: Yeah, I agree. I'm trying to hold both ideas in my mind at the same time. And I know that there is a fundamental difference between, you know, criminal charges and fraud and sort of founder, you know, mistakes or, or doing something that, you know, doesn't just work in a business sense.

Yeah, but it's a tricky situation, and I think it doesn't make the sector look good. And the female entrepreneurs in the edtech space and in the tech space, there are relatively few of them, or leaders of color; I feel like this sets them back as well, and I think we should just be really thoughtful about this whole world. I know there's being fair and being too squishy and being even-handed, but this is a painful one, I think, no matter how you slice it. Let's talk about some happier news. There's some really interesting movement in higher ed this week, which is that we've been seeing these sort of record-low numbers in terms of people feeling like college has a high return on investment.

People have been really concerned about tuition raises for a long time. And we saw a number of pretty big name schools announce this sort of basically free tuition for families who make under certain amounts of money a year. This is not a totally new idea. You know, places like Harvard have done this for a long time.

Harvard and Yale have done this for a long time. But we saw MIT, Carnegie Mellon, the University of Texas system, which is enormous, St. John's College, Brandeis, a whole bunch of schools sort of at the same time say, hey, for families making under X dollars a year, usually 75 to a hundred thousand dollars, sometimes a little more, college is completely free. And in some cases, for students from families making less, even room and board and books and everything is free. So they're clearly trying to sort of change the narrative on college being, you know, out of reach, college being too expensive, the whole institution being elitist.

Do you think it's going to work, Ben? What do you think is happening? 

[00:40:53] Ben Kornell: Yeah, I'm very excited about this. One thing that we have to acknowledge is that there was a debacle with FAFSA, and affordability under the current regime of financial aid was already breaking. And so these policies are not only a response to the ROI on colleges; there are also whole swaths of students in the last admissions class that didn't go or didn't get their financial aid in time to attend these universities.

And so I think this is a great proactive step. I myself had, like, tremendous financial aid from Harvard undergrad because they had many of these policies. My mom was earning, like, 35K a year, and tuition was more than her total annual earnings. So I've personally benefited from these types of policies. I'd love to see them continue.

And ultimately, when you pencil it all out, there's actually, I think, a great case to be made for the ROI for these universities, 'cause some proportion of these students that are getting free access are going to become dedicated alumni and donors who have a vested interest in giving back to the university.

So I think it's an important thing. The European mindset would be, why isn't it free for everyone? Like this is ridiculous that it's even starting here. So I still think we have a long way to go to kind of match the Equitable options and pathways that Europe offers. But to me, this is a step in the right direction.

Notably also, some of these are red-state institutions, institutions in red states. So it does also feel like a cross-cutting thing, like something that we can all unify around and be, you know, bonded around. 

[00:42:38] Alex Sarlin: Yeah, the FAFSA point is fantastic. I'm sure that was definitely a big factor in why you basically have almost like a lost class, a whole bunch of people who didn't go to school because of the FAFSA disaster, and they want to lure them back, because a lot of them are now not on the college path at all.

That's a great point. And I think another piece of this that could be relevant, and I imagine it's part of the conversation as well, is the Supreme Court decision to basically, you know, remove race-based admissions in any way, or sort of, you know, really, really challenge, you know, a lot of the previous admissions policies.

And I think there's a feeling among, especially these, a lot of the selective schools like MIT and Carnegie Mellon, where they're saying, Oh, well, what do we do with this? We thought about taking away the SAT as a sort of gatekeeping issue, but then a lot of them put it back in, including MIT. And then we aren't allowed to use race as a factor, but it can go in the essay and it can go there.

And I think they're starting to have this mind shift, which is maybe healthy, which says, hey, maybe we should sort of move away from race as the core thing and think about money, think about poverty and class. This is sort of what Bernie Sanders has been saying for a decade now: the new split in American life is based on money rather than race.

And it feels like maybe there's some sort of push in that direction for these schools as well. I'm not saying that's the main reason they're doing it, but I think there's something interesting in saying, you know what, let's draw a line in the sand and say people from families that make less than X dollars a year, we don't care where they're from, get free access. That will both support, you know, people of all races in poverty, but it also may look and feel fairer to a lot of the people who have been, you know, concerned about this kind of thing in the past.

So it's an interesting moment. I mean, I think we'll see what happens there and we'll see if they all keep the programs after, you know, this few years, I think they probably will. And I agree. It's a really good thing. It's good for equality. It's good for equity. It's good for the country. I still think there's further to go.

I'm still a little bit on that European side. Like, you know, keeping your sticker price at 50-plus thousand dollars a year and then saying, and if you make less than this, you can go for free, arguably, you know, can create a whole other kind of resentment. So it's going to be interesting.

[00:44:51] Ben Kornell: So many great things to chat about today.

So much momentum and energy in our space. And, you know, it just brings me back to the point around gratitude. Grateful for you, grateful for our EdTech Insiders crew, all of our community members, if you get a chance and you're looking for a way to up your level of commitment, please sign on as a EdTech Insiders Plus member.

You can check that out on our website and Substack. It's been such a great month for EdTech Insiders and also such a great month for edtech. So we hope you'll be following us, 'cause if it happens in edtech, you'll hear about it here on EdTech Insiders. And now to our guest. Who do we have coming up, Alex?

[00:45:34] Alex Sarlin: Yeah. Happy Thanksgiving to everybody. I hope you had a great Thanksgiving weekend. Our guest today is Brett Waikart. He's the CEO of Skillfully, which does scenario-based hiring, basically simulations so that people can try on jobs and do real job simulations as part of the hiring process. Very interesting direction for the world.

And it was a great interview too. So enjoy the interview with Brett Waikart. Brett Waikart, welcome to EdTech Insiders. 

[00:46:02] Brett Waikart: Hey, Alex. Great to be here. 

[00:46:04] Alex Sarlin: So we covered some of the funding round that you got recently for Skillfully on the podcast, and it always struck me that you're doing something very, very interesting that I think should be more broadly amplified, and I think it's really sort of a wave of the future for what hiring will look like and how skills are going to make their way into the hiring process.

Can you first just kick us off by telling us what Skillfully is and how you came to start it? 

[00:46:31] Brett Waikart: Yeah, thank you so much. We live in such an interesting time to be focused on this kind of intersection between education and employment. Skillfully as a company exists to help employers get back to what we think of as human-centric hiring. The real challenge today, and this is something that is generational for somebody who's working on the recruiting side of the table, they've never faced a challenge like this before, is the advent of generative AI, of large language models, of freely accessible ChatGPT back in early 2023. At this point, we were promised that content creation would be free, that we're in this beautiful new world where any type of content, no matter what it might be, can be freely created, freely generated by these tools, and we just get to sit back and be the creative mind behind that.

Well, what we've been founded on is the premise that, hey, a resume, a cover letter, a standard application, credentials and paraphernalia, that's content, just of another type. And what's played out over the last year is honestly really very similar to what we expected: on the supply side, on the job seeker side of the recruitment dynamic, it is now essentially free for any job seeker to use these tools.

Exactly right. To be able to take a job description, to customize their resume to be perfectly keyword-matched, and then also to render Shakespearean prose about how this is the only job that they might have ever wanted, that this is the one, Shakespearean prose that these tools let them produce a thousand times in an hour.

And then it automates the application, sending those materials out. Now, in one sense, that's a good thing: we get new tools to help job seekers be seen in the best possible way. The problem comes when everyone does this, and when you're on the receiving end of all of this, something starts happening.

All of these applications start sounding alike. They start being, like, was this written by the same person? When everybody looks flawless on paper, there's very little hiring signal for an employer or recruiter to make a hiring decision. What Skillfully focuses on is that challenge, where for 500 years, since the advent of the resume, we've all relied on someone's description of their skills, of their ability, of their employability for a particular role. That's what we've needed to use. We can't actually see them do their work at the scale of, you know, however many folks will apply to a given role. So what's the next best thing? Everyone has a resume. Great. Everyone has a LinkedIn profile.

Everyone has a, well, the list goes on and on and on. The problem, though, comes when that no longer is useful for the employer. You have to shift. You have to go from described skills to what we focus on: helping employers see demonstrated skills. So our product here is we basically give employers a platform where, instead of putting up a job description and accepting a ton of resumes, they can put up tasks, scenarios, workflows, the bits of work that somebody would be expected to navigate successfully if they're on the job, and we give job seekers the opportunity to demonstrate their skills by going through those scenarios, going through those simulations. We don't care where you may have learned your skills.

We don't care about the path you took to that point. But Alex, this is where you and I speak the same language: if you are the most skilled in the skills that employer is looking for, you deserve that job. It doesn't matter what's at the top of your resume, what school you went to, who you know. That's the world we're moving into.

And that's what we build for. 

[00:50:18] Alex Sarlin: It's a very exciting vision. And you know, I want to double-click on two parts of what you're talking about. The first is, I think, one that is not what I expected you to say, and I think it's a really interesting one, which is this idea that the sort of asymmetrical marketplace that is hiring has gotten really distorted and flooded by gen AI. I have read stats about people applying to, like, 300 jobs instead of, you know, 30, these really inflated numbers, because, as you say, it's like spam email, right? If it costs you nothing to apply to another job, virtually no time or energy, why not, you know, sort of try to hit the lottery?

And I can imagine that floods the applicant tracking systems. It floods the hiring teams. Talk to us a little bit more about that. You are clearly like seeing that happen on the ground with your customers, with the hiring and HR departments you work with. What does that look like? How are people handling it now in the absence of skill based hiring solutions?

[00:51:09] Brett Waikart: And it's not just dozens or more applications. There was a TechCrunch article a couple of weeks ago that essentially followed one person, a guest writer, who talked about how they used some of these tools: they basically clicked execute on a single script, and that bot applied to 2,800 jobs over the course of a couple of hours.

And they sat back with a cup of coffee and checked in on SportsCenter and they were going about their day. That is not a one off. That is not a trend that we can fight, that we should try to hold back to protect the sanctity of a resume, of that application. What we need to recognize is we're just in a completely different world now.

And this is the double edged sword. We talk about this as the AI arms race for hiring, where on either side of the table, to your point, job seekers armed up first, and they're like, Hey, these tools are wonderful. Watch this. I can apply to not just hundreds, thousands of different roles, and I'm going to win that numbers game.

Okay. Employers were kind of starting with ATS systems that are keyword matching, using a little bit of natural language, a little light AI themselves. Now they're needing to figure out, well, what's our response to this? Again, wild time to be alive, to be building in this space. But what we see though, we talk about like, we go back to history on this one.

We think a lot about, we love this story about when, uh, television first became available. The very first syndicated programs were essentially radio hosts, radio programs, where they put up a camera, they got into costume, and then they just sat there and read the scripts that they had read for the last 50 years since the advent of radio.

And for them, that was what this age of television meant, like, Oh, wonder. And this was, we didn't know that cable or streaming or everything else was coming. Like this vibrancy was not yet visible. That's the moment we're in right now, where when people are like, Hey, I'm getting a thousand resumes. How can I use an AI tool to parse these?

You're in the age of television, working the radio playbook. What we are showing, and there's other companies out there like us, we're not alone in this, but this is the coming wave. I spend a ton of time on campuses talking to students about this new recruitment reality. And the goal is to be able to demonstrate those skills. That is the priority: find opportunities to demonstrate your ability. If you do spend time and money and effort trying to figure out how to get your resume to break through, I guarantee that is going to be the most frustrating job search experience of your life.

The job search today is broken, not breaking but broken, for both job seekers and employers. But there's hope. There is this wonderful light at the end of the tunnel that we're moving towards, where these commercially deployable large language models, these GenAI tools, can be used not just as a chat interface for answering questions but as the building blocks of larger products.

We're going to see a brand new generation of those kinds of platforms that are going to take over. And that's a good thing. That's a great thing for job seekers who want to be seen for their abilities, no matter where they may have gotten those skills. This is moving towards a world that will be more fair, more inclusive, more skills-centric, and this is the path that we're walking down.

This is what we need to navigate today. That's where we are. 

[00:54:35] Alex Sarlin: I want to ask about the equity and fairness component next, but before we do, I just have to give you huge props: that radio-to-television metaphor is one I love. That's a Marshall McLuhan idea, that the content of any new medium is the old medium, until the new medium discovers its own.

Exactly right. That thought has been stuck in my mind for many years. I've always found it fascinating, and I've never heard anybody say it back to me. I totally agree; I think it's a really good metaphor for this exact moment. It's the 'make faster horses' kind of thing, right?

You get this incredibly powerful technology, and rather than seeing what it can do that's truly different from what came before, you use it to turbocharge the old ways: you've applied to a thousand jobs instead of ten using the same process, the cover letter and the resume. By the way, part of why GenAI is so good at cover letters and resumes is that they're so standardized.

It's one of the first things these models learned to do almost perfectly, because resumes look the same and have looked the same for many years. But what I actually want to ask you about before we get to fairness is this concept of signal you're bringing up, because it's not only true of the hiring moment.

This has been true of the entire education ecosystem for a long time. We use assessments or applications as proxies to estimate future success. We've been talking about this on the podcast, but it's part of why I'm so excited about what you're doing at Skillfully: the idea of removing the proxy and saying, you know what, I don't care what you got on this test in high school, I don't care how well you wrote your cover letter.

These were all proxies. The only reason we ever used them is because we didn't have the manpower or the bandwidth to actually ask everybody to try the job, or try a week at college, or try a week at a selective bootcamp or something. It's all proxies. And AI, I think, has the chance to really let us jump over the proxies and actually say, oh, you want to get into the school?

Come to the school virtually, in some way, artificially. Try it, go talk to other students, go join clubs, go see what it's like. You get to learn about it, but we also get to see who you are, how you did in these classes, and how you felt. It could truly change things; it could basically make assessment and application almost a thing of the past over time.

I think you're probably on somewhat the same wavelength, maybe not as extreme. How do you respond to that?

[00:57:08] Brett Waikart: Alex, we should come back and have a longer, three-hour, Joe Rogan-style conversation around this, because we could go so deep. I appreciate you grounding this in history.

Let's do it again: the current education system that we have in the United States, and across the majority of the developed and developing world, is a school system that first came together post-World War II, where the priority was standardization, basically pushing volume through a system to prepare people for very fixed roles in the workplace.

There is still so much of that same mentality implicitly ingrained in the school system today, and I could not agree more that this is the huge opportunity for education. What these tools allow for is individualization, exactly what you're saying. It's the remedy to this very standardized, rote, mandatory-attendance, fixed-major kind of pathway that folks take through education today.

What we can do is cut through decades of that kind of calcified student experience. And listen, we do this work as well. We work with more than 50 different higher education systems; online universities and community colleges are incredibly eager to adopt this type of model.

When we work with an employer, the first step is to translate their job description and break it down into the specific skills they're looking to hire for. We don't stop there. We go from the skill to what we call the task orientation: how that skill is used, what it looks like when it shows up in someone's work. That's the basis we use to determine who's proficient for these particular roles.

The next step for us to take, and again, this is where we're very excited to play a role, is to go and say, hey, these are the skills. Let's say we work with 50 employers sitting around Kansas City, and we know those employers are looking for this body of skills and are hiring for them today.

Instead of a school teaching a curriculum that has been passed down year after year, decade after decade, and has stayed so rigid, let's use those skills as a map. Let's say, hey, I see where there are gaps, I see where we're strong, I see where we're reinforcing and where we're missing the mark. Let me adjust so we more directly teach exactly the skills as they are used, and as they are paid for by employers, when you enter the workforce. I couldn't agree more.
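
To make that pipeline concrete, here is a minimal, hypothetical sketch in Python of the workflow Brett describes: parse a job description into skills, attach a task orientation to each, and compare an employer skill map against a curriculum to surface gaps. The keyword-based extractor, the skill catalog, and all names here are illustrative stand-ins, not Skillfully's actual product or data.

from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Skill:
    name: str
    # "Task orientation": how the skill shows up in someone's real work.
    task_orientations: List[str] = field(default_factory=list)

def extract_skills(job_description: str) -> List[Skill]:
    # Toy keyword matcher standing in for an LLM-based job-description parser.
    catalog = {
        "sql": Skill("SQL", ["query sales data to build a weekly report"]),
        "communication": Skill("Communication", ["summarize findings for a non-technical manager"]),
        "spreadsheet": Skill("Spreadsheets", ["model a budget scenario in a shared sheet"]),
    }
    text = job_description.lower()
    return [skill for keyword, skill in catalog.items() if keyword in text]

def curriculum_gaps(employer_skills: List[Skill], curriculum: Set[str]) -> List[str]:
    # Skills employers are hiring for today that the curriculum doesn't yet teach.
    return [s.name for s in employer_skills if s.name not in curriculum]

if __name__ == "__main__":
    jd = "Analyst role: SQL, spreadsheet modeling, and strong communication required."
    demanded = extract_skills(jd)
    print("Skills in demand:", [s.name for s in demanded])
    print("Curriculum gaps:", curriculum_gaps(demanded, curriculum={"Communication"}))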

Alex, this is the opportunity. We are at the cusp of what might be the single biggest historic shift in the approach to education that we've seen in a hundred years. And if we get back to that fairness and inclusivity piece, that's wonderful, because something else gets wrapped up in going to one of these schools and through one of these mandatory four-year sequences of education.

It's also a filter. It's also a sign of, hey, you're privileged, you're positioned, you got into that school, and that right there implicitly becomes a qualifying factor for your future job. That is not fair. That is not the American dream, basically creating that kind of artificial filter. What we want to build is a world where education has never been so democratized.

Learning a skill has never been so possible for so little money, so little time, so little cost. And if you can learn that skill and be the best with that skill, that's what should determine your employability. That's the larger vision. We're a public benefit corporation; we were founded this way because of that vision, which you very clearly have yourself, for what education could be and what that bridge to employment could be.

That's what we're looking to bring about, and it's incredibly exciting. This could upend decades of incumbency and calcification in both education and employment. That's really what we focus on day to day.

[01:01:02] Alex Sarlin: Yeah, I love it, and it's very much aligned with the way I see things. So let's talk about the fairness piece.

Because this is a part that I struggle with a little internally, right? We live in a world right now where, between resumes, application systems, and a whole lot of different moments in an educational life or in any citizen's life, there's all this cultural capital in play.

It can be used to improve your chances of getting something. You mentioned the who-you-know factor, but you also have people with all these tutors or career coaches, or who hire people to write their resume. I hired somebody to make my LinkedIn for me years ago; I was just terrible at it.

People take these shortcuts because they understand those shortcuts are worth it, and they have the means to do so. And I wonder, and I share your vision, I really do, that in a world where hiring is truly skills-based you will get the best candidates, and it will remove some of the imperfect signal of saying, oh, I just looked at whether you went to a selective school, and that's a proxy for you're probably smart.

So I'll hire you over this other person. That's the brokenness of the current system. But if we get to a skills-based system, will there be a replication of that same kind of world, where people with means, either monetary or cultural, start to say, oh, if I want to get a job at Microsoft, I have to pass this two-day simulation where I have to do all these things?

I'm going to hack it, figure out a way to get through it, gamify it, or go hire a tutor to do it for me. How do we avoid that same thing happening again? Because that's what tends to happen: we try to make standardized tests more meritocratic and then people find ways around them.

How can we avoid that future with skills-based hiring?

[01:02:55] Brett Waikart: That's such a well-framed question, first of all, and this is the work, right? I bring a message of hope for this. I can speak to it a little bit, but I also want to acknowledge that we don't live in a world of readily available silver bullets.

The best we can hope for is to be continuously directionally correct and to continuously make progress in a direction that is more meritocratic, more inclusive, and more accessible. What I can say is that what is different today from maybe ten years ago, maybe even five years ago, is the people in the senior seats, running the HR orgs and the talent acquisition orgs at some of the world's largest companies.

I was just on the phone this morning with one of these companies in the financial services industry. They are one of the best logos, one of the best brands in the business, and they had a history of hiring Ivy League show horses.

However, the leaders at those companies are some of the most progressively minded, creative, and mission-driven people. They see their mandate not just as hiring the best people for their company as judged by the diploma on the wall or the LinkedIn profile; they are invested in creating the future generation of their company.

One that is very skills-centric, very inclusive, that offers the kind of opportunity that can be life-changing.

We can take solace in the fact that the people with their finger on the button, the ones making the hiring decisions and the decisions about which tools to use, share our worldview. They share this view of what's possible, of what's within reach, and that's good news right there. That alone means that if we continue partnering with those organizations, and if they continue to see tools in the ecosystem that can reinforce and actually make real the vision they have, then we're going to make progress.

We will exceed the bar of where we've been, which is the most important part of the work. It's really interesting when you start digging into this. About four or five months ago, the Journal came out with an article that was essentially: the resume is dead, the house is on fire, this is awful.

We're going back to a world where the only choice for the employer is to hire through their connections, to hire from their network, because that's the only thing that works these days. No, no. As soon as that came out, we were waving our hands: no, no, please, there is a different way.

This is where folks like Salman Khan, the founder of Khan Academy, come in. He has been doing some incredible writing recently around AI's promise for breaking through what you're talking about, these layers of pedigree and privilege, these filters that have held so many people back, and for making more accessible the kind of skill development and skill learning that would make a difference for this model of employment we're talking about.

Again, we're not looking for a silver bullet; we're simply looking to be directionally correct. And there are folks in these hiring seats, there are educators, there are leaders at online universities, community colleges, and state schools who see this coming and are looking to make that adjustment.

That's what we need to support. That's why your show is so important, because it highlights that. I'd say those edge cases of people finding ways to game the system, those pockets of unfairness, are an inevitability that shouldn't dissuade us from the work; they're simply something we need to stay proactive and on the lookout for.

[01:06:40] Alex Sarlin: A hundred percent. And I wasn't bringing it up as an objection, just as something to anticipate, because there's so much incentive to do it. In any selective process, there's a lot of incentive for people who have the opportunity to take an advantage to try to do so.

That's the Varsity Blues scandal. I think that was a terrific answer. What I'm hearing you say is that, yes, there will naturally be some distortion from that, but there are ways to encase the entire skills-based system in mission-driven thinking, so that you can design a system that tries to be equitable and fair.

I mean, even something as simple as a blind system, right? Which we've never really been able to do before: the ability to say, I'm going to hire this person because of their score on this simulation. I don't know their name, so I can't be biased against their name. I don't know what school they went to.

I can't know. There's evidence that if somebody says they played lacrosse in high school, they're more likely to be hired, because that's a proxy for a wealthy community, for all these things. So if you actually hide some of the personal characteristics people use to bias and instead truly focus on the actual skills, maybe that serves as another leg of the stool that keeps hiring meritocratic. But I think it's a great point.

And I think you're being very fair about how the future might unfold in that way. So I have one more question for you, because AI is evolving very quickly. The generative AI landscape you mentioned, which people use to make cover letters and resumes, is very quickly going to be able to make immersive experiences; we're really not that far from it.

Video is right around the corner, and then immersion. We just saw Runway launch a feature this week where you can make a video and basically puppet it: you move yourself around and the AI moves the character around, essentially motion capture in AI video.

This stuff changes so quickly. So as somebody building simulations, I'm curious what you think that will look like in the next two, three, five years. What will a simulation of a job, a skills-based, task-based simulation, look like as the AI technologies improve all around it?

[01:08:52] Brett Waikart: I would reframe that ever so slightly.

[01:08:54] Alex Sarlin: Please, go ahead.

[01:08:55] Brett Waikart: No, no, this is the fun part. This is what we think about all the time. You're asking a wonderful question. I don't think it's a question of AI capabilities as a system, and it's not a question of compute as the limiting factor right now. The limiting factor is the distribution of other forms of technology, of AR, of VR, not the underlying logic of our simulations.

Our simulations today are delivered entirely through the desktop. They can go through the phone, because those are the screens every single person is working off of today, and you need to live where you have the broadest level of distribution. Twenty years ago, the idea was, hey, everyone has a resume, so I'm going to focus on resumes.

Right now we're saying, hey, everyone can interact with their smartphone, with a tablet, with their computer; that's where we're going to live. So the frontier has shifted, and the frontier will shift again as soon as we get another form factor for interacting with technology, whether that's glasses or goggles or something we don't know yet.

The AI piece, the ability to create tactile, interactive, dynamic, rich experiences, is divorced from the actual medium those experiences are delivered through. And yes, compute per dollar is going to continue rocketing up this exponential curve.

It's hard to conceive of where we're going to be ten years from now, even two years from now. But that next shift in what a simulation looks like is not going to come from an AI technology. It's going to come from a form of media that we'll have broad access to in the future that we don't have now.

So I'd say that's the dependency we're looking at.

[01:10:48] Alex Sarlin: Very fair. Let me ask a slightly different version then, because I think your point about the distribution of technology is totally right. When I think about back-mapping, if I wanted to hire somebody to do the work I do on a daily basis, well, what do I do on a daily basis?

I'm constantly on Google Sheets, constantly on Google Docs, I'm on Slack, I work in Superhuman, I use all these different tools, and I get on Zoom calls and talk like this. Those technologies are accessible from a phone or a tablet. Do you anticipate a future where simulations are basically tool-based simulations, where you're literally doing the tasks of the job in the tools the job uses, which wouldn't require VR?

Is that what you already do? Is that something you anticipate? 

[01:11:34] Brett Waikart: That's what we already do. That is the approach. Right now, especially for any kind of work that is done through a computer, because we're delivering simulations through the desktop, we can model just about any type of tool, any type of implement you would be using.

That is where we can live, and that is the opportunity today. We have an incredible product and engineering team continuing to push the boundaries there, and that's where we're growing in terms of the depth of our product. What I'd separate out from what you're saying is this: let's separate skills from tools.

Tools will continuously change, and they may change from the desktop to some wearable, some other device. We don't know, but we can assume that's likely to happen. The skills, though, will be stable. Alex, just from this time together and from us talking beforehand, you're an excellent communicator.

You have a wonderful sense of how to draw out of a conversation the interesting facts that are relevant to your audience or to the themes you want to talk about. You can do that across a bunch of different tools, but really your facility with those interpersonal skills is what makes you excellent at your job.

And that is the same spectrum on which we would grade somebody else doing a similar job, who may be better or worse. That is separate; that is agnostic of the tools, of the technology layer. Our job at that layer is to ensure that these simulations are grounded in the tools of today. That is a curve we are firmly on and will continue to ride as it moves out.

The skills will always be what's most important. The ability to get at that skill, to go through the tools to get at the core of what your abilities are, that is the work. That is really the value we can create for the employer, who is hiring based on who's going to be best for the job.

And then there's the value for the job seeker, who has every right to be proud and say, I have the skills for this job, I can do this job, they just can't see me yet, they just haven't recognized that ability. That's where we can focus on skill, elevate it, and really make a difference in how hiring is done.

[01:13:50] Alex Sarlin: That's very exciting to hear. I think that interaction between tools and technologies on one side, and skills, how you use them and what you bring to them, on the other is a very important set of interactions. Personally, I think the skills and the tools often bleed together: new tools engender new skills or require new skills.

And that's especially true with AI. But I agree with you that the skills outlast the tools, right? The tools continually change.

[01:14:18] Brett Waikart: I like that. That's a good way to put it. Yeah. That's a wonderful framing. I agree. Yeah. Well said. 

[01:14:22] Alex Sarlin: It's a future that I'm really excited to be in.

I'm really excited for my two-year-old to grow up in a world where it's not about SAT tutors, resume coaches, or programs that let you apply to 10,000 jobs. What a depressing way to try to make life decisions, right?

Literally, you apply to 5,000 jobs, you get three interviews, and then you take a job. What an awful way to get a job. It's horrible. And hiring managers have to filter through thousands of applicants with no idea how to sort them, and the applicant tracking systems are way behind. We're in a transitional period.

I think we agree. And I really am excited about the future that Skillfully is moving towards. It caught my eye as soon as I heard about you and what you're all doing; it was like, this is the right direction. I know you're not the only one in skills-based hiring, but I think you're doing something very interesting.

So we'll save more conversation for our three-hour, Joe Rogan-style sit-down, getting wings at a table, whatever it is. I really appreciate your time. This is Brett Waikart, CEO and co-founder of Skillfully, and that's skillful.ly, right? That's correct. Yeah. So check it out at skillful.ly: Skillfully, a public benefit corporation, transforming hiring by focusing on people's real abilities instead of their resumes.

Thanks for being here with us on EdTech Insiders. Thanks so much, Alex. This has been a blast. Thanks for listening to this episode of EdTech Insiders. If you liked the podcast, remember to rate it and share it with others in the edtech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
