Edtech Insiders
New Year, New Ideas with Google Part 1: Postcards from the "Learning in the AI Era" Event
In this special episode, Edtech Insiders brings you to the heart of Google’s "Learning in the AI Era" event, held on 11/20/2024, in Mountain View, California. Hear exclusive interviews with the following Google leaders:
- Ben Gomes, SVP, Learning & Sustainability, Google
- Maureen Heymans, VP of Engineering, Learning & Sustainability, Google
- Shantanu Sinha, VP & GM, Google for Education
- Jennie Magiera, Global Head of Education Impact, Google
- Steven Johnson, Editorial Director of NotebookLM and Google Labs
- Yossi Matias, Vice President, Google & Head of Google Research
- Obum Ekeke, Head of Education Partnerships, Google DeepMind
- Jonathan Katzman, Director, Product Management, YouTube Learning
- Lisa Gevelber, Founder, Grow With Google & CMO, Americas Region
and audience members:
- Isabelle Hau, Stanford Accelerator for Learning
- Libby Hills, Jacobs Foundation
- Kristin Fracchia
- Adele Smolansky, AI-Learners
- Ishan Gupta, Paper
From AI-powered learning to tools transforming education, this episode explores the future of EdTech.
The full interviews with each guest will be posted daily this week. Stay tuned for deeper insights from these inspiring leaders!
✨ Episode Highlights:
[00:02:55] Ben Gomes' insights into AI-driven tools like Read Along and personalized learning.
[00:09:28] Maureen on integrating AI into classrooms and bridging in-class and at-home learning.
[00:19:24] Shantanu's reflections on AI-generated content and its implications for education.
[00:21:30] Jennie on educators embracing AI to enhance learning for their students.
[00:29:22] Steven on the origins of NotebookLM and its evolution into a transformative tool.
[00:41:23] Yossi on the development of fine-tuned AI models like Gemini and their impact on learning.
[00:49:41] Obum on making AI education more accessible globally and diversifying the workforce.
[00:59:49] Jonathan on the rise of educational creators and their role in modern learning.
[01:07:08] Lisa on the overview of Grow with Google and its impact on workforce development.
😎 Stay updated with Edtech Insiders!
🎉 Presenting Sponsor:
This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.
[00:00:00] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry. From funding rounds, to impact, to AI developments across early childhood, K12, higher ed, and
[00:00:17] Ben Kornell: work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.
And to go deeper, check out EdTech Insiders Plus, where you can learn more. Get premium content, access to our WhatsApp channel, early access to events, and back channel insights from Alex and Ben. Hope you enjoyed today's pod.
[00:00:44] Alex Sarlin: In this very, very special episode, we are doing postcards from the latest Google AI summit. This was a really amazing event on the Google campus in Mountain View in California. And this episode is composed of a number of short, short interviews. And the way it works is we have Sarah Morin, who's our community manager, who was on the floor of the event, interviewing all sorts of interesting people who were there to understand more about what Google is doing and to come together as a community.
And then we're interspersing them with short clips of interviews that we did with all eight of the Google learning leads: that's the learning lead of YouTube, that's the head of Google Classroom, that's the head of Google Research. I mean, really incredible people. Now we have episodes coming up very shortly, which are the entire interviews with each of these Google leads.
And they were truly amazing. Very clear glimpse into the future of AI and technology, but enjoy these clips. You can get a little bit of a sampler of all these amazing, really brilliant Google leads, as well as a little bit of a chance to feel what it was like on the floor of the Google learning with AI summit.
Enjoy.
[00:01:57] Sarah Morin: All right. If you could introduce yourself and share one of
[00:02:00] Isabelle Hau: your takeaways from your morning at Google. Isabelle Hau, I'm the executive director of the Stanford Accelerator for Learning, and really enjoyed the last piece of this discussion this morning around the future of reading, which led me to a lot of, you know, really interesting thoughts around where are we with AI?
Do we really need, you know, reading and writing? What are the core skills and competencies that, uh, our children will need in the future?
[00:02:27] Sarah Morin: Yeah, is there anything that you're working on at the Stanford Accelerator right now that you want to share that kind of ties into those themes you're
[00:02:33] Isabelle Hau: noticing? So we are doing a lot of work on early literacy for sure.
One of the areas of work that I'm really excited about is collaboration with AI. So we have, uh, uh, big projects going on across, uh, students and faculty looking at how AI can be used for collaboration. Amazing. Thank you so much. Thank you.
[00:02:55] Alex Sarlin: Ben Gomes leads the Learning and Sustainability Organization at Google and is an advisor to the Google Search organization.
He oversees the teams focused on harnessing technology to scale education around the world. The learning team supports educators and learners around the world with products including Google Classroom, Google Scholar, Google Arts and Culture, and an AI-based tutoring app, Read Along. They are also creating academic and learning experiences on Google Search and YouTube.
As one of Google's first principal engineers and an early Google Fellow, Ben also led the Google Search organization, which included Google Search, Google Assistant, and Google News, for more than 20 years. Prior to joining Google, Ben earned his BS from Case Western Reserve University in Ohio and his PhD in computer science from UC Berkeley.
He was born in Dar es Salaam, Tanzania and raised in Bangalore, India. I want to ask you about your history because you have a long tenure at Google. You started in something very different than education. Tell us about your Google history and what brought you to this sort of education specific type of role.
[00:04:02] Ben Gomes: So I've been at Google a long time and I was in search for about 20 years. And, you know, from the early days of search and search is a fascinating technical problem and focuses on language. But, you know, for me, I've always been interested in education and learning just because there's no way I would be where I am today.
If I had not gone to a good school, if my mother was not a school teacher, all of those things were formative for me. And I kept going back to India to visit schools to try and see, what kind of a role can technology play in schools? And so that is something that's an ongoing interest for me, even while I was working in search during vacations and other times.
So after 20 years, I wanted to work on learning or education, and I thought about various ways to do that. And then I realized, after conversations with many people, that Google is a place where many of the products are really learning oriented. You know, our mission is to organize the world's information and make it universally accessible and useful.
Well, one of the most useful uses of information is to transmit learning, right? Yes. And people are, unsurprisingly, using our products to learn, whether it's search, whether it's YouTube, Classroom, Google Scholar, Google Arts and Culture, many of these things are used primarily or importantly for learning.
So I was like, well, I already have a background in what Google can do. Can I make use of my background here, as well as learn more about learning, and work with teachers and pedagogical experts to bring more of that learning into our products? And then when the tsunami of AI hit, you know, it became important to also think about, well, what new things can we build?
[00:05:35] Alex Sarlin: Yes.
[00:05:36] Ben Gomes: And so we're thinking about new, new ways to explore the space. You saw, many of you have seen NotebookLM, which is an exploration of the space that clearly has applications for learning. We're working on this Labs experiment called Learn About, which again is trying to break up a complex topic and give you scaffolding during learning, providing these pedagogical bits in the context of learning to make you stop and think, to get you to be more actively engaged.
And so there's, that's just the beginning, right? So there's so much more that's possible there. Yes.
[00:06:06] Alex Sarlin: Yes. You mentioned Google Arts and Culture, and there's also Google Books. I mean, you know, Google has been trying to sort of chronicle the world's information for a long time. Yeah. But a lot of it, as you sort of mentioned, it's amazing, but it's sort of out there to be found.
Right. Somebody can go look inside a museum like the Prado if they are so inclined, or if they have a teacher sending them on a field trip, but it doesn't sort of come to them. Um, and it feels like there's an incredible opportunity here. If somebody is searching for something, or in their email, there are so many entry points where people might find it interesting to access some of that information. I'm curious how you see that ecosystem playing out, sort of proactively pushing learning to people.
[00:06:43] Ben Gomes: I think that's a really interesting question because I think, you know, it is true that many of our products, people come to us with a particular need, but I think if we can stimulate their curiosity in that moment, we can take them to deeper experiences where they learn more.
I just saw, you mentioned Google Arts and Culture, I saw this wonderful experiment that the team has done, which is giving you an audio guide when you're in a particular location virtually. Now, many of us are not going to get to visit many of these amazing locations. But to actually be there with something that's a guide based on exactly what you're looking at is really quite wonderful.
This is really kind of incredible as an experience, right? So to get to experience things that you otherwise couldn't, I think there is a challenge because so many new things have been created. How do we actually bring them to users at the right moment and in the right way? But you know, it's a challenge that we've been working on for a long while.
And so I think I have confidence that we'll get there. It'll take us a little bit of time to figure it out, because we want to make sure we keep the user experience really good, to give them what they actually want and not just what we think they want. So we have to work through this carefully, but I think there's a lot of potential to bring many of these experiences to our products over
[00:07:54] Alex Sarlin: time.
Absolutely. I had a personal experience like that. I was in the Metropolitan Museum of Art in New York, standing at the Temple of Dendur, and I suddenly said, I wonder what would happen if I asked AI to explain this to me in real time. And it was like, not only could it explain it, it could tailor it to what I really wanted to know.
I could say, I'm looking, and I realized, oh, wow, for every piece of art in this whole museum, I could say, I'm looking at this piece, and it'll tell me the history of it, it'll tell me about the artist. And it was like this little personal epiphany, but I mean, everybody should have that ability. It's incredible. It's a phone. The phone in your pocket can do this for you already.
[00:08:28] Ben Gomes: It is absolutely incredible. I mean, especially when you think about the context of art, people come from very different backgrounds. There are people who have studied art history in college, and there are people who have never been exposed to a certain kind of art until much later in life.
And yet, art can unlock wonderful things in our minds, if approached the right way. So art is almost a perfect example of bringing the thing that you want to learn, in a personalized way, to each individual based on what your interests are. What is it in your mind that could relate to this work of art?
Right? Because many people, I think, see art as maybe an elite pursuit, but it shouldn't be; it really should be accessible to everybody. And it should be accessible in a way that, you know, helps people change their lives, because it can bring joy, it can bring insight, it can bring a whole host of things that you otherwise might not have access to.
But that's where I think AI can play this role of bridging that gap. And I'm beginning to see the beginnings of that.
[00:09:28] Alex Sarlin: In this next conversation, we're speaking with Maureen Heymans, the VP of Engineering in the Learning and Sustainability Organization at Google. Maureen leads initiatives to revolutionize learning and education within Google products such as Search, YouTube, Classroom, and more.
Her work centers on harnessing AI to create personalized tutors for every student and teaching assistants to empower every teacher, ultimately democratizing access to quality education so anyone can learn anything. Additionally, she is a lead in Google's sustainability organization, harnessing information and AI innovation to identify solutions for combating climate change and helping build a sustainable future for everyone.
Previously, she led the Google app and discovery engineering team and worked on improving Google ranking for many years. She started at Google in 2003 after getting her master's degree in computer science at the University of California, Santa Barbara. Maureen is originally from Belgium and moved to the United States in 2001.
She completed her master's degree in engineering with a specialization in applied mathematics and a minor in computer science at the Université Catholique de Louvain. How do you
[00:10:42] Ben Kornell: think about the interaction between learning and education and education systems, and how does that come into play in how you build your products, tools, and systems for those different stakeholders?
[00:10:55] Maureen Heymans: Yeah, that's a great question. So, of course, you know, we think teachers are central to the learning process, right? And so, as we think about learning, we really make sure that, you know, those tools are vetted by teachers and, and we really empower teachers with the right tools. So
that's where we are partnering closely with the classroom team on providing a lot of those tools in the classroom.
Both, like, tools that can improve teacher productivity, lesson plans, you know, practice, but also learner tools that they can decide they want to enable in the classroom. At the same time, we know that, you know, students are in the classroom for a few hours a day, and then they still need to do a lot of learning outside the classroom.
And you know, a lot of the time, what they do is they go to some Google product to get that support. Right. Right. And, uh, whether this is for the homework, you know, we see a lot of homework queries on search and Gemini, but also because they couldn't understand a concept, and so they go to YouTube. And so we feel like that's a continuation of the learning, right?
And, and that's where we can support them 24/7. And they have that tutor that can help them, that can be completely judgment-free and can really, over time, adapt to their learning needs, right? And, and so that's where we've been super excited to, to work on YouTube and Gemini, so we can extend that learning outside the classroom.
But, you know, of course, trying to connect it back when we can, uh, with the classroom. So, something that, for example, we are looking into is we would love to adapt learning based on the student's textbook, right? Or the curriculum that the teacher provided. So imagine that a student could, you know, either say, that's my textbook,
and we have a partnership with the publisher, or also take pictures of the curriculum or the notes from the classroom, or whatever content the teacher provided. And now we can automatically generate a study plan to help them get prepared for a test, or we can generate practice problems based on what they study in the classroom.
So you can imagine that, you know, for those students that, you know, might not have understood everything in the classroom or need to continue practicing, we can continue that learning journey outside the classroom. So I think that that's super exciting, you know, and if we can connect the two even more strongly, I think that makes it even more powerful.
[00:13:15] Alex Sarlin: Yeah, it's a great point. Students are sort of already bridging the gap between formal education and informal learning by Googling.
[00:13:22] Ben Kornell: It's a preexisting user behavior. So you're really
[00:13:24] Alex Sarlin: enhancing something that students are already doing. Yeah. Yeah. The YouTube folks said that, you know, a huge percentage of college students come back after their first day of class and use YouTube to supplement their learning and, you know, sort of make sense of it.
So the idea of making that easier to connect, making sure it's accurate, and making sure it actually ties into what they're actually learning, in the context they're learning.
[00:13:46] Maureen Heymans: Yeah. And, and I think especially if we can ground that in how they learn in the classroom, that's even more powerful, because sometimes there are multiple ways to explain the same concept, or, like, based on your district or the country you live in,
it's going to be explained in different ways. You know, the way my kids are learning arithmetic is completely different from the way I learned it. But now, if we can ground that in the way that they have learned in the classroom, it would make it easier for them to build on that prior knowledge.
[00:14:13] Alex Sarlin: You've mentioned project-based learning, and one tool that Google has put out that I think has flown a little under the radar, but is really fascinating, is this learning tool. Yes. It sort of bridges Search and LearnLM. Can you tell us about that tool and sort of the origin of it and what it does?
[00:14:28] Maureen Heymans: Yeah, that's a tool I'm super excited about.
I'm proud to see my team build it. So super cool. But really the way we, we look at this was that, okay, if you look at search, you know, we have the power of the web, right? The web is so rich. There's so much great content. How can we pair that with the power of AI and then also infuse learning principles?
And so that's how we came up with, uh, Learn About, looking at, uh, how people learn the best, right? And so as part of this experience, you can have a conversation with a tutor. And we definitely see through our research that that's a big part of why students are so attracted to AI tools. They want to be able to ask follow-up questions, to have a conversation.
So this is a conversational experience. But we also try to provide you with a bunch of tools that can help with learning. So first we have this lesson plan on the left, which is kind of a scaffolding, so you can get a sense of the space, you know, you can get a guided conversation. But then we also, you know, in the experience, bring those, what we call, learning resources.
So it will have things like, you know, interactive lists, which kind of break down problems into subconcepts. It will bring things like "why it matters," which I think for me is so critical, to explain why this concept we are learning in school, you know, matters in the real world. It will bring misconceptions.
It will bring vocabulary. And it even brings what we call lightweight practice, so that little quiz to make it fun, because everybody likes to, to do a quiz, right? And so, yeah, by bringing all those, you know, learning principles into the product, and then bringing the best of the web and the best of AI that brings conversation, you know, the team has managed to build this Learn About experience. And it's been pretty amazing to see the positive reactions, both internally and externally, and how people are using it, both for curiosity but also for learning.
And, and so we're definitely excited to bring it to more users. It's available on Google Labs, but we definitely want to make it more accessible.
[00:16:25] Alex Sarlin: Yeah. And it sort of starts out looking almost like a search experience. You say, what do we want to learn about? And then you realize, wait, no, this is a conversation, and it's generating all of these really, you know, guided, specific, you know, tools and resources to help you learn. Really fascinating.
I mean, you know, we're in the field and didn't come across it until relatively recently. I don't know how long it's been out.
[00:16:44] Maureen Heymans: Yeah. I mean, that was also a fun project because we decided in this world of fast evolving technology, we really need to embrace this culture of rapid prototyping in the team. And we're like, you know, things are moving so fast that if you have a six month plan, by the time you're done, it's obsolete.
So we really tried to do rapid sprints, where we kind of, every week, try it out with some, you know, UXR, and get feedback and, and really iterate quickly. So the team was able to build this, I would say, get a quick demo in like two or three months, and then iterate and get it out on, on Labs in a few extra months.
And, and so, yeah, it was really fascinating how quickly they were able to iterate. One thing I will mention that's super cool: actually, most of this was built with just prompting and simple techniques, like RAG. And of course we build on amazing models that we get from Google, but the team didn't need to do, like, expensive fine-tuning.
And, you know, we just leveraged the underlying model, prompting, you know, the fact that the models are good at instruction following, and, and really played with that big prompt. And, oh, you know, it's, it's fun to see that the "code" is just a big prompt, right? And so that's also enabled us to move super fast.
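To make that concrete, here is a minimal, hedged sketch of the pattern Maureen describes: no fine-tuning, just retrieved context and pedagogical instructions folded into one big prompt for an instruction-following model. The toy corpus, the keyword retriever, and the call_model stub are assumptions for illustration only, not the actual Learn About implementation or a real Google API.

```python
# Sketch: "big prompt + simple RAG" with no fine-tuning (illustrative only).
from collections import Counter

# Toy stand-in for retrieved web content; a real system would use a search index.
CORPUS = {
    "photosynthesis": "Photosynthesis converts light energy into chemical energy stored in glucose.",
    "chlorophyll": "Chlorophyll absorbs red and blue light and reflects green light.",
    "respiration": "Cellular respiration releases the energy stored in glucose.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Naive keyword-overlap retriever standing in for real retrieval."""
    words = Counter(query.lower().split())
    ranked = sorted(
        CORPUS.values(),
        key=lambda doc: sum(words[w] for w in doc.lower().split()),
        reverse=True,
    )
    return ranked[:k]

PEDAGOGY = (
    "You are a patient tutor. Break the topic into sub-concepts, explain why it matters, "
    "note one common misconception, and end with a short practice question."
)

def build_prompt(query: str) -> str:
    """Fold instructions and retrieved context into one big prompt."""
    return f"{PEDAGOGY}\n\nSources:\n" + "\n".join(retrieve(query)) + f"\n\nLearner question: {query}"

def call_model(prompt: str) -> str:
    """Placeholder for an instruction-following LLM call."""
    return "(model response would appear here)"

print(call_model(build_prompt("How does photosynthesis work?")))
```

The point of the sketch is the shape of the system: most of the "code" really is the prompt, which is what lets a team iterate on the experience week by week.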
[00:18:00] Sarah Morin: All right. If you could introduce yourself and share one thing from your morning at Google that you thought was really fascinating or stood out to you. Definitely.
[00:18:07] Libby Hills: So I'm Libby Hills from the Jacobs Foundation. By night, I'm also a podcaster. So I've just released an episode with some of the Google and LearnLM folks on our podcast, Ed-Technical.
So it's also great to meet people in person here today. And one thing that I find really thought-provoking and interesting about today specifically relates to K-12, which is what I spend a lot of time, uh, thinking about and working on, and this kind of real conundrum around how to think about this trade-off when it comes to K-12, between products that respond to students' immediate need, what they're looking for when they come to a product, which might be, you know, homework help, let's say,
responding to students' need for that, you know, immediate request for homework help, but at the same time, delivering some sort of a learning benefit for students, because we all know that, you know, really deep and meaningful learning requires some productive struggle, which isn't always that fun. And so I think it's a really interesting problem, if we're thinking about products that are designed to go
straight to students, uh, K-12 students, you know, 13, 14 year olds: what is the right balance between, uh, giving them what they want, but also maintaining some sort of, you know, learning benefit, given that, um, that learning benefit isn't always easy to deliver? Amazing. Thank you so much, Libby. Awesome.
Thanks.
[00:19:24] Alex Sarlin: In our next conversation, we talk to Shantanu Sinha and Jennie Magiera. Shantanu is the VP and GM of Google for Education. His organization develops products like Google Classroom and Read Along for over 150 million teachers and learners around the world. Before Google, Shantanu was founding president and chief operating officer at Khan Academy,
where he helped make personalized learning globally accessible and free. Shantanu is also a founding board member of Khan Lab School, where his three children attend school. Shantanu has leveraged his consulting experience at McKinsey, along with his computer science, math, and cognitive science background from MIT, to make an impact on education.
Jennie Magiera is the global head of education impact at Google. She's also the best-selling author of Courageous Edventures and the founder of the nonprofit Our Voice Alliance, whose mission is to elevate marginalized voices and perspectives to improve equity and empathy in education. She also serves on the board of Saga Education.
Previously, she was the Chief Innovation Officer for the Des Plaines School District, the Digital Learning Coordinator at AUSL, a Chicago Public Schools teacher, and a research assistant in Carol Dweck's Columbia University lab. A White House Champion for Change, ISTE Impact Award winner, Working Mother of the Year, and TEDx speaker, Jennie works to improve education globally.
She served on the TWG for the U.S. Department of Education's National Ed Tech Plan, the NAEP Delivery and Technology Panel, and the Teach AI Advisory Committee, and has been featured on NBC's Education Nation, C-SPAN's Reimagining Education, TEDx, and NPR. Her proudest achievement is her daughters, Lucy and Nora. So, you speak to educators all over the world.
I'm curious, for some of those who are really embracing that role of, oh, I'm going to do amazing things with this, what are some of the things that you're seeing them sort of jump into and say, oh, I'm not just a passive consumer of this, I'm going to do something amazing for my students. What kind of things have you seen?
[00:21:30] Jennie Magiera: So many. So many. I love this question. I was actually thinking when I was listening to you talk about like how prevalent all the different use cases are and how we're seeing like this wide range of like, we don't even know how they're going to use it. Totally. And I remember when I was a fourth grade teacher on the south side of Chicago using Google Docs when it was consumer Google Docs before it was Workspace.
Like that's such like an open, it's literally a blank canvas. Literally. Open a Google Doc and it's literally a blank canvas. And I was a math teacher. I taught self contained fourth grade math using Google Docs. And it just felt so magical. And I think it's because educators are a species of MacGyvers.
We're so under resourced that you can give us a light bulb, a piece of duct tape, and a toothpick, and we'll build a car, right? And so, and so like, you know, AI all of a sudden is so much more than a toothpick and a roll of duct tape. It's a Ferrari. So it's like, instead of saying like, build a car, it's like, okay, here's a fully built Lamborghini.
Now what are you going to do? And what they're doing is incredible. We have this pilot network of educators around the world that we're calling the Google AI edu fellows. And the, how might we statement we went into
this with was: if we equip educators with the most powerful versions of LearnLM, of Gemini practice sets that we have, and then give them support, community, resources for three months, what will come out the other end?
And we started it because we've realized that, as a global education team serving the world, sometimes, because our headquarters are in the United States, we start in the U.S. But let's be more equitable. So we started in Japan and Korea, where they've been doing some really wild things with AI forever.
And I don't say that just as a Korean person. At the end of the showcase, we're in Tokyo and there's a music teacher, and the music teacher was, like, crying, sharing their showcase. And they were saying how they have been teaching the same music class for decades. And in the time allotment, the time band they get, because they have to teach composition and music, they spend so much time on the technical bits.
They never get to the human music composition. When they brought Gemini into the classroom, they said, okay, let me use Gemini to short-circuit this path to technical mastery, and then let's, like, get right to the human part that AI can't do. And they said that in the decades of teaching music, they have never, ever seen more
emotional pieces of music than when they brought AI in. And AI wasn't creating the emotional bits. These were the humans, the students. AI was giving them a shortcut so they had time, within a very concrete constraint (you have X weeks of this course), to dive more deeply into it. And that wasn't the, like, you know, I don't know, I wasn't there when they invented Gemini, but I don't think that they were like, let me help a grade seven music teacher in Tokyo get human compositions.
And I just, it was one of the most powerful moments because I was not expecting to go hear from a music teacher during the same showcase.
[00:24:43] Ben Kornell: Yeah. There's a way in which the AI is actually creating the space to be more human than we've been able to be. Taught middle school: 150 papers to grade. By the time you're on the 20th paper,
you kind of have some repetitive feedback for people, and this is really opening up educator capabilities. By the way, for those of you playing at home, we've got the Hair Club for Men reference, and we've got the MacGyver reference. So if you've got your bingo card, just be ready. You know, I'm really fascinated at this conference,
talking about this reverence for educators and their role in the process. Another thing that people have really been debating is the role of content. And when open education resources were kind of launched more than a decade ago, many predicted this is the end of paid content; you know, content had been king, and now all the textbook companies are going away.
And in a weird twist, it was the reverse, like content remained incredibly valuable. And now with AI generated content, similar predictions are happening. And in so many ways, what you've built at Google Classroom and across the Google ecosystem is an incredible delivery system for content, but how do you think about the importance or lack thereof, or the role of content in education going forward, especially as you're supporting schools?
[00:26:06] Shantanu Sinha: Yeah. Great question. And I think content is really interesting because I think there's a few different trends that you're starting to see. One is the kind of much more accessible creation of content, which is, I think, a trend we started to see with OER and with YouTube, for example, right? Where anybody can put up a video and start to teach something, and that has just continued over time.
You know, short-form video now, like, you're seeing more and more people participating, podcasts, et cetera. And honestly, teachers are some of the most prolific content creators. How much content are they creating every day? But often, being able to share that with the world, being able to get it out there, is what all those tools enable, and more and more of those tools are coming.
Now, I think with AI, you're seeing a different dimension to this, which is you have transformation of content, right? We can take a textbook or we can take a PDF article, and you can transform that into a podcast, right? And AI can allow you to do that. It can allow you to move content into different form factors.
And I think we're in the very early stages of that. There are a few amazing demos that blow our minds when we see them. But if I think about where the next few years are headed, you really can imagine a world where, for all content, you can move from text to video to interactive dialogue, all of this, seamlessly.
So it's going to create a world where the creation and the consumption is much more democratized in a lot of different ways. Now, what does that mean for the role of content? I think ultimately, the importance of having high quality content, having high quality grounded content, remains with AI, right? And I think that's still going to be true, right?
When you're working with an AI chatbot, you will be far more confident if you were doing physics and you knew it was grounded in the OpenStax physics book than if you're not. And I think that role is going to really remain. But I think it's also going to make this stuff much more accessible for a lot of different people to consume in a lot of different ways.
[00:28:09] Alex Sarlin: In this next conversation, we had the privilege of talking to Steven Johnson,
who's the editorial director of NotebookLM and Google Labs. Described by the Wall Street Journal as, quote, one of the most persuasive advocates for the role of collaboration in innovation, Johnson is a best-selling author of 14 books on science, technology, and the history of innovation, including Where Good Ideas Come From, The Ghost Map, and his latest, The Infernal Machine.
He's also the co-creator and host of the Emmy-winning PBS/BBC television series How We Got to Now and Extra Life, and is a contributing writer at the New York Times Magazine. In addition to his work as an author and television host, Johnson co-created the first web-only online magazine, Feed, the Webby Award-winning community site Plastic.com,
and the hyperlocal news service Outside.in, which was acquired by AOL in 2010. He was awarded the Newhouse School Mirror Award in 2009 for his journalism, and he recently received the Pioneer Award in positive psychology from the University of Pennsylvania. He lives in Marin County, California, and Brooklyn, New York, with his wife and three sons.
[00:29:22] Steven Johnson: So in early 2022, I wrote a piece for the Times Magazine that was basically about GPT-3, and kind of saying, like, people, computers have mastered language. Like, forget about AGI and all these other things; just this fluency is going to be revolutionary for everything we do with computers. Which was a bizarrely controversial piece at the time.
I got a lot of pushback for it, but two people at Google who had just, you know, co-founded, really, Google Labs (Clay Bavor, who's since left, and Josh Woodward, who now runs Labs) had read a bunch of my books, and they read that piece, and they were like, I wonder if we could get Steven to come to Labs part-time and basically, like, build this software that he's always wanted,
that he's been chasing his whole life, now powered by language models. Amazing. So they just kinda, like, cold called me out of the blue, like, cold emailed me, and they pitched me on this idea, and I said, that sounds really fun, like, where, like, where do I report? So I started part-time in the summer of 2022, like, kind of five months before the ChatGPT moment.
And we just had, like, basically three of us in the early days just building a little prototype. And it was this idea that's part of the Labs culture actually at Google, which is to, like, co-create with people who are not necessarily technologists. So, like, if you're going to make a tool for thinking and writing and research, like, have a writer in the room at the beginning.
So, it's a slightly different way to develop the software. And, yeah, we've just been kind of, like, hacking away at it ever since then. I ended up, like, after about a year, I was like, okay, I'm thinking about this 120 percent of the time, so, like, I might as well go full time, and now I'm a full-time employee here.
That's a strange twist of fate, but it's been real fun.
[00:31:03] Ben Kornell: NotebookLM has started to have a life of its own. You know, online, people are likening it to the Gutenberg printing press as, like, a moment in time. That might be a little overstatement, no offense, but, you know, when did you know that it was taking off, and that this wasn't just a fun beta, but actually something that was providing real value to millions of users' lives?
[00:31:24] Steven Johnson: Yeah, it was really like, that is exactly the kind of evolution of my thinking, which is like, when I first got here, I was like, could we make a prototype that might influence Google, like, internally and also be, like, fun for me to use? You know, that was kind of it, and we did that very quickly.
And then at some point last year, I think we were all like, could we make something that, you know, millions of people would use and find, like, helpful, and, and help them have better ideas or learn faster and things like that. And, and that really started to happen. We went international; it's, you know, it's native in 100 languages.
And so you can have a conversation in, you know, Spanish about documents that are written in Japanese. And, you know, so we, you know, we did two things in kind of June: we, we moved on to the new Gemini Pro model, and we rolled out these inline citations. So we should say that, like, you know, Notebook is all about: you upload your own documents, whatever material you need to do your work.
And the AI then effectively becomes an expert in the material that you've uploaded. And every answer is grounded in that source material. And one of the features we have that is really state of the art is these inline citations, so that every answer has little footnotes, and you can click directly on the footnote and go directly to the original passage and read it.
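To make the grounding-and-citations idea concrete, here is a minimal, hedged sketch of that pattern: the model is asked to answer only from numbered, user-supplied passages, and each bracketed footnote can be resolved back to the original text. The call_model stub and its canned answer are placeholders for illustration, not NotebookLM's actual implementation.

```python
# Sketch: source-grounded answering with resolvable inline citations (illustrative only).
import re

def build_grounded_prompt(question: str, sources: list[str]) -> str:
    """Number the uploaded passages and instruct the model to cite them as [n]."""
    numbered = "\n".join(f"[{i}] {text}" for i, text in enumerate(sources, start=1))
    return ("Answer using ONLY the numbered sources below, and cite each claim like [1].\n\n"
            f"{numbered}\n\nQuestion: {question}")

def call_model(prompt: str) -> str:
    """Placeholder LLM call; returns a canned, citation-bearing answer for the demo."""
    return "The pilot ran in 2019 [2] and later expanded to three districts [1]."

def resolve_citations(answer: str, sources: list[str]) -> dict[int, str]:
    """Map each [n] footnote back to the original passage, like clicking a citation."""
    cited = {int(n) for n in re.findall(r"\[(\d+)\]", answer)}
    return {n: sources[n - 1] for n in sorted(cited) if 1 <= n <= len(sources)}

sources = [
    "The program expanded to three districts after the initial rollout.",
    "A pilot study was conducted in 2019 with two partner schools.",
]
answer = call_model(build_grounded_prompt("When did the pilot start?", sources))
print(answer)
print(resolve_citations(answer, sources))
```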
So you're always, like, it's a deeper way of exploring the material, rather than just, I don't know where the model got that information, it sounds plausible. In June, we had kind of rolled out all those features, and that was the point at which we were like, oh, it's really working. We started to see it go viral.
It went viral in Japan first. Like, Japan was, for a brief period of time, our biggest market, which was crazy. And we were just hearing how lots of different kinds of people were figuring out how to use it: you know, there were corporate uses, obviously student and, you know, educator uses, role-playing game enthusiasts were, like, putting their Dungeons and Dragons campaigns in there and using it that way.
So you could see, like, it was starting to resonate. But the problem with it was that the best way to really appreciate how useful it was, was to load a complicated set of documents, ask a very nuanced question, get a very sophisticated answer, click on the citations, all that stuff, which is very powerful when you do it.
It is not something that plays very well on TikTok, right? Like, it's not a viral thing. And so over the summer, we started developing this audio overviews feature that, instead of answering questions about your sources, will turn them into an engaging 10-minute, podcast-style conversation. Yeah, we
[00:33:52] Ben Kornell: call it the ed tech insiders killer. Thank you. Thank you for that.
[00:33:58] Steven Johnson: We can get into that. We can get into that. So we started testing it internally inside of Google and people were just like, what the hell, you know? And so this summer, as we were developing it, we were like, okay, we're pretty confident it was going to be a hit. I think we did not anticipate that it would become quite the global phenomenon that it was, that it became, but that was, yeah, I think when that started, when that dropped in early September and we were suddenly like being talked about in like late night talk show, you know, we're just like in the zeitgeist in this way.
Um, and I think, you know, there were a couple of things happening with, with the audio overviews. It was interesting. It's an interesting innovation story, actually, because like, I'm like a big believer in the jobs to be done kind of philosophy, like figure out what the user unmet needs are and build around that.
We do a lot of that with Notebook, but you could have interviewed, like, a thousand people, students or whoever, and asked them, like, what do you need? And no, no one would have said, like, I need a simulated podcast about my material. Like, it just wouldn't have come up. And so it turns out to be one of those places where, like, the technology actually exposed a kind of new possibility that ended up being really useful and magical in this way.
I think part of it is obviously that the underlying audio tech is really good, like the voices, the intonation of voices, all the subtle things that it does with the voices, like what I'm doing right now. And I feel like that, you
know, basically no computer in the world could do until these models came along, the voice models came along.
But the other thing, I think, is what people were experiencing: most normal people, mainstream consumers, had not actually experienced an AI that was grounded in their information. And so it was actually a combination, I think, of the audio tech and then this idea of, like, I gave it my journals and it generated this very sophisticated conversation about, like, me.
And so that was, I think, a big kind of eye opener for people as well. And so now like, so now we do have millions of users. And now our like ambition is like, you know, we really think that notebook is a genuinely like kind of new kind of platform for interacting with, with AI and, and with ideas and, and maybe hopefully a way of, it could become a, a new kind of marketplace for it.
Like what, you know, what happens if you sell kind of compilations of knowledge that can be explored or transformed in various ways inside a notebook. Yeah. So our biggest problem right now is like, we just have too many things that we want to do.
[00:36:30] Sarah Morin: So if you could introduce yourself and share one thing that you've learned from the morning so far at Google that is standing out in your mind, or is a really key
[00:36:38] Kristin Fracchia: takeaway from the event so far?
Yeah. So hi, I'm Kristin Fracchia. I've worked in education and ed tech for almost 20 years, starting as a teacher myself, and I now have a five year old in the school system. So I'm so personally invested in this, but I've been working to help ed tech companies grow and am super into how AI is going to change the future.
And I will say that one thing that's been interesting to me from the Google event so far is how many times it was reiterated on stage with the panel of Google leaders that we don't know what's going to happen in two years or five years. And as a planner, I find that kind of frightening, honestly, because, you know, not being able to know what to expect, even from the experts on the AI, not knowing where this is going makes things really difficult.
I mean, it's a paradigm that I'm not used to dealing with, being able to plan, you know, what should we do in education over the next couple of years, and to really not know where this is going. I think that that was really mind-blowing to me. And I think it really shows how we need to shift to just assume that, you know, AI
is going to be here, but we don't know exactly what it's gonna look like. And we really need to focus on how we help our students really kind of adapt to a world of unpredictability that we live in right now. That feels very new to me.
[00:37:43] Sarah Morin: Yeah, thanks for saying that. And on a higher level, is that something that you're thinking about in your work on the day to day that your team is navigating?
How do you feel like that point kind of ties into your, yeah, your day to day work now?
[00:37:54] Kristin Fracchia: And on that really practical level? Yeah, so one thing I do in helping to, you know, advise some startups working in this space: I think that we talk very high level about AI, right? AI being everywhere. This is the AI era.
But a lot of people and, you know, students, kids, they don't actually know what that means. And so one thing that I was taking away today is some of the tools for just really understanding what generative AI is. How does the back end work? You know, what is this tool actually doing and how can I use it? That is so important.
And that's something really practical we could focus on. Because as there are developments in the AI, students and teachers will understand, you know, um, where it came from, right? Like, where did it start? What are these developments? How can we utilize them better? And so if you're like me and kind of having that anxiety of, like, what does it mean to live in the AI age,
the best thing I think you can do is utilize some of these really great tools to understand exactly what generative AI is, how it works, how large language models work, right? Because we're here to, you know, we're here to be educators and to help this generation, you know, learn how to utilize things and tell fact from fiction.
And that's so true. You know, you can apply that analogy to LLMs right now. Why does it hallucinate, right? Like, really understanding the back end. And so, you know, really just understanding kind of basic AI literacy is something that, you know, I think all of us can, can do. And, you know, we should be encouraging, you know, how do we get the best resources into our classrooms, in front of students, on really understanding what this means, so that they can figure out this uncertain future for us.
Yes. Thanks so much, Kristin.
[00:39:14] Alex Sarlin: Our next conversation is with Yossi Matias, the head of Google Research and a vice president at Google. Under Yossi's leadership, world-class global teams are leading breakthrough research on foundational machine learning and algorithms, computing systems, and quantum science; AI for societal impact in health, climate, sustainability, and the environment,
and education; and the advancement of generative AI, driving real-world impact and shaping the future of technology. Yossi was previously on the Google Search leadership for over a decade, driving strategic features and technologies, and pioneered conversational AI innovations to help transform the phone experience and help remove barriers of modality and languages.
He was also the founding lead of the Google Center in Israel and supported other global sites. He founded and spearheaded initiatives such as Google's AI for Social Good, Crisis Response, the Google for Startups Accelerator, cultural and social initiatives, and programs fostering startups, sustainability, STEM, and AI literacy for youth.
Prior to Google, Yossi was on the computer science faculty of Tel Aviv University. He was a visiting professor at Stanford and a research scientist at Bell Labs. A prolific computer scientist with publications in diverse fields, Yossi is a recipient of the Gödel Prize and the ACM Kanellakis Theory and Practice Award, and is an ACM Fellow.
He's a world renowned expert in artificial intelligence and has a track record of impact driven breakthrough research innovation, advancing society centered AI to help address global challenges with impactful and transformative technologies. He's committed to advancing research and AI to help improve lives, transform society, and create a better future for all.
Before we get into the AI and learning, I think, you know, you do research across many fields, and I think the health example is one that feels really intriguing. So Google Gemini is a general purpose model, but you've done, and Google has lots of different models, but you've fine tuned models for particular use cases, including health care.
Tell us about that story, and then, and then maybe we can segue into how some of that same thinking can work in education.
[00:41:23] Yossi Matias: So, of course, what we've seen in the past few years with large language models made possible by transformer and other techniques is the ability to build these foundational models that then we can use for all sorts of applications or use cases where to understand the language and to be able to generate content is a critical aspect.
And again, this is pretty exciting, because I've been working in search as part of the Google Search leadership for over a decade, and at some point realized that conversational experience is going to be so critical for the future of how we access information. And with large language models, suddenly this is accelerating.
Now, as we started looking into those language models, we started asking ourselves: if we want to actually use them within various domains (obviously, they were already making a lot of progress), the question was, can we do better? Now, in the health space, there's actually a benchmark, which is essentially questions in the style of the U.S. medical exams.
For years, there was steady progress, but no one actually managed to get to a passing score using AI. And we had a team that looked, in a concerted effort, into: can we fine-tune a language model? Fine-tuning means essentially taking the language model, adding to it more training data and additional capabilities, so that it actually learns to operate better for various tasks.
Can we do that for particular use cases of this medical domain? And, uh, what we could show is that this fine-tuned model that we called Med-PaLM was actually, for the first time, passing the U.S. medical exam-style questions, which got a lot of excitement, of course. But talking about the pace: by the time we actually had this work published in Nature, we already had a better model that passed those
medical exam-style questions at an expert level, 85 percent. And soon after, we had Med-Gemini, essentially taking a Gemini model and fine-tuning it for healthcare information, which passed at 91 percent, adding to that also all sorts of capabilities to be able to answer questions which had images within them.
So you have it multimodal: think about putting in a CT image and then getting an explanation of what it is, which is pretty phenomenal. But interestingly enough, by the time we published the paper, we already had players in the healthcare system starting to try it out and see how they could use it for their various healthcare applications, in a way that could actually be beneficial
to people at a pretty large scale. So this is an example of not only the opportunity to take and adjust language models, but also to have impact at scale. And again, this is still kind of work in progress in many areas, but the opportunities are quite clear. In fact, this was actually an inspiration for us to look into the question: well, can we actually do something similar for education and learning?
When you think about learning and education, obviously, if you just take any of the amazing language models, if you take Gemini and other models, you can already experience various activities where you can say, oh, I could see how that can help with learning: I can actually ask questions, I can get a digest of a document.
And obviously, you know, we just demonstrated, or made it possible, for people to learn about, for example, papers using the likes of NotebookLM and Illuminate, which many people are excited about, and rightfully so. So, obviously, we already have these capabilities. But when you think about the more comprehensive opportunity with learning and education,
and you ask yourself, what are these capabilities that we'd like to have, and can we get the models to actually build more of these capabilities into them? Then we asked ourselves, can we actually build this kind of fine-tuned model for learning? And that's how LearnLM came to be. And it turns out that LearnLM indeed could have these additional capabilities. For example, one way to learn material is to have some quizzes and to engage. Or, for example, when you ask a question in a learning setting, sometimes you don't want to give the full answer; perhaps you'd like to give a
partial answer and then encourage them to look into the next one. Now, the way to get there is actually to work with the experts, with the educators. And the way to work with the experts in the community is both in order to set up the goals, to set up the capabilities that you'd like to get to, but also, importantly, to set up how you evaluate these models, because in many cases, the way you build these models is by setting up these objective functions:
what do I want to solve for, how do I actually measure progress and success, much in the same way that we do in any educational system? And in order to do that properly, that's where we actually need to work with the ecosystem. And once we do that, suddenly we have these models that we could measure against other models and base models and see, oh, we can actually get a lot of advantage.
By doing that, we can suddenly unlock new capabilities. We can now provide more opportunities. And now, from this point, you can start dreaming about how to actually put this to use: how to make learning and education more equitable, how to help out teachers, educators, tutors, parents, how it can actually fit into systems.
And importantly, where is it going to lead us in the years ahead? I mean, because we're pretty nascent in this technology. And once we start seeing that the opportunities are there, then the question is, well, where are we going to be in the future, and how can we drive there in a way that would have a really positive impact on, you know, on, on society?
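As a rough illustration of the pattern Yossi describes (start from a general model, add domain training examples, and track progress against a fixed, agreed-upon benchmark), here is a minimal sketch. The exam items, the base_model and fine_tune stubs, and the scores are all invented for illustration; this is not Google's actual Med-PaLM or LearnLM pipeline.

```python
# Sketch: domain fine-tuning as "base model + extra training data + an agreed benchmark"
# (illustrative stubs only; no real training happens here).
from dataclasses import dataclass

@dataclass
class Example:
    question: str
    answer: str  # the behavior we want the tuned model to reproduce

TRAIN = [Example("Which vitamin deficiency causes scurvy? (A) C (B) D", "A")]
BENCHMARK = [Example("Which organ produces insulin? (A) Liver (B) Pancreas", "B")]

def base_model(question: str) -> str:
    """Stand-in for a general-purpose model's answer before domain tuning."""
    return "A"

def fine_tune(model, examples: list[Example]):
    """Placeholder: a real pipeline would continue training the model on these pairs."""
    def tuned(question: str) -> str:
        return "B"  # pretend the tuned model now handles domain items correctly
    return tuned

def accuracy(model, benchmark: list[Example]) -> float:
    """The agreed objective: the score everyone uses to judge whether tuning helped."""
    return sum(model(ex.question) == ex.answer for ex in benchmark) / len(benchmark)

tuned = fine_tune(base_model, TRAIN)
print("base accuracy:", accuracy(base_model, BENCHMARK))
print("tuned accuracy:", accuracy(tuned, BENCHMARK))
```

The design point mirrors what Yossi emphasizes: the benchmark is defined with domain experts first, and the model work is judged against it, rather than the other way around.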
[00:47:04] Alex Sarlin: Yeah. In this magic cycle, it's sort of, the distance between research and application has, has shrunk. You can really take something, prove it out, and then actually, you know, use it in the real world and test the results
[00:47:16] Yossi Matias: of that. It's a great question. And so, one of the things, you know, my point of view on research is that you don't have this huge separation between the research and the application.
Of course, you have the foundational research, you have the applied research, you have the deployment. And traditionally, some research labs in the world sometimes would say, oh, we're just going to do the research and then we're going to pass it on to somebody else to do that. That's not how we do that. So, you know, I think that part of making this magic cycle actually work best is quite often to have it in a much more ongoing way.
Now, it's true that initially we start with the very basic questions, where we try to make progress on the research side using the scientific method, publish papers, uh, you know, in top conferences, whether it's Nature or Science or in the other venues, ICML and such. So that's really important, of course, to make tangible progress.
But in order then to take those ideas and start applying them, quite often these are the same teams that do that, and then, of course, eventually work with product teams in order to deploy them. So I think that the nature of Google Research is that we have a much closer collaboration between developing the foundations and actually taking it and applying it, which is especially important in times where there's such
rapid progress. You're not kind of finding a formula and then handing it over to somebody to just implement, right? It's all the time getting the know-how. And plus, when you're doing the theory, of course, you're trying to ask yourself more isolated questions sometimes, in order to be able to have tangible kinds of answers.
And once you want to apply it, then you need, of course, to, to see, to learn how to have the biggest impact for whatever you'd like to do. So I think this part of the magic cycle is indeed to find the right flow, which is ongoing, and especially at a high pace, this is going on. So we have teams that are at the same time looking into how to deploy research results and breakthroughs that have been developed in previous months or sometimes years, and how to get them into,
you know, impacting the experiences or the capabilities of people and users, and at the same time also looking into the questions of research, of the next, the
next questions that we'd like to do. And this applies to every area that we're doing.
[00:49:41] Alex Sarlin: In our next conversation, we had the great privilege to speak to Obum Ekeke, who's the head of education partnerships for Google DeepMind. Obum is a leader in educational innovation with over 20 years of experience. In 2022, Her Majesty the Queen appointed him an Officer of the Order of the British Empire (OBE) for his work in computing and AI education and his efforts to champion diversity in tech. Obum currently leads education efforts at Google DeepMind, where he focuses on making AI knowledge more accessible, diversifying the AI workforce, and creating impactful learning experiences for everyone.
His educational programs have helped millions of students worldwide develop the essential skills needed to thrive in a rapidly changing world. Previously, he co-founded Google's Educator Groups, now active in over 60 countries. In 2019, the Financial Times recognized him as one of the UK's top 10 most influential ethnic minority leaders in tech.
Obum also serves on the Governing Council of the University of Essex and is a trustee of UK Youth.
[00:50:51] Obum Ekeke: So the way we view our work in education is that ultimately the goal is: how do we make sure that this amazing technology, AI, works for everybody?
[00:51:00] Alex Sarlin: Yeah.
[00:51:00] Obum Ekeke: Whether you're in Latin America or in Africa or here in the United States or wherever you are, whatever communities and cultures you come from, how do we make sure that AI is not just a tool we are building, but a tool that is built with the ecosystem, for people everywhere?
So for us in education, we took a long-term, end-to-end approach to it. First is the very early years, which is where Experience AI comes in: how do we make sure that we are empowering teachers and learners with that foundational knowledge?
Acknowledging that the goal there is not for them to necessarily go to university and study AI, but for them to know it and not fear it, right? To be able to play a role in contributing to building AI, or just applying it or using it. And it could also be that we spark the curiosity of a certain percentage of those learners today, enough for them to say, hey, when I go to university, I'm going to study computer science or AI, or some of those courses that eventually lead to AI.
So that's the early years. Experience AI was developed for ages 11 and above. And then at the university level, we said, how can we actually support those who are already on their journey to AI, whether they're studying STEM or more focused computer science, maths, or engineering courses?
How can we support undergraduate students, and those who want to go on to study at the postgraduate level, masters or PhD? What role can we play in supporting them? So we partner with universities all over the world to fund scholarships and provide mentorship, matching those scholars with Google DeepMind employees as their mentors.
And we really focus on students from underrepresented groups who may not ordinarily have access to studying quality AI at the postgraduate level. So that's the middle ground: taking people from undergraduate programs, from underrepresented groups, and empowering them to go on and do a master's or PhD. And then at the top of the pyramid: how do we support you if you have a PhD?
How do we support you to go on to a postdoc and really become a leader in AI? And leadership in AI could mean different things, right? It could be that you stay in academia and become a professor and help train the next generation of researchers and engineers, or you go work in industry, or you go run your own startup. But at the end of the day,
the goal is really: how do we diversify the ecosystem and make the broader ecosystem of people building and developing these tools, and practitioners using these tools, much more representative of society, and get their voices and perspectives to contribute to building an AI ecosystem that will work for everyone?
[00:53:51] Alex Sarlin: It's an amazing vision. I remember reading about the computer science gender gap, which has existed for a long time, and there was some interesting research that said women were more likely to study computer science if the field moved away from the image of the lone coder, just trying to make something happen by themselves and make money, toward a more socially minded framing: oh, coding can change the world, it can change health, it can change all these things.
And I feel like AI is a field where that's even more true. There are going to be social goods and social bads coming out of AI leadership, and ethical considerations really matter. I'm curious whether you think that's going to accelerate the trajectory of underrepresented groups going into AI, compared with how slowly,
unfortunately, they went into computer science over the last couple of decades.
[00:54:40] Obum Ekeke: Yeah, I believe so. And it's not just me believing so; we also heard that from across the ecosystem when we were building the Experience AI program for young people. We kept asking these educators: you have so many resources out there, what's going to be different about this one, right?
And the example they kept giving us at the time, which was like two or three years ago, was that as Google DeepMind, you have something that is more relatable, or will be more relatable, to your point, to different audiences. We had just launched AlphaFold, this great protein folding model.
Yeah. So if you bring that into teaching people about AI, it will be more relatable to girls in the classroom and to other diverse communities. And there are so many examples around the role that AI is playing in climate, sustainability, and education. These are more relatable to diverse communities than coding typically is, which at the time, and again, there are different ways of teaching coding now, but at the time it sounded more abstract, you know, and it was all based on games and all that.
So I think we leveraged that opportunity to make it much more relatable. You can point to these use cases where people are addressing problems that people face on a day-to-day basis. I'm from Nigeria, and when I was growing up, we had all sorts of disease outbreaks around crops and so many other things in my community.
And that was one of the reasons I got into AI, because I could see, I had a very strong belief, that this technology could change and tackle some of the problems I had growing up, in healthcare, in agriculture, and so many other things. And that's the same message today. If you go back to my community, more people like me can relate to that, you know, and they will be really excited about taking up these technologies.
Again, the ultimate goal is to make quality AI education much more accessible and relatable to people everywhere. It doesn't necessarily mean that they go on and become scientists in the future, but they will play a critical role in one form or another, whether as practitioners using AI as a tool in their various disciplines, or as scientists and engineers at the forefront of groundbreaking AI research.
[00:57:07] Sarah Morin: Welcome to the EdTech Insiders podcast. You can introduce yourself and share a little bit about what you've learned at Google today and what your key takeaways are.
[00:57:15] Adele Smolansky: Yeah, thanks. So my name is Adele Smolansky, and I'm the CEO and founder of AI-Learners. At AI-Learners, we're really focused on making learning more accessible and engaging for students with disabilities.
We help students with math, literacy, and social skills, personalizing learning using artificial intelligence to align with students' interests, goals, and abilities. It's been a really interesting event so far, especially hearing people talk about the feedback loop when it comes to developing new learning tools for students and teachers.
Every day I work with students and teachers, really ensuring that end users are benefiting from what we're creating, and that we're not just trying to create something cool with AI but something that's truly effective for our audience. So I'm really glad that people at this event, both people at Google and everyone else in the audience, are actually thinking about that loop and engaging users throughout the entire experience.
[00:58:01] Sarah Morin: Yeah, that's amazing. And one of the points brought up today was the distinction between the many cool things we can do with AI and how we focus on where the true impact and the true learning value is. Do you want to share a little bit about how that comes up in your work and what you're thinking about there?
[00:58:17] Adele Smolansky: Yeah, so, I mean, my company's name is AI-Learners, so everyone always thinks, oh yeah, we're doing something with AI, and that's a big question we get all the time. Something that I really like to emphasize is that AI is just something that enhances our product; it's not the core focus of our product.
Our core focus is really creating something that can save teachers time, can help them become more efficient, and ultimately lets them focus on the learners. And for students, it's having something that's engaging, fun, and accessible for them. That technical accessibility is something AI can really enhance, but it's the enhancement; the core focus is how we can create an experience that's effective for kids and that has them really enjoying learning.
And, as many people here were talking about, there's that curiosity. So we're helping students really build that curiosity about learning. AI comes on top of it, but we're not starting with it; we're starting with engagement and the learning.
[00:59:03] Alex Sarlin: Amazing. Thanks so much. In this next conversation, we are talking to Jonathan Katzman, the director of product management for YouTube Learning.
At YouTube, Jonathan leads all aspects of YouTube's work on learning initiatives across school, work, and life. Prior to joining Google, Jonathan was the founding chief product officer of Minerva Project. Minerva became both a fully accredited university, offering undergraduate and master's degrees, and an edtech company; Minerva Project worked with existing and new universities around the world to bring new programs to life based on Minerva's technology and curriculum expertise.
Prior to Minerva, Jonathan was at a mix of successful startups and large tech companies.
[00:59:49] Jonathan Katzman: There were several things I found at YouTube early on. One: if you look at the content on YouTube from what we call ed-YouTubers, the educational creators,
the content they make, which naturally becomes the most recommended content because it's the most watched, is the best content out there, because it's really well done and those educators actually know how to produce compelling content.
They know how to break it up into bite-sized chunks. They know how to walk you through different types of mental models. They know how to use dual coding, all the things we talk about with teachers. They share their passion. And the fact of the matter is, you're never going to see a better lesson on any math topic, right,
than, say, 3Blue1Brown on YouTube.
[01:00:34] Alex Sarlin: Yeah.
[01:00:34] Jonathan Katzman: And that was already happening. And the line that Susan always liked to say, the one that really grabbed me, is: we have the biggest library in the world, we have the modern-day Library of Alexandria, but the books are all on the floor.
How are we going to pick them up and enable people to find the right one at the right level? And as you go through it, you know, we get to now, and now we have AI. So it's not just about finding the right book, the right video to watch, but also how to actually encourage that active learning.
[01:01:08] Alex Sarlin: And even though the books were on the floor, so to speak, you still had people coming to YouTube to learn in vast droves. What do you attribute that to, given that YouTube wasn't optimized for, you know, pedagogical learning styles?
[01:01:21] Jonathan Katzman: Yeah. So I think a lot of it was just this content that was already out there and was truly better.
And the fact of the matter is, if you were in high school or college and you didn't understand what was happening in class, YouTube really was the best and easiest resource to go out and learn that material. Yes. It amazes me; when I think back to high school and college, I'm like, oh yeah, this would have been really useful.
Yeah. And this was just before I got to YouTube: when the pandemic happened, we already obviously had a lot of educational content, but it really went on an exponential curve through the pandemic. Yeah. And that's when people really saw the power of what learning could be. And we did some very subtle things during the pandemic to really help with that, like slightly boosting some of that academic content and making sure that if you're watching a piece of academic content, the things you'll see in the "watch next" side panel would most likely be other academic content and not music videos.
So there were some subtle things done, but we hadn't really put together a whole theory of what learning on YouTube could be.
[01:02:22] Alex Sarlin: Yeah. What did it look like when the pandemic hit, schools emptied out, and people turned to online tools for learning, with YouTube as sort of the go-to online tool, especially for students but for teachers as well, and a place where teachers could publish for their own students?
Like, I'm curious what it was like at that moment being inside the YouTube learning world.
[01:02:42] Jonathan Katzman: Yeah. So for the record, I was not yet in the YouTube world; I joined literally at the tail end of the pandemic, still in the work-from-home scenario, before the full pivot back. But from what I've seen and heard, one thing that's actually interesting is that even on the teacher side, there were a lot of teachers teaching teachers how to teach online, right?
So I think you saw, even there, that there was such a crying need for this content online, as people needed alternatives, or something different from whatever online learning they were getting during the pandemic.
[01:03:16] Alex Sarlin: Yeah, just to flash back to the YouTube creators, I'm a huge fan of many of the big YouTube creators, and I've actually shown YouTube videos to professors for all the reasons you said.
At one point I had a job at Coursera training new online professors to teach online, and I would show them YouTube videos because I'd say, look, these people have figured out how to get ideas across in a way that is incredibly engaging: the passion, the quick cuts, the mental models, the metaphors.
And I'm curious: when you know you have a set of educators, or creators who may or may not be formal educators in any way, who are doing this kind of thing, how do you work with them to help more people understand how much you can do with video in education?
[01:04:00] Jonathan Katzman: Yeah. So one of the questions Ben Gomes asks me all the time is, shouldn't we be teaching all these educational creators the science of learning principles?
But they sort of seem to have figured it out. I don't think we need to teach them; the good ones clearly have done the research and know what's great. So we have a partnership team. I think you guys have interviewed Katie Kirks previously. Yes. And her team really reaches out and works with these creators and helps let them know what areas we could use more material in.
We also try to find up-and-coming creators who want to get on the platform and help them succeed. And we in America often put on a very American hat, but you have to remember, YouTube is very global, and it's not just about what's happening in America; it's how do we do this in multiple languages and other countries.
And the other thing that happens on YouTube, and this goes back a little bit to what you were saying about COVID too, is that in America, for most students, YouTube is a very nice-to-have thing and can be a dramatic part of their education, but in some other countries, YouTube literally is school.
So, you know, if you want to talk about a difference during COVID, especially for some of these other countries, like India, Argentina, Brazil, I've heard from many country managers that YouTube literally is school for 90, 95 percent of the population, and that is something amazing that we're able to provide, and it's a huge difference from what's happening here.
[01:05:28] Alex Sarlin: In our next conversation, we had the great privilege to speak to Lisa Gevelber, who is the founder of Grow with Google and the CMO for the Americas region at Google. Lisa founded and leads
Grow with Google, Google's $1 billion commitment to economic opportunity. Since 2017, Grow with Google has helped over 10 million Americans develop new skills to grow their careers or businesses.
One of her most significant contributions is the Google Career Certificates, which provide people access to in-demand, well-paying jobs regardless of educational background or work experience. Since 2021, these certificates have provided significant upward mobility to half a million job seekers globally.
In 2022, Lisa was named in the inaugural Forbes Future of Work 50, honoring leaders whose impact, reach, and creativity have the potential to affect millions of workers. Lisa also leads Google for Startups, which helps level the playing field for underrepresented founders across every corner of the world.
She has been the chief marketing officer for the Americas region at Google for the past 13 years and has over 30 years of experience in general management, marketing, and product management, including over 20 of those years in Silicon Valley. Her career spans from early-stage startups to Fortune 50 companies, including Intuit and Procter & Gamble.
We're here with Lisa Gevelber, founder of Grow with Google, which is Google's outreach to support workforce training. And, well, she can explain it better than I can. Welcome to the podcast. So for those who don't know about Grow with Google, as the founder, tell us a little bit about how it works.
[01:07:08] Lisa Gevelber: Yeah, absolutely.
So Grow with Google is Google's big economic opportunity initiative, and really it's based on our fundamental belief that the opportunities created by technology should truly be available to everyone. Our main program in Grow with Google is called the Google Career Certificates. This is a program that will teach people, regardless of work experience or educational background, all the skills they need for an entry-level job in one of several really high-paying, in-demand job fields: things like cybersecurity, data analytics,
IT support, project management, and even digital marketing. And we're really excited because this program helps people go either from low-wage work into a higher-paying job, or it helps career switchers. You know, there are not that many programs out there where, within a short period of time, you can completely enter a brand-new field with a high-quality, industry-recognized credential.
So obviously the Google Career Certificate program comes with a certificate from Google when you graduate from the program, and it's highly recognized by employers of all types.
[01:08:14] Ben Kornell: Yeah. I'm curious: AI is content in the program, but is it also part of the process of the program? And how have you rethought the Grow with Google program over the last couple of years?
[01:08:26] Lisa Gevelber: Yeah, you know, we've been doing a lot on AI actually. So we were just talking about the Google Career Certificates, so maybe I'll start there. We've built modules into every one of those certificates that talk about how you would use AI in that job field. So if you're a data analyst, how does AI play a role in being a data analyst?
If you're an IT support technician, how does it play a role in that job? So it's really quite specific to the job field: applied AI, if you will. That's really important, and we see interest even from people who've been in the field for a long time in learning how to apply AI. So that's one example. The other is helping people learn basic AI skills to make AI really work for them.
The thing about AI is that it can be very useful, but really only if you know how to use it. And so what we're hearing from a lot of folks is that they're really eager to learn to prompt better. A few weeks ago, we announced our brand-new course just on prompting. It's a short course called Google Prompting Essentials.
It's available online on the Coursera platform, and it teaches you how to be smart about prompting, because we all know that the better your inputs are, the better the output you get from these models. So we're really excited about Google Prompting Essentials. It's the second course in our series; the first one was called Google AI Essentials, and that's everything about what AI is, how it works, how you have to be thoughtful about it, and how you really use these tools.
What we know to be true is that people who use AI tools get better results. They can do things like get things done faster, and don't we all want to get things done faster? So we're really excited about putting these tools in everybody's hands.
[01:10:02] Alex Sarlin: Yeah. And, you know, I think you've reached over a million learners through these career certificates. Is that right?
[01:10:06] Lisa Gevelber: We have reached a huge volume of people with the career certificates, and we're so, so excited about the economic opportunity and mobility that that is delivering for people all over the world.
[01:10:17] Alex Sarlin: Yeah. One thing that I find really fascinating about the Grow with Google platform, or the Grow with Google concept, is that you create the content, but you also provide sort of wraparound services in various contexts too. So there are some people who see Prompting Essentials and think, oh, I should take that, I'm going to take that, and I can do it completely on my own.
And there are others who might not have heard about it, or might not have the supports to make it through or to know how to use it in their career. And you actually cover both use cases. Do you want to talk a little bit about those kinds of supports?
[01:10:45] Lisa Gevelber: Yeah. I'm so glad you brought that up actually.
So we do a whole bunch of things not just to give you the skills to do well in the job, but also to help you actually get a job. And I think that is actually one of the biggest differentiators. So in the Google Career Certificate program, we have an entire employer consortium: thousands of employers that hire our graduates, including over 150 big national employers.
So the first thing is that we have a lot of employers who have jobs waiting for you when you finish. We even have our own job board. But we do a bunch of other things to help you get a job too. We provide job-specific templates, so if you took the career certificate in IT support or in cybersecurity, we help you understand what a good resume looks like in that job field.
We also have this very cool tool called Interview Warmup that helps you practice interviewing, because we know that doesn't always come naturally to people, and practice does make a difference. And then we do a whole bunch of things within the certificate where we talk about real job scenarios that you get to practice.
We also help you create assets that you can take with you to an interview. So if you're doing the UX design certificate, you create a website and a web app, and you can bring a portfolio with you to a job interview. The same goes for data analytics: you do a project as part of the course where you do a full data analysis and create data visualizations for it using publicly available data sources.
And then you have that as proof of what you know how to do. You can bring it to an interview and talk through the analysis and your visualization. So we really do try to provide a whole suite of support for people, not just for learning the skills to be successful at the job, but also for helping people get the job.
And I think that is a real differentiator. And then, of course, employers know the Google Career Certificates. We've been around for a while now, and they've hired lots of folks who have them, so they know the quality and rigor. And that really helps as well.
[01:12:34] Sarah Morin: All right, if you could introduce yourself and share one thing from the morning at Google that you thought was interesting.
My name is Ishan Gupta. I am Vice President of Customer Success at Paper, which is a K-12 focused online tutoring services company. This has been a fun event. I think, you know, it's still just a half day, and my big takeaway has been the number of initiatives that Google has at the intersection of AI and learning, whether it is NotebookLM, whether it is Learn About or YouTube, and I'm just excited to go home and try some of these features after today.
[01:13:06] Sarah Morin: Yeah. Which of these features do you think are most interesting, whether it's for your work at Paper or just your personal life, any of the above? Which of the tools are most interesting to you?
I've been in edtech for a long time, and I remember a conversation at Udacity in 2018, when I was managing all student operations, with the CEO saying, I wish there was an online tutor who could supplement all our mentors and take tier-one tutoring away.
And today it's almost the opposite: like, I wish there were humans involved in tutoring, right? That's where we are heading. So to me, it is exciting to see how AI opens up infinite possibilities around the intersection of human-based tutoring and online tutoring and mentoring models. Personally, I want to go home and dive into Learn About, which is something on Google Labs, and really see how I can learn some concepts that I've been wanting to learn for some months now.
[01:13:56] Alex Sarlin: Amazing. Thank you so much. Thank you. Thanks for listening to this episode of EdTech Insiders. If you liked the podcast, remember to rate it and share it with others in the edtech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.