Edtech Insiders

Securly’s Vision for Safer, Smarter School Technology with CEO, Tammy Wincup

Alex Sarlin Season 10

Tammy Wincup has served for over 25 years as a business executive at the intersection of technology and education. Tammy is currently the Chief Executive Officer of Securly, the leading digital safety and wellness company serving over 22,000 K-12 schools globally to keep students safe, secure, and ready to learn. 

Before Securly, Tammy was a partner at Rethink Education, an impact venture fund investing in global education technology. For almost a decade, Tammy was the Chief Operating Officer (COO) at EVERFI, a leading education technology company. She also served as the President of Revolution Foods and the Founding President of Protocol, a media company covering the intersection of technology and policy.

💡 5 Things You’ll Learn in This Episode:

  1. Why Securly sees digital safety as a shared responsibility between schools, families, and communities
  2. How schools can implement AI without compromising student privacy
  3. What real-time data from 40% of U.S. schools reveals about device use, social media, and AI trends
  4. Why human-in-the-loop systems matter for student mental health monitoring
  5. The emerging role of CTOs as strategy leaders in K-12 districts

✨ Episode Highlights:

[00:02:29] Tammy Wincup explains Securly’s mission to protect students across digital environments
[00:04:42] How classroom management and device use connect to digital wellness
[00:09:20] Phone bans lead to 30% spikes in school device usage
[00:14:16] What Securly learns from 40% of U.S. K-12 schools
[00:16:08] Why Tammy urges districts: “Don’t block AI — manage it transparently”
[00:24:31] Using anonymous trends to flag real student wellness risks
[00:34:03] How AI puts school tech leaders back in strategic roles
[00:46:20] Tammy’s go-to sources for spotting big edtech shifts 

😎 Stay updated with Edtech Insiders! 

🎉 Presenting Sponsor/s:

This season of Edtech Insiders is brought to you by Starbridge. Every year, K-12 districts and higher ed institutions spend over half a trillion dollars—but most sales teams miss the signals. Starbridge tracks early signs like board minutes, budget drafts, and strategic plans, then helps you turn them into personalized outreach—fast. Win the deal before it hits the RFP stage. That’s how top edtech teams stay ahead.

This season of Edtech Insiders is once again brought to you by Tuck Advisors, the M&A firm for EdTech companies. Run by serial entrepreneurs with over 25 years of experience founding, investing in, and selling companies, Tuck believes you deserve M&A advisors who work as hard as you do.

[00:00:00] Tammy Wincup: How do we support these students in their academic learning? And our job is to follow the policies of the district and the state. Our job is to enable whatever policies they and their school boards and their elected officials have come up with to support those. But always putting the student first and having the student voice there.

And I think that that is a shift that is really important, and that everybody should be holding the entire industry up to.

[00:00:30] Alex Sarlin: Welcome to EdTech Insiders, the top podcast covering the education technology industry, from funding rounds to impact to AI developments, across early childhood, K-12, higher ed, and work. You'll find it all here at EdTech Insiders. Remember to subscribe to the pod, check out our newsletter, and also our event calendar.

And to go deeper, check out EdTech Insiders Plus where you can get premium content access to our WhatsApp channel, early access to events and back channel insights from Alex and Ben. Hope you enjoyed today's pod.

We have a very special guest for this episode of EdTech Insiders, and we're talking to Tammy Wincup. She is a seasoned and really amazing edtech veteran from a number of different amazing companies. She's currently the chief executive officer of Securly, the leading digital safety and wellness company.

It serves over 22,000 K-12 schools globally to keep students safe, secure, and ready to learn. Before Securly, Tammy was a partner at Rethink Education, an impact venture fund investing in global education technology. For almost a decade, she was the Chief Operating Officer at EverFi, a leading education technology company.

We all know EverFi. She also served as the president of Revolution Foods and the founding president of Protocol, a media company covering the intersection of technology and policy. And Revolution Foods is also an education company, actually. So Tammy Wincup, welcome to EdTech Insiders.

[00:02:04] Tammy Wincup: Thank you, Alex. It's good to see you.

I'm glad to be here. 

[00:02:07] Alex Sarlin: It's great to see you. So we had a chance to sit down and chat at ASU+GSV. You're just doing incredibly interesting work in a space that I think the EdTech world knows is incredibly vital and important. It's central to what is happening in schools, but I don't feel like we talk about it enough on the podcast.

So before we get into anything else, tell us about what you're doing at Securly and what it is. It's a huge set of different tools. 

[00:02:29] Tammy Wincup: Yeah. We are focused at Securly on student safety, security, and wellness, specifically around their digital footprint. The company has been around for 14 years and was founded by great founders who really looked at cloud-based technology and said, how do we use that to really understand and help schools keep kids safe?

What's evolved tremendously since you and I probably came into edtech is schools' responsibility for this topic.

[00:02:59] Alex Sarlin: Yes. 

[00:02:59] Tammy Wincup: I remember you and I chatting when I started in ed tech. I won't even tell you how long ago, but certainly in the early 2010s, I remember creating at EverFi one of the very first digital citizenship classes.

Right. And this was at a time when there was no one-to-one, when I'd go visit K-12 schools and the laptops were stored away under lock and key.

[00:03:24] Alex Sarlin: I remember 

[00:03:24] Tammy Wincup: going to one school and there were laptops being kept in an old pool because they just didn't know what to do with

[00:03:30] Alex Sarlin: them. 

[00:03:31] Tammy Wincup: And really, if you fast forward from that, we are really, really focused with K-12 schools globally on how we look at all the vectors that a school is using from a technology perspective.

And yes, filter them, but also, as importantly, look at how students are using digital resources, and arm the district with an ability to see how to teach better with that and keep them safe. I joined in the last year really focused on this: I feel like I've spent 20-plus years putting digital learning into kids' hands, and I feel a huge amount of responsibility now in this era to ensure that we're doing it safely.

[00:04:10] Alex Sarlin: Yep. Logistically, Securly offers things like device management and web filtering, right? But it also, as you mentioned, has wellness monitoring, ways to actually make sure that students are behaviorally and mentally happy and safe in the classroom. There are daily safety and wellness insights, edtech usage and budget analysis, classroom management. It's really across the board, all these different areas of safety, wellness, and digital wellness in the classroom, which is interesting, how these all come together into a single platform.

[00:04:42] Tammy Wincup: Yeah. It's really evolved, right? I go back to the responsibility of a K-12 school district. At no time when we were probably in school was it the responsibility of a K-12 administrator to really understand a student's wellness, not just wellness but digital wellness, in terms of what was going on. And now, I think fortunately and unfortunately, with the positive and the good of technology also comes risk. And I think our job is to make school administrators' jobs easier and to help them prioritize.

So for example, certainly on the filter side of the house, I think we have really innovated around how to think about that, and we can talk more about it. But to your question on safety and wellness, I think that we look very carefully at how to hold student privacy as number one and student safety as number one.

You can have both at the same time. I'm not sure that everyone in the industry believes that or does that well, but I think we really, really focus on how you do both of those at the same time. But a school district, big or small, has a lot of students to understand what is happening with. And this has to be a partnership with community and parents and caregivers.

And the reality is we have that data, we have that information, and we need to look at the trends in it in a responsible way. And that's really where we are focused: how do we look at this rolling average of wellness, and how do we identify for teachers, for counselors, for school resource officers, things in the digital footprint that might signal that this is a student we need to reach out to and help.

[00:06:30] Alex Sarlin: Right? Yeah. Let's talk about those signals and predictive analytics, but I just wanna double click on something you're saying, which is really sparking some ideas for me: this idea that it used to be that technology in schools was very controlled and contained, right? That laptop pool you're talking about, or the carts that would come into the classrooms and then go back out for sort of computer lab time.

And in the last decade or two, technology has become so woven into everybody's life, both in K-12 and certainly higher ed, and certainly the workforce. But there's still this need to balance our personal use of technology, which is free-flowing, Wild West, and the use of technology in formal environments or controlled environments, especially schools with kids under 13. And in the last few years there's been this huge pushback on social media, because there's all this evidence that it's really harming people's mental health.

So then what is the responsibility of schools when it comes to something like that, which is usually not the purview of schools at all? And we're seeing this huge pushback on phones in school at all. So I guess before we get into the predictive side, I'd love to just hear you talk about that tension between all of our personal uses of technology, which are ubiquitous, and the controlled use of technology in formal settings. There are all these policies in place, and we all feel them in the workforce, of course, with all the different logins and passwords and protections. And then of course, students now have their school devices and their personal devices. It's a weird feeling, and I think it's new for all of us, new for humanity.

[00:08:01] Tammy Wincup: It really is.

And it's merged, right? To my point, back when we created the first digital citizenship class, it was like, this is what you do with tech in school and that's it. Right. And I think because those worlds have merged, and the school device with them, this conversation around what happens with personal devices, cell phone bans, smartphone bans, et cetera, is really interesting.

Right. Because in some ways it actually creates this idea that they're separate: the idea that what I do on my school-issued device is very different than what I do on my personal device. And in general, I think there are some data to show that that is true. But what we have seen in the last year, as we've watched states create cell phone bans, is interesting.

We pulled data from a couple of the states that were early adopters of cell phone bans, which I understand and like and support. You know, our job is to support states and districts with whatever policies they have and give them the flexibility to understand what's happening. Not surprisingly, what we saw in those states, and specifically in some of the big districts, is that when the mobile phone or the smartphone is not in the building with the students, their school-issued device usage goes up by 30%.

[00:09:20] Alex Sarlin: Hmm, interesting. So 

[00:09:21] Tammy Wincup: the work that we are doing on those school-issued devices becomes that much more important.

[00:09:26] Alex Sarlin: Yeah. 

[00:09:27] Tammy Wincup: Because you have, during this set period of time, a desire by the student to either try and circumvent, right?

We spend a lot of time with fabulous students that are really, really smart that I wanna hire every day, who can find ways to circumvent everything, 

[00:09:44] Alex Sarlin: right? 

[00:09:45] Tammy Wincup: Or to actually really think about, okay, what does it mean now to have this be the only device that's in a building, potentially, with a student. It also makes me think about how we see more and more districts using what we include within our filter product, which is our Home app: the ability for parents and caregivers to actually see the digital footprint of their student on their school device. They can add other devices to it, but it allows the parent and the caregiver to actually understand what is happening.

Now, most districts correctly, I think, say, hey, look, you can see the Home app. We're responsible for the digital footprint when students are in our school, but when they're home, you should be able to see what the digital footprint of that school-issued device is as well. And so we are seeing an increase in the districts that are offering that as an option to parents and caregivers.

And I think that that is just an example of this merging, of us all recognizing that what we used to do in a physical space, students are now many times learning and adopting in this digital environment, and us having an understanding of what that looks like is really important to a student's academic health.

And also just maturity. We have that responsibility, and that wasn't there before. You know, when you were a teacher, it used to be: is everybody on page 42 in the textbook? And you could walk around and see. We've now enabled that classroom management across digital devices in a way that is actually, I think, really helpful for finding best practices, Alex. Because I can look and say in a district, hey, a really experienced teacher might use two to three digital resources within a classroom period, and other teachers might use a lot more. In this world where we're saying maybe we're spending too much time on digital devices in classes, it's a way to actually understand what the best practices are.

So we've kind of wandered across like filtering and wellness and classroom management. But back to your kind of question, they're all tied together now. 

[00:12:02] Alex Sarlin: Yeah, they are. 

[00:12:03] Tammy Wincup: I think what's really changed is that, even two to three years ago, we did a disservice to districts as an industry. We came from an ed tech industry and said, we're gonna give you a gazillion single point solutions

[00:12:17] Alex Sarlin: and we're 

[00:12:18] Tammy Wincup: not gonna connect any of them.

Yes, you, K-12 educator, administrator, you're gonna have to digest all of them. And what we at Securly, I think, have successfully done is say, no, no, no. This is a spectrum of digital safety, and it's our job to put it together to provide a holistic picture in a district.

[00:12:39] Alex Sarlin: A hundred percent. And that holistic picture can come from both sides.

It can be like an Okta-style dashboard where students or educators can go to everything they need. But it can also come from the data, as you're saying. I mean, one of the interesting double-edged swords of monitored technology is that it is surveillance. I mean, not in a bad way, but it's watching.

It's making sure that students don't go to inappropriate sites. It's making sure that students are digitally well, that they're on target with their academics, and that their parents have access to what they're seeing, and all of that is very safe and strong. It also sort of creates a different feeling in terms of the relationship with the technology. But it gives all of us in edtech a huge amount of data.

And if it is centralized, if you do have a technology solution where people are using the different tools in the same place, where the same devices, whether they're at school or at home, are still feeding into dashboards that can be accessed by teachers and parents, you can start to see these incredible patterns.

We mentioned 22,000 K-12 schools in the intro, and that's about 40%, I believe, of US schools. That is a huge number of schools. So Securly is in 40% of all US schools, and you're seeing these digital wellness, safety, privacy, and usage patterns at scale. I'm curious what you've been seeing as you look across these different types of information, the classroom management, the digital wellness check-ins, the filtering. What are students trying to get to? And of course, AI is a new part of this equation.

What are you seeing across 40% of US schools that you think might be interesting for our listeners to have in their minds as they think about EdTech? 

[00:14:16] Tammy Wincup: Great question. And I do think the view that we have is actually really interesting, especially right now in this conversation in the US about decisions being shifted down to states.

[00:14:28] Alex Sarlin: Mm-hmm. 

[00:14:28] Tammy Wincup: And specifically around school choice and the potential for more homeschooling or ESAs or things like that. I think there's a misconception that some of this safety view is just for, like, big districts. One of the things that we know is that it is often in some of our smaller districts, or homeschools or cyber schools or digital learning schools, that we actually see some of the best practices and most interesting work.

[00:14:55] Alex Sarlin: Really interesting. 

[00:14:56] Tammy Wincup: It's not just about the number of students you have, it's really understanding how we're using technology. So that's one. I think that this becomes really relevant, not just. As we have big districts, but also we have small micro schools, it becomes really important to look at too. So that's number one.

[00:15:11] Alex Sarlin: Yeah. 

[00:15:11] Tammy Wincup: On the trend side of the house, we're really paying attention to three things. We talked about one, which is what happens when you take the smartphone out of the school: how does a district react to what they then have as the responsibility of ensuring happens or doesn't happen on the school-issued device?

And we talked about that. The second is certainly that we have more data on what is happening on school-issued devices around social media, right? It has been around, we see it, and at this point what we're seeing is that about 80% of our schools block all social media at the domain level. That is not surprising.

That trend has gone up over time, I think as people have recognized the implications of spending too much time there, and we are seeing that happening. The new trend, and where we are really focused at Securly, is certainly on what I believe will become like air, which is AI,

[00:16:08] Alex Sarlin: and I 

[00:16:09] Tammy Wincup: have become just a real advocate, in telling K-12 schools, working with CTOs, and really sharing with superintendents: please don't block AI.

I think we are about to go into a school year where, over the last year, what we saw was a huge number of schools just blocking the LLMs at the domain level for students. And clearly what we also saw was that that was one of the biggest areas being circumvented. So those two don't go together.

The second thing is we also, though, believe that AI and virtual assistants and chatbots need to just be another vector

[00:16:51] Alex Sarlin: Yep. 

[00:16:52] Tammy Wincup: that districts trust Securly to help put guardrails and transparency around. And so that is where we have really doubled down in the last six months, and now we are able to provide that, very similar to a filter perspective,

applied to AI specifically: to LLMs like Gemini or ChatGPT or Claude; we have chosen a few that we see usage on. We will also, next school year, be able to do that within the actual apps, the edtech apps that we are seeing gain traction. And why I think that's so important is that we need to be responsive to the executive orders and to all the states that have actually said we need to be

helping students understand how to use it, and also when not to use it. And the only way to do that is to actually have some transparency into it. The LLMs right now are not very good at providing transparency. Securly is really good at it. And so that is an area where we are spending a lot of time.

And again, to me, that is the baseline of what a district should expect from Securly: as new technology vectors come into play, we should continue to be the trusted partner that says, okay, you wanna put guardrails around it, you want to be sure students aren't putting inappropriate prompts in or getting inappropriate answers back. Fine, we can help do that. But as importantly, within our classroom management, if you wanna be able to see, hey, what percentage of the prompts that my students put in in the classroom were around the topic we were talking about?

[00:18:35] Alex Sarlin: Right. 

[00:18:36] Tammy Wincup: Let's provide some of that information in a useful way. And candidly, I think this is where big tech needs ed tech.

[00:18:43] Alex Sarlin: Oh, for sure. 

[00:18:43] Tammy Wincup: Because I think that even if the LLMs choose to do it themselves, they were never created for, nor will they ever focus exclusively on, K-12. I think we've seen that with YouTube. We've seen that with other things. And so I think there will always be a really important role that edtech plays in that.

[00:19:03] Alex Sarlin: I wanna stay on this topic, 'cause this is so rich and so interesting, certainly to me, and I think to a lot of our listeners as well. Because as you say, historically it reminds me of

Wikipedia. Remember the early days, when Wikipedia was considered this go-to solution for students when they wanted to quickly get information about something, and nobody knew what the rules were gonna be? Some teachers agreed with it, some didn't. But schools had to decide whether Wikipedia was allowed, whether it was something people could look to, which seems so quaint and so silly now, looking back on it. But really, we're going through the same thing again. You know, we know how powerful AI is. It can answer any question. We know that, and we know that the frontier model providers, the Googles, the Anthropics, the OpenAIs, can't focus on the K-12 user as their core user, right?

They can't say, when you first arrive on the Claude homepage: are you 13 or under? If you are, we have to deal with you in this certain way. I mean, technically you're not supposed to be on these things at 13 or under at all, but they're not gonna build out that use case fully. They don't understand it fully.

It's not their core audience, just like with YouTube versus YouTube Kids versus YouTube schools. So intermediaries like Securly, anybody who can help make sense of it (and I say anybody, but there aren't that many companies or tools that can do this), can help make sense of the AI usage.

Say: don't ban ChatGPT, don't ban really powerful edtech tools that use LLMs. Allow them, but monitor them, make sense of them, make sense of how students are using them. React, not in a bad way, but instead of just shutting the whole thing down as an overreaction, allow people to learn from it, allow them to use it, and then see if things are going off the rails in ways that are predictive of problems or of distraction or of anything else.

I respect it enormously, and I think it's exactly how that world ought to work. And your point about big tech needing ed tech, let's unpack that, 'cause I totally agree. Having talked to the Google people, having talked to some of the top OpenAI people about this, they know it. They just do.

Everybody's trying to figure out how to put the pieces together. 

[00:21:15] Tammy Wincup: They do. And look, I'm just really grateful that we have really, really great relationships, I think, with the Googles, Microsofts, Apples of the world, et cetera. Right. And they all take a different approach to it. Yeah.

[00:21:24] Alex Sarlin: So 

[00:21:25] Tammy Wincup: in some ways they're letting us hack on top.

But because, you know, I think in general the safety industry, the digital safety industry, is one that we know assists K-12 schools in a particular way that we need. I think there is one topic, though, that is an extension of this, that I am really, really focused on, and Securly has always been really focused on, but I think it requires an even deeper doubling down.

So one of the things across this ecosystem, right, we talked about how a decade ago everybody was like, well, I have pieces of this. K-12 district administrator, overtaxed CTO, director of technology: you deal with how to deal with it. Certainly we are seeing these trends in states, which I take really seriously, around student privacy, and I wanna talk about that, because I think it's a super important piece for us within the safety world to master. The digital footprint, we know, is really important in telling us things about how students are learning, how they're spending their time, and where they potentially need support. On the flip side, they're kids, and they make mistakes, and we don't need those pieces following students unless the district or the parent makes that decision. So one of the things that you see happening in states, Ohio, Wisconsin, others, is this idea of what needs to be anonymous, right?

Like what needs to be unmasked. And we take that really, really seriously. We have an enhanced privacy mode across our products that we encourage districts to turn on, regardless of what state you're in, to mask those identities and be able to look for the trends. So you can see the trends without seeing individual students, right, until there's a cadence of self-harm or violence or bullying that would allow the correct administrator to actually unmask that.

And unfortunately, that happens too much, right? In the sense that every week there are students that are crying for help within their digital footprint, specifically around suicide, that we have been a part of helping to stop. Our North Star is looking for those trends, not just when we're in the moment, but frankly when we see them over time. And I think that requires a district to actually want that information. Sometimes we see that they do, and sometimes we see that they don't. And so the ability to say, look, this is not gonna go in the SIS, this isn't gonna go in some permanent record.

How do we support these students in their academic learning? Our job is to follow the policies of the district and the state. Our job is to enable whatever policies they and their school boards and their elected officials have come up with to support those. But always putting the student first and having the student voice there.

And I think that that is a shift that is really important and that everybody should be holding the entire industry up to. 

[00:24:31] Alex Sarlin: Yeah, let's dive into these ideas. I think these are so important. It's funny just to hear you evoke the permanent record, right? The classic threat about the permanent record. Because in the age of AI, the concept of a permanent record is one that I think we actually haven't really gotten our heads around: the idea of, you know, what activities a student engages in online, or in their digital tool suite as a student. I totally agree, you know, some aspect of it should be anonymous, even though it's surveilled,

even though it is monitored, right. You know, if a student is on their school device and they're Googling things about self-harm, or if they're bullying other students on social media, it's not like we wanna turn a blind eye to those things, although maybe some state policies do that; they don't wanna sort of own it. But generally we don't want to turn a blind eye to that. At the same time, we don't wanna over-index on every student doing exactly the right thing all the time with their digital devices, because they spend all their time on them: I think the average is eight and a half hours a day on a digital device for teenagers in 2025, and probably a good percentage of that is on your school-issued device. You know, the idea of watching for any search, any prompt in an AI, that might be problematic, and being ready to pounce, creates a really strange vibe. It just feels like a police state.

But at the same time, we don't wanna ignore these things. And if there are patterns of self-harm, patterns of bullying, if there are, you know, rising crises, which absolutely, to your point, happen all the time, because this is a technology-enhanced world and everything that happens happens through technology, we can't turn a blind eye to it.

So it feels like you're thinking in a very nuanced way about where that line should lie. Between anonymous, you know, we're watching, but we're not waiting to pounce on you, we're not waiting to flag something, we're not waiting to add something to a permanent record, whatever that means right now. But at the same time, what signals do push it over the line, so that as responsible adults we have to, you know, jump in?

Tell us more about how you find that line. That feels like the question of the age. 

[00:26:35] Tammy Wincup: And look, everyone interprets it differently, which is why I think it's important that we talk about it. You know what's fascinating is technology plus human in the loop is actually really good at this. 

[00:26:46] Alex Sarlin: Yeah, right? 

[00:26:46] Tammy Wincup: Because, if you think about a digital footprint, and we're just talking about our wellness piece right now, so not the overarching filtering and not the classroom management, but what we call Securly Aware.

Along with our on-call team. When those two things are paired together, right, the digital inference of looking for pattern recognition around self-harm, violence, behavioral threat, and cyberbullying is actually really good.

[00:27:14] Alex Sarlin: I'm sure 

[00:27:15] Tammy Wincup: it doesn't mean there aren't false positives and negatives; there always will be. But when you do that, and you add a human view on top of it for just the ones that need it, it can look at timeliness quite well. Going back to privacy, we're always struck by others in the industry that are, like, saving those images and having them reviewed by people, or looking at everything, or, you know, alerting every time there's the word suicide. To me, the industry needs to shift away from that, and the differentiation that CTOs and districts need to see on that is real.

I think oftentimes everybody kind of says, oh well, everybody's equal in the industry. The reality is, how you use the tech is very different. And this inference matters: we had a real student who was working in a Google Doc, looking at suicide-related content, and then spent 45 minutes on a math problem right afterwards.

[00:28:15] Alex Sarlin: Hmm.

[00:28:16] Tammy Wincup: If we had alerted and sent the world in within those seven minutes... because you know how many times people say, I wanna kill myself working on this problem, right? We have to actually have inference around the whole picture versus just over-alerting. It's kind of like crying wolf. We have to strike a real balance around being able to do that.

Well, nobody gets it right all the time, unfortunately. But we do have, I think, to use the LLMs and the inference engines to look for pattern recognition, and that's actually where technology can be very, very helpful. 

[00:28:52] Alex Sarlin: Yep. 

[00:28:52] Tammy Wincup: What we always suggest, though, is: look, we are burdening administrators and teachers, many of whom came in for the academic teaching and learning side. And so often what we hear is that what's happening right now on the safety side feels like one more thing.

[00:29:10] Alex Sarlin: Yeah.

[00:29:11] Tammy Wincup: One more thing that K-12 districts are being asked to take on. And so our job is to make that easier. Our job is to help take some of that burden on, including after hours, so it's not one more thing that a principal has to deal with by themselves.

[00:29:27] Alex Sarlin: Right? 

[00:29:28] Tammy Wincup: And that I think is true across the board.

[00:29:31] Alex Sarlin: So one of the byproducts of this technological age is that so many things can happen in so many different places. The idea that a student in school could be engaged in social media bullying or self-harm raises this open question; I mean, it's not open in terms of policy, but it's open in terms of our whole understanding of how the world works right now: is that the school's responsibility, right?

And I mean, in many cases it has to be, or it should be. And in others, it's a little hard to know because it's like, are they on their personal device or the school device? If they're on the school device, you know, are they doing the workarounds and finding ways into sites that they're not supposed to be into?

And if they are, is there responsibility there? This whole world gets very legalistic, and it gets very moral, right? Because obviously, if you have one dashboard about which of your students are at risk for suicide and another about how they're doing on their math standards, one clearly takes precedence.

But should it, for a teacher, to your point? If you're the student's teacher, your core goal is making sure that they're learning, but of course their mental health is also part of it. Even talking about it makes my head hurt, and I'm sure that teachers, educators, administrators, principals, and school leaders are looking at this world where consumer tech, and ed tech to a lesser extent, have infiltrated our students' lives in this enormous way. And now we are catching all of the complexities that come with that, in terms of data, privacy, interaction, threat, and self-harm.

And I think people like yourselves, with what you're doing with Securly, and some others in the ed tech space, can take some of that burden off of them: flag things that are predictive or problematic, or trends they should keep an eye on, without keeping them buried in the muck of digital wellness, which is at least as complicated as education itself.

[00:31:31] Tammy Wincup: It is, and in some ways I think we have to give less data, right? This industry kind of started with, let's just flag everything: everything on the filter, everything in classroom management. We have taken a really different approach, which is, let's give you a 60-day rolling average, right?

Like, let's look at who is sitting in those spaces, not just once, because otherwise you're just constantly reacting. You're a teacher constantly reacting to what's happening in the hallways, what's happening in your class, what's happening outside of there. And I think we have to pull up and actually ask, okay, what's the trend?

Who is in this bucket over the course of a month or two months, and how do we then tell the counselor or the teacher, this is where you should pay attention? It's like offense versus defense, right? Everyone's playing defense with all of this information, and I think our job, from a technology and ed tech standpoint, is to pull back and let them play offense on safety and wellness.

And so that's certainly one of the areas that we are really focused on. 

[00:32:39] Alex Sarlin: In your offense-defense metaphor, there's also nuance there, because I would imagine that some schools, in the original AI moment, said: AI is here, it can answer questions, students are gonna use it to cheat, so we're banning it.

No ChatGPT, no Gemini. They probably considered themselves playing offense, right? That's what gets tricky about this moment. Getting ahead, quote unquote, of problems like that by doing blanket bans is a version of offense, but it's too much. I think it's too much because it's denying students the ability to use incredibly powerful tools that are gonna shape our future.

But how do you help schools decide how much offense to play? Where to get proactive and get ahead of trends, and where it would become too much, where you're banning too many tools or too many technologies.

[00:33:26] Tammy Wincup: Well, what's fabulous now in this kind of third movement of technology, right?

I started in technology in Web 1.0, and we saw this movement there. We saw it in the move to cloud-based technology. We saw it in social apps and social media. And now we're in AI, and my deepest hope for education is that the time it takes us to digest those movements shortens. We have a lot of the lessons from Web 1.0, cloud computing, and social media to apply immediately here.

[00:34:03] Alex Sarlin: Mm-hmm.

[00:34:04] Tammy Wincup: And I think in general, forward-looking CTOs and superintendents are doing that really successfully. Our job is to support them in that as we go into the '25-'26 school year. We have a great advisory board, and everyone on it said that next year is the year everyone rolls out acceptable use policies for students and teachers.

Again, our job at Securly is to make sure that we're providing the training wheels, if you will, the guardrails around transparency, so they can experiment. And I think in many cases districts are gonna get it right, and in some ways the ability to do that faster is gonna be forced on us.

Right. It's gonna be forced on us; even the work that we are doing around our Securly AI chat is gonna have to evolve every three months as we see what's happening. That's our job: to make sure that that happens. But in general, I think for the first time, probably since cloud, CTOs and districts are going to be back at the table from a strategy perspective. In some ways, my experience has been that in between these big movements they became kind of purchasers of technology. Now, the more that I talk to great directors of technology in K-12, the more they are becoming strategists again.

And I think that's exciting. We have these fabulous people in these districts who were kind of the help desk in between these movements, and now they're back at the table really saying, here is how we roll out this information, here's how we have to look at safety. They become partners with counselors.

They become partners with principals in a different way than when they were just trying to put hardware into kids' hands. 

[00:35:57] Alex Sarlin: Yep. 

[00:35:58] Tammy Wincup: That's an exciting moment for education. 

[00:36:01] Alex Sarlin: It is exciting. I think it's exciting for the strategic thinking within technology departments, within school teams, to have to think about technology's changing role yet again, the changing role technology is gonna play in the classroom.

I have a super high-level question for you, and I am so curious about your reaction to it. In this sort of second wave you're talking about, the social media, user-generated-content, YouTube, Web 2.0 wave.

One of the things that was very clear is that you had some consumer applications that became really toxic, let's put it that way; ones that don't have a great role in school. We could talk about them for ages, but let's just say there's a bunch of them. And as we learn more about the effects of social media, people are more and more, as you mentioned with the 80% number, starting to ban social media.

Now there's more and more realization that even regular social media, the Instagrams and TikToks of the world, can have really bad effects. AI is at this really strange moment where our current federal government is basically trying to deregulate it, to keep it as deregulated as possible.

There's a bill in Congress right now. By the time this is published, it will either be law or not. That basically says states can't regulate AI for the next 10 years. Now, if that's true, then the consumer side of ai, the AI girlfriend apps, let's put it that way, right? Among many other use cases are going to get incredibly popular and really off the rails.

Meanwhile, the AI ed tech world, as we talk about on the show all the time, is going to get incredibly powerful and amazing, right? Hopefully more amazing than not; I'm very optimistic about AI in ed tech. So: really wild-west, crazy uses of AI in consumer and social media, and these really powerful uses of AI in ed tech.

How can these strategy meetings you're mentioning draw the line in a way that allows the positive ed tech use cases and really gets ahead of the negative ones, so that you don't have the social media moment happening again in five years, where people go, oh my God, students are spending six hours a day on AI apps that have nothing to do with school, and it's hurting their mental health, and so on?

Like how do we get ahead of that moment? 

[00:38:12] Tammy Wincup: It's a great question. I'm not sure I have a perfect or even a well-thought-out point of view. What I would share, though, is on the safety and wellness side: I look at prevention all the way over to emergency response and recovery, right? So you've got prevention, you've got inference, all the way to emergency response and recovery. On the safety side, it transcends digital to physical safety, right? Those are the pieces we're talking about: what does the digital footprint tell us about physical and mental safety? And I do think that the definition of digital citizenship we learned through social media has to evolve fast.

[00:38:59] Alex Sarlin: Yes. 

[00:38:59] Tammy Wincup: Right. And so, again, not with my Securly hat on, but with my parent hat on,

[00:39:07] Alex Sarlin: Right.

[00:39:07] Tammy Wincup: You know, as a parent of three: this is not something we can just throw at schools and say they need to solve it.

[00:39:12] Alex Sarlin: Right, right. It shouldn't be.

[00:39:14] Tammy Wincup: And this is why I go back to this idea of our home app, and to the fact that every district and state is gonna do this differently.

I think what we saw with social media, in filtering and classroom management, is that a district communicating its norms to the parent and caregiver community becomes really important. And I think you begin to see the same parental-rights conversation coming back from the AI perspective.

[00:39:44] Alex Sarlin: Mm. Interesting. 

[00:39:46] Tammy Wincup: Schools are gonna set their guidelines around what tools you should be using in school. But that is going to be totally different outside of school. 

[00:39:59] Alex Sarlin: Yes. 

[00:39:59] Tammy Wincup: And so I think, in the same way, what the school is able to box in is gonna be different from what happens outside.

Yeah. And we know this, so part of this conversation has to happen at the beginning: hey, we can have a student acceptable use policy within schools, but communities and parents, and everybody else, you should set one too. We learned that on the social media side of the house.

And so I think that's a conversation that is hopefully gonna happen earlier this time. But again, this is like a trend: students are gonna find these things way earlier than us parents and adults. And so part of this is, where's the dialogue around prevention and digital citizenship now?

Yeah, I think it's just gonna evolve, but it's gonna happen fast, 

[00:40:51] Alex Sarlin: no question. 

[00:40:52] Tammy Wincup: And that is gonna be exciting, and it's gonna require a lot of us, unfortunately, as communities.

[00:41:00] Alex Sarlin: A hundred percent. And it's gonna require a level of agility from us in the ed tech space and the tech space that I don't think we're used to.

So, just a quick anecdote before I know we have some final questions. I was talking yesterday to a few different companies that do co-design, that basically work with students and educators to design products together. One of the things they brought up had never occurred to me, and I'm sure it's super relevant to you; you probably know a ton about this. There are certain phrases on TikTok that everybody under the age of 18 knows exactly what they mean, but a classic monitoring program might not, because they don't use any traditional trigger words.

This was true throughout history: kids would write notes to each other full of slang, and when the teacher asked them to read the notes aloud, the teacher wouldn't know what the slang meant. So this is not a new problem, but the scale of it is enormous. I'm curious, let me just ask you quickly before our final questions.

Do you see a world in which Securly is actually bringing in real-time data about what's happening in the world and on social media, so it can be agile in what it's monitoring, in terms of AI use cases and usage?

[00:42:08] Tammy Wincup: I mean, we have to do that now. We are constantly looking for that across social media and across sites: what are the key words? It's very easy to say we should be across everything, but we focus very much on self-harm, violence, and bullying, right? Those are the areas where we are looking for inference. The reality is you could go really, really deep across other areas, but we have to focus there, because we think those are the areas that make the most sense for schools. And it's hard to keep up, right? We're not gonna get it a hundred percent right all the time. But I can tell you technology can be very helpful in that, versus me as a parent just needing to know this is 

[00:42:52] Alex Sarlin: a great use case for AI, right? Watching TikTok all the time and saying, hey, this phrase means self-harm; you might not have heard of it, but it does. It's a great use of AI, and a great use of the human-in-the-loop version of AI that you mentioned as well: you could flag possible things and then have teachers, school leaders, or tech administrators be alerted.

So we have our final questions and I'm very curious about your answers. I know we're almost at time here. What's the most exciting trend you see in the EdTech landscape right now that you feel like listeners should keep an eye on? You have a long history in EdTech, so I'm sure you have a lot of ideas about this.

[00:43:28] Tammy Wincup: Well, you know, it's changing. One we've already talked about a little bit: it won't happen overnight, but I do think where a lot of these education-specific AI tools will get us, if used correctly, is bringing the teacher back to what I think many teachers want, which is that relationship with a student. To help on critical thinking, to help digest information, versus just rote memorization. We've been talking for a long time about how we move to critical thinking and away from rote memorization and testing. I think this is the moment where, if we seize it correctly, we will get there.

[00:44:06] Alex Sarlin: AI can be Socratic, right? AI can force you to think deeper. We just need to tell it to do that. 

[00:44:11] Tammy Wincup: You know, if you look at the WEF report that came out this week on the skills that are most gonna be needed, they are the skills that point to where I think teachers in general wanna go, which is not just memorization or skill development, but what do I do with those skills? Number one, with my previous investor hat on, I think a lot of these tech-enabled services, like high-impact tutoring and things like that, will evolve greatly. 

[00:44:38] Alex Sarlin: Hmm, 

[00:44:39] Tammy Wincup: To provide the human relationship, while the actual tutoring, done virtually, will get much better. And so it will put the humans back, I think, teaching the value-added skills on top of the information, which will be really great. Those are two areas. One area that I have actually seen, which is somewhat related to Securly, is this real connection between education and health. I think it's just gonna get tighter as we think about that loop of what we're asking K-12 schools to do; it will continue to get tighter, and we'll be better.

[00:45:16] Alex Sarlin: That ties into the idea that if AI is doing more of the administrative work, more of the rote grading, more of the tutoring, and teachers in the classroom are doing more of the actual deep thinking and relationship building, getting kids to really understand themselves and understand the world, which I think is a really great vision for the future of this.

Then the connection between education and health is huge, because what they're doing in the classroom is not just getting facts, and not just learning procedures. They're actually becoming people again, which is something K-12 has not always focused on; unfortunately, frankly, it hasn't had the time to focus on it for a long time.

So this is, frankly, for me, a utopian vision of AI, where AI gets so much of the rote pieces, the administrative work, out of the way that teachers and students can focus on figuring themselves out, working together, learning from each other, understanding the world together: all of these really exciting ways to look at things.

And then of course, health and digital wellness is a huge part of that. What are some resources you would recommend for people who want to go deeper into ed tech? You've been in this space a lot. 

[00:46:20] Tammy Wincup: One of the things I really appreciated when I was at Protocol, looking at broader tech and policy across all these vertical kinds of tech, whether ed tech or health tech, is that I don't think we in ed tech look enough at consumer trends.

So there are two folks that I love. Mary Meeker came back last week to produce the AI report out of Bond Capital, and every year I find it to be really, really important; if we look at it through an education lens, I think we can get ahead.

Same with Benedict Evans. Following the trends they see on the consumer side of the house, and actually asking ourselves what that means for K-12 and for education: we don't do that enough. We stay in our ed tech bubble and look only at what's happening in education.

I think those are two great resources. And then certainly our mutual friend John Bailey, who I think is just phenomenal at digesting that transition between what's happening in consumer tech and what it means for education. He's a great friend, but also somebody who has really experimented with a lot of the AI models in a way that many of us in ed tech have not yet.

[00:47:36] Alex Sarlin: Yeah, fantastic. As always, we'll put the links to all of those resources in the show notes. The Mary Meeker report just came out the week we're recording this episode, and it is 370 slides of... 

[00:47:48] Tammy Wincup: Yeah, still digesting it. But it's pretty awesome. 

[00:47:50] Alex Sarlin: Yeah, you have to get AI to digest it. Thank you so much. This is Tammy Wincup, the CEO of Securly, which is doing amazing work in digital safety and wellness, as well as classroom management and AI chat, with 40% of U.S. K-12 schools.

Thank you so much for being here with us, and I'm looking forward to our next conversation. 

[00:48:10] Tammy Wincup: Me as well. Thanks again for what you do. We'll talk soon. 

[00:48:13] Alex Sarlin: Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.
