
Podcast: Who watches AI watching students?

A boy wrote about his suicide attempt. He didn’t realize his school's software was watching.

June 29, 2022

While schools commonly use AI to sift through students' digital lives and flag keywords that may be considered concerning, critics ask at what cost to privacy.

We meet:

  • Jeff Patterson, CEO of Gaggle
  • Mark Keierleber, investigative reporter at The 74
  • Teeth Logsdon-Wallace, student
  • Elizabeth Laird, director of Equity in Civic Technology at Center for Democracy & Technology

Sounds:

  • "Your Heart is a Muscle the Size of Your Fist" from the band Ramshackle Glory's 2011 album Live the Dream.
  • "Spying or protecting students? CBS46 Investigates school surveillance software" from CBS46 in Atlanta, GA on February 14, 2022.
  • "Student Surveillance Software: Schools know what your child is doing online. Do you?" from WSPA7 News in Greenville, SC on May 5, 2021.
  • "Spying or protecting students? CBS46 Investigates school surveillance software" from News 5 in Cleveland, OH on February 5, 2020.

Credits:

This episode was produced by Anthony Green and Emma Cillekens with reporting from Mark Keierleber. It was edited by Jennifer Strong and Michael Reilly, and mixed by Garret Lang with original music from Jacob Gorski. Art by Stephanie Arnett.

Full transcript:

[TR ID]  

Jennifer: Did you ever use an excuse to get out of homework? 

One day after school my daughter couldn’t open hers…  and she was pretty sure the school’s software was to blame. She said, ‘maybe it found words in the assignment that aren’t allowed, and the software took it down’.

Her excuse wouldn’t have worked when I was a kid, but these days it’s pretty common for schools to use software to sift through their students' digital lives… looking for keywords of concern… like the word, “kill”. 

But the most common way that word is flagged… is with the book, To Kill a Mockingbird.

So, context is king… and it’s not something algorithms are necessarily great at.

In her case, the class was learning about what it’s like to be a girl in Afghanistan… and the concerning key words that were flagged were real… just not for reasons that software can address.

WSPA 7News Anchor: Violence, suicide, child pornography; you might think these are all very "adult" issues, but schools are catching a growing number of red flags for questionable content on students' computers. 

CBS 46 News Anchor: Spying on students or protecting them? A new type of technology lets Metro Atlanta schools monitor your kids on school-issued devices. But some say the software is too intrusive, allowing teachers access to students’ personal and private thoughts. 

ABC News5 Cleveland Anchor: Mike can't directly monitor apps like Facebook, Instagram and Twitter, only parents can do that. But Mike can keep an eye on email and what they do on school computers. 

Jennifer: But it’s not just school devices. For example, if your kid’s school is using this software and you happen to open class assignments on your phone, (like many parents do in order to help with homework)… well, it’s probably running there too.

And while advocates praise this type of technology for catching risky behaviors... critics ask: at what cost?

I’m Jennifer Strong and, this episode, we ask: who’s watching the machines watching our kids?

[titles]

[music]

Jennifer: Schools in the US get money from the government to pay for things like internet access and computers. And in return, they’re required to have a plan to keep kids safe online.

A law called the Children’s Internet Protection Act makes schools responsible for protecting kids from harmful content while they’re getting an education. 

But more than 20 years after it passed, how to achieve that in practical terms remains a big open question that a variety of startups and other tech companies are trying to solve.

One of them is Gaggle… it’s currently used in about fifteen-hundred school districts… and it uses AI and machine learning, along with human reviewers, to monitor content and flag things that could be concerning on school accounts and devices…

Jeff Patterson: So Gaggle’s an early warning system to detect kids in crisis before a tragedy happens. So we're monitoring the school's digital platforms, which is mostly Google Workspaces and Office 365. And we're looking for any indications of bullying, threats of fighting, bringing weapons to school or school violence.

Jennifer: Jeff Patterson is its founder and chief executive. 

Jeff Patterson: Last school year, we pulled in over 10.1 billion items. 200 million of those had concerning words or phrases like “gun” or “shoot.” Then we use lots of different machine learning algorithms to eliminate things from that list that aren't relevant. Obviously “I'm going to go shoot hoops” is nothing to be concerned about, but that still left 40 million items that had to be human reviewed.

Jennifer: The review happens in a few stages… and the first provides a quick label of good - or - bad. 

Jeff Patterson: And they're making very fast decisions. Then there's our second level of highly trained people who are then looking at it, classifying it as how urgent is it? What type is it? Does this warrant an email or does it warrant more of an immediate phone call?  

Jennifer: It’s a battle of speed and accuracy. One, he admits, Gaggle can’t always win.

Jeff Patterson: And the reality is we can't find everything possible because that would be too costly. Right. We'd have to have 10 levels of human review. And that is obviously not something that can be afforded.
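To make the funnel described above a bit more concrete, here is a minimal sketch of how a keyword-flag-then-filter pipeline might work in principle. Everything in it (the keyword list, the "benign context" patterns, and the function names) is invented for illustration; Gaggle's actual models, rules, and review tooling are not public.

```python
import re

# Illustrative sketch only: a toy two-stage funnel loosely modeled on the
# process described above (keyword match, then an automated relevance filter,
# then human review). The keywords and benign-context rules are invented for
# this example and are not Gaggle's actual system.

CONCERNING_KEYWORDS = {"gun", "shoot", "suicide", "kill"}

# Stand-in for the machine-learning step: patterns that usually mean an
# otherwise-flagged keyword is harmless in context.
BENIGN_CONTEXTS = [
    re.compile(r"shoot\s+hoops", re.IGNORECASE),
    re.compile(r"to\s+kill\s+a\s+mockingbird", re.IGNORECASE),
]


def stage_one_keyword_match(text: str) -> set:
    """Stage 1: flag any item containing a concerning keyword."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & CONCERNING_KEYWORDS


def stage_two_context_filter(text: str) -> bool:
    """Stage 2: drop the flag when the text matches a known benign pattern."""
    return not any(pattern.search(text) for pattern in BENIGN_CONTEXTS)


def needs_human_review(text: str) -> bool:
    """An item reaches human reviewers only if it survives both stages."""
    return bool(stage_one_keyword_match(text)) and stage_two_context_filter(text)


if __name__ == "__main__":
    print(needs_human_review("I'm going to go shoot hoops after school"))  # False
    print(needs_human_review("My only choice left is suicide"))            # True
```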

Jennifer: Patterson founded the company almost 25 years ago. He still owns 100% of it and he’s extremely mission-driven. He told me his product prevented fourteen-hundred student deaths by suicide just last school year.

Jeff Patterson: A middle school boy pulled out his Google Chromebook and started creating a Google Doc. And the document said, “I'm tired of faking my feelings. I've got no one who loves me. Not even my family, my only choice left is suicide.”

So what we do at Gaggle is we sit behind the school's digital platform, the tools, Google, Office 365 and others. We're pulling in all the email, the Google Docs, the chats, everything the kids are communicating on. We run it through technology that highlights things that are concerning. And then we have a team of people that's working 24 hours a day that's reviewing those items. And clearly this was a credible threat. So we called our emergency contact, who is the school principal.  

Jennifer: The principal called the boy’s home and the assistant principal called the police… and this is controversial in a few ways.

It’s a Friday night… and a boy typed words to himself in his bedroom. And the software can flag these words in real time, even if they aren’t sent or shared with anyone else.

And now, the police are involved. 

Jeff Patterson: When the principal got a hold of the parents, the parents said, “Our son is fine. We just finished dinner. He's upstairs in his room.” But when they went to look, he wasn't there. And that's why it was important that the assistant principal got a hold of the local authorities. Because in this New Jersey town, they knew where to go. The trains run through on their way to New York City. And there's a spot where two kids had previously jumped in front of the train and sure enough, they found this boy walking to that spot… and it all happened in 15 minutes.

Jennifer: He says he encourages schools to tell parents and students about the software… but not all of them do, or they don’t do it in a way that ensures families fully grasp what’s going on. 

Like, if school email or messaging apps are used on a cell phone, then the software is running on that device. It’s also pretty common for parents, including me, to sign in to Google Classroom on a phone and check kids’ homework.

Meanwhile, the threat landscape that companies like Gaggle are supposed to keep up with is constantly changing... Like TikTok challenges that encourage kids to act out in certain ways… 

NBC News Anchor: You may have seen these images on your social media feeds in the last few days. Bathrooms damaged. Toilets broken. Even keyboards in urinals. It’s part of a new viral challenge on TikTok.  It is called the devious licks challenge and apparently encourages people to vandalize their schools.

Jeff Patterson: So we had to suddenly start creating systems to identify those TikTok videos on threats of school vandalism or slap-a-teacher. 

Jennifer: And there aren’t set rules for how to go about… any of this. 

Jeff Patterson: There's no guidebook to hacking your way through the jungle. What we do is when we find something concerning, we call our contacts or email our contacts at the school system. The educators decide what is the appropriate response to all these situations and sometimes they don't make the right decisions, but that's an opportunity for us all to sort of help them make those right decisions.

[music/chapter change]

Mark Keierleber: So for the last few years I've been reporting on America's nearly $3 billion school security industry, which includes a range of products pitched as school safety solutions, such as cameras, metal detectors and digital tools that monitor students' behaviors on social media.

Jennifer: Mark Keierleber is an education reporter for the nonprofit news site The 74. He was covering the Minneapolis school district as it cut ties with police after the murder of George Floyd.

Mark Keierleber: I was reporting on that change when I learned that the district had spent more than $355,000 to contract with Gaggle. I submitted a public records request with the Minneapolis school district to try to learn, you know, how one school district had been implementing this tool. So through a public records request, I received roughly 1,300 incident reports over a six-month period that really gave me a pretty unprecedented window into how Gaggle monitors kids on their school-issued technology. If kids are talking about feeling depressed in a journal entry, for example, it could send an alert to the school district saying, hey, this kid may be exhibiting signs of self-harm or suicide. And it's ultimately up to the school districts to decide how to respond to the materials that Gaggle sends them. 

Jennifer: For him, it raises significant questions… about not just student privacy, but also whether a surveillance system is really the most appropriate or effective way to identify children who are struggling with their mental health.

Mark Keierleber: Kids need to have a certain level of trust in order to, to come forward with adults and say, “Hey, I'm, I'm experiencing this problem.” Well, if they feel like they're being surveilled, and that they're being monitored and could get into trouble for the things that they're saying or doing, that they may be less willing to come forward with some of those issues.

Teeth Logsdon-Wallace: I'd call it spyware, but that's kind of more of a personal feeling about it. Like, the school never told students that this is a software that was on our computers. 

Teeth Logsdon-Wallace: Hi, my name is Teeth Logsdon-Wallace. I'm 14 and I'm an eighth grader in Minneapolis.

Jennifer: Mark introduced me after they met while he was reporting this story. The eighth grader has struggled in the past and, in 2020, attempted suicide. 

Teeth Logsdon-Wallace: So I did go to the hospital and then I went to a like program that was like a month and a half long. It was a partial hospitalization program. And that was very helpful and I am definitely in a much, much better place now mentally. And so fast forward to, like, September 2021 when, in social studies, my teacher asked, as a way to get to know all of his students, for us to make a presentation sharing like pieces of media that are important to us. And so one of the things I chose was this song: Your Heart Is a Muscle the Size of Your Fist by the band Ramshackle Glory.

[Upsot: Your Heart Is a Muscle the Size of Your Fist by Ramshackle Glory]

“Dalia never showed me nothing but kindness

She would say, "I know how sad you get

And some days I still get that way, but it gets better

It gets better, it gets better. [BEGIN FADE UNDER] 

Sweetie, it gets better, I promise you…"

Teeth Logsdon-Wallace: Music was like a good coping mechanism after my suicide attempt. And it was something that kind of really just like helped me with like all the feelings and stuff. And was one of the songs I listened to a lot during music therapy at the program I was at. And so I wrote like a shorter version of that in my slideshow, on my school computer and Gaggle, like, just didn't get any of the context, just the word suicide and like flagged that. And then of course both me, my mom, when we found out the software was on there, were understandably shocked and horrified because that was not something that I meant for anyone besides that one teacher to see because I decided to trust him and tell him this thing. And that was for him to see, only. It felt like a huge violation of privacy because I was just trying to share a vulnerable moment with one specific person. And then a whole bunch of people who I didn't even know could see it, saw it. And it just felt so disgusting. 

Jennifer: Even though he says his generation knows their public data is being collected and tracked online… this feels different. 

Teeth Logsdon-Wallace: Students deserve privacy. If someone is trusting a specific teacher with something they're saying, that does not mean they're trusting this third party company. This does not mean they're trusting the entire school administration.

Teeth Logsdon-Wallace: When it comes to like direct messages to friends and like messaging each other in like what's supposed to be private spaces, that's something where I feel like generally my generation, we feel like it's like we should at least be able to have these private communications that stay private and are not seen. 

Jennifer: And the Minneapolis school district he attends just announced plans to end its relationship with Gaggle, citing budget cuts.

For his part, Gaggle founder Jeff Patterson says it’s ultimately the schools that decide what words and behaviors are not okay… and how to respond.

And these things change with time. For example, he recalls debates in the 1990s about whether words such as “crap” should be flagged by the software… because schools didn’t want kids to use even mild swear words... but these days, he says, most schools choose not to receive alerts even about much stronger language.

Schools also decide how alerts are shared and who can see them… and he says it’s the role of educators to communicate to students and parents which tools they decide to use… and how they choose to use them. 

Finally, he says the purpose of Gaggle is to prevent student tragedies. He doesn’t recommend schools use it as a disciplinary tool… but that’s not up to him.

[music]

Jennifer: You can find links to our reporting in the show notes... and you can support our journalism by going to tech review dot com slash subscribe.

We’ll be back… right after this.

[midroll] 

Jennifer: Schools are responsible for protecting kids from harmful content online while they’re getting an education… but it’s not as straightforward as it might seem. 

Who should decide what words are harmful?

And even before remote schooling became a thing, kids' devices and online homework made the line between home and school pretty blurry.

For many children, a school-issued device may be the only one they have… which, during a pandemic lockdown, provided a communication lifeline with family and friends… but also meant even the most personal messages could be scanned and flagged by school software—like the software we’ve been talking about, from Gaggle.

The company says it can respond to a case of suicide or self-harm within 17 minutes. 

Mark Keierleber: You know, one could say, wow, that was a really great intervention, right? For somebody who may be considering taking their life by suicide, getting them that help that intervention could save their life. But for Teeth, that's not the feeling that he had at all. He had no idea that Gaggle existed. 

Jennifer: That’s journalist Mark Keierleber, speaking about the student we met earlier.

Mark Keierleber: He had no idea that the materials that he was posting in his Google Drive were being sifted by an algorithm, that content moderators, paid as little as $10 an hour, who he didn't know, were reading over his shoulder and were frankly telling his mom on him, right, about something that he had written. And so for Teeth this really caused a major trust issue with his school. 

Jennifer: These systems work by flagging key words… but what words should be flagged… and who should be responsible for picking them?  

Mark Keierleber: So one of the issues that has received a lot of pushback is the fact that Gaggle’s dictionary of keywords includes the words gay and lesbian. And when I asked Gaggle, well, why does your dictionary include those words? They say, well, LGBTQ students are more likely to you know die by suicide. So they need that extra support to make sure that we're keeping them safe. But for a lot of folks within that community, this really, really angered them. This idea that they were being disproportionately subjected to digital surveillance. 

Jennifer: And this has real consequences… like when a Minneapolis student was outed to his parents because Gaggle’s system flagged his language.

Mark Keierleber: So that's problematic in itself. And you know I can think of all kinds of scenarios where issues like that could come up. Like where we’re talking about issues of sexual violence too, right? If a student is talking about the fact that they were a victim of a crime, who's to say that the parents weren't the perpetrators?

[music transition] 

Elizabeth Laird: These programs, they're working 24 hours a day, seven days a week. That is not how schools work. 

I am Elizabeth Laird. I'm the director of equity and civic technology at an organization called the Center for Democracy and Technology.

Jennifer: She’s been researching how people use these digital surveillance tools… and says these alerts are most likely to happen outside of school hours—when there may not be anyone to respond.

Elizabeth Laird: And so you sort of have two choices. One is, um, there's no response. Or some schools have made the decision to have those alerts shared outside of the school system with organizations that may be working all the time, and that can include law enforcement.

Elizabeth Laird: The way that this is set up, it can really vary in terms of who decides, and then how much training do they actually have to respond to something and in a way that is in the interest of the child, I think, is a really important question that we just haven't talked about enough. And honestly, like some of that came out after our research. We didn't know that there were places that had alerts set up to automatically go to law enforcement. And from our conversation with parents, they don't think that's even possible, much less actually happening. 

Jennifer: As is often the case… laws that deal with this predate the internet, and just don’t work that well.

Elizabeth Laird: The main federal law that protects student privacy is quite old. It's called The Family Educational Rights and Privacy Act and it's from 1974. So just think about the state of data and technology in 1974. Like there's no internet. We're talking about protecting records that are in a filing cabinet and, you know, locking the key like that was the mindset in 1974 around what it meant to keep things private. 

Jennifer: So… basically… the responsibility of deciding what to track… what not to… how to handle all that data… and how these systems (and their human coworkers) are audited… all of that sits with the companies making the products. 

Elizabeth Laird: So I think there's another question of, of those folks. Who are they? Do they have mental health backgrounds? How are they trained? Do they know about how to keep this information private and secure? What kind of controls do the companies have in place to make sure that information isn't inadvertently shared, that they can't save local copies of it? Especially… This is information about minors. 

Jennifer: Some of this may soon change… Senators Elizabeth Warren and Ed Markey recently called for clarification on how schools should monitor students’ activities online, saying the widespread use of digital surveillance tools may threaten civil rights, in addition to privacy and security. 

A joint report from the senators surveyed companies whose tools saw increased use during the pandemic—including Gaggle, Bark, GoGuardian, and Securly—and revealed that none of them had conducted any kind of audit or analysis of their algorithms to see whether they disproportionately flag some students. 

The companies cited privacy as the reason… though their products monitor student activity 24 hours a day, seven days a week.

Meanwhile, some privacy advocates say the push for additional guidance from regulators… may actually lead to additional monitoring… such as of students’ social media accounts… which schools aren’t currently required to watch.

And all of this leaves us in a bit of a tricky place. 

Because who’s watching the machines watching our kids? 

[credits] 

Jennifer: This episode was produced by Anthony Green and Emma Cillekens with reporting from Mark Keierleber. It was edited by me and Michael Reilly, mixed by Garret Lang… with original music from Jacob Gorski.  

If you have an idea for a story or something you’d like to hear, please drop a note to podcasts at technology review dot com.

Thanks for listening… I’m Jennifer Strong.
