School Surveillance Earns Tech Companies Billions. Students Pay the Price.

Tech firms have outfitted classrooms with data-gathering surveillance technologies that undermine education and privacy.

A 120-degree camera in a classroom at the Mildred Avenue K-8 School in Boston, Massachusetts, on September 9, 2020.

Students in the United States are being watched. With dubious promises of greater security and enhanced learning, tech companies have outfitted classrooms across the U.S. with devices and technologies that allow for constant surveillance and data gathering. Firms such as Gaggle, Securly and Bark (to name a few) now collect data from tens of thousands of K-12 students. Despite their reach, they are not required to disclose how they use that data, or guarantee its safety from hackers and other bad actors. If that sounds like the stuff of science fiction, it’s because for a long time it was.

In their new book, Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, August 2024), Nolan Higdon and Allison Butler show how all-encompassing surveillance is now all too real, and everything from basic privacy rights to educational quality is at stake.

Higdon is a founding member of the Critical Media Literacy Conference of the Americas, a Project Censored national judge, author and lecturer at Merrill College and the Education Department at the University of California, Santa Cruz. Higdon’s areas of concentration include podcasting, digital culture, news media history, propaganda and critical media literacy.

Butler is a senior lecturer and director of the Media Literacy Certificate Program in the Department of Communication at the University of Massachusetts Amherst. She is also co-director of Mass Media Literacy, where she develops and runs training programs for teachers covering critical media literacy in K-12 schools; vice president on the board of the Media Freedom Foundation; and a spokesperson for Project Censored. Her research focuses on critical media literacy and critiques of surveillance technologies in education.

In this exclusive Truthout interview, Higdon and Butler discuss how this surveillance operates and why we should all be concerned about the widespread spying on U.S. students.

How is surveillance happening in schools, who is doing the spying, and how pervasive is it?

Nolan Higdon: Surveillance in schools is everywhere these days, and it shows up in so many different forms. From “smart” devices like TVs and cameras in classrooms, to educational platforms like Canvas or Moodle, and even through required communications like email and Zoom — surveillance is all around us. Plus, with school-issued devices like MacBooks, the monitoring doesn’t stop when students leave the classroom. Tools like Bark allow schools to keep an eye on students even after they’ve gone home. But it’s not just students being watched; instructors are being tracked and monitored, too.

A portrait of Nolan Higdon, coauthor of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, August 2024).

During the COVID-19 lockdowns, we all got used to this kind of surveillance, especially with video-conferencing platforms like Zoom, which allowed third parties to monitor classroom meetings from start to finish. What’s even more concerning is that all this data being collected isn’t just staying within the school. It’s being shared with education tech companies, government entities, data brokers, equity firms, and pretty much anyone else who can buy or access it.

Tools like Bark allow schools to keep an eye on students even after they’ve gone home … instructors are being tracked and monitored, too.

Allison Butler: Surveillance technologies in education are incredibly pervasive. Any classroom, any school, that makes use of any digital technology — so, all of them — is being surveilled in some way or another. This surveillance is conducted by giant, multibillion-dollar corporations. There is no way to avoid them — but that doesn’t mean we are without hope! As we discuss in the book, there are ways for teachers and families to push back. Yes, we are all being surveilled, and this is the way it is right now, but that does not mean it is the way it ought to be.

How is the data that is collected from students, teachers, and others within the school system used?

A portrait of Allison Butler, coauthor of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, August 2024).

Butler: A lot of the data captured works as a form of research for advertising and marketing. We’ve all probably seen this in our own web searches: I look up X topic, and within moments, I get advertisements for that topic in my social media feed, in pop-up ads and banner ads across the sites where I spend the most time. By looking up information online, in a way, I am laboring for these sites: I provide them with search words, interests and information about myself. This serves to build targeted advertising and further insulates me inside my own information bubble. The harm of this for students, and one of our areas of particular concern, is that copious amounts of data are captured from minors, often without their proactive consent.

This technology is often promoted with promises of greater security in an era in which parents, teachers and students are anxious about school shootings and violence. Has it delivered on its promise of greater safety?

Higdon: It’s unclear whether these tools are actually making schools safer. Of course, we can’t really test the “what if” scenario — what if these surveillance tools weren’t in place, would things be less safe? But what we do know, from incidents like Columbine to Uvalde, is that these tools often end up just recording events rather than preventing violence.

Butler: I would add that technologies often foment the fears and anxieties of parents, teachers and students. We’re seeing this now with debates over whether or not schools can ban cellphones. The argument against banning cellphones in schools, even when folks acknowledge how distracting they can be, rests on the belief that children are safer with them: there is ample evidence of violence and danger in our schools, and the cellphone is a tether that parents have to their children for several hours a day.

This is a particular cruelty of the tech companies: playing on parents’ and teachers’ fears about keeping children safe. We are promised a quick solution (a particular technology) to a complex problem (there is violence in our society). When we are directed to look at technology as the only solution, we are uninvited from considering other intersections or ways of thinking that might be worthwhile, such as greater work in supporting people’s mental health, or greater extracurricular options across the school day.

You write that students of color are far more likely to be surveilled than their white counterparts. Talk about that.

Higdon: Students of color have always faced more scrutiny compared to their white peers — it’s just another challenge they have to deal with. The ed-tech industry has done a good job of marketing their products as objective and bias-free, even labeling them as Diversity, Equity, and Inclusion-compliant. But in reality, these tools are only as unbiased as the people who create them. Research has also shown that these tools disproportionately label students of color as criminals and incorrectly flag LGBTQIA+ students as having mental health issues, which are sometimes viewed as warning signs for potential school shooters. Instead of promoting safety, these tools seem to be documenting violence and unfairly punishing innocent students, particularly those from the most vulnerable communities.

Cover of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, August 2024).

Butler: Generative artificial intelligence (AI) is not (yet?) sentient. GenAI tools are imbued with the knowledge and biases of their creators. This means we have to look behind the scenes at who builds these technologies and what their implicit biases might be. This is a place where a lot of blame is passed around. It’s easy to initially blame the tech companies, whose executive-level staff are often white men. The tech companies, however, blame the education system, claiming that there are not enough students of color moving through computer science majors. It’s probably a vicious combination of both: The tech industry’s toxic reputation means it is not an enticing place for women or people of color to work, so these folks might take their computer science degrees elsewhere.

While technology was in schools long before COVID, you say that it ramped up significantly when schools were shut down during the pandemic. How did that happen?

Higdon: The tech industry has done a great job of convincing us that their platforms — like social media and email — are “free.” But the truth is, they come at a cost: our privacy. These companies make money from our data, and all the content and information we share online is basically unpaid labor. So, when the COVID-19 lockdowns hit, a lot of people just assumed that using Zoom, Canvas and Moodle for online learning was a “free” alternative to in-person classes. In reality, we were giving up even more of our labor and privacy to an industry that ended up making record profits.

Butler: So many of us were so justifiably scared when COVID hit and restrictions came fast and furious (assuming we lived in states or communities that followed lockdown procedures). For those of us who did not want our teaching and learning to have too much of an interruption, we grasped on quickly and then settled in for the long haul with these technologies. I think many of us probably had misgivings about the rapid increase in technology use, and we all probably have stories about Zoom bombings, or when our dogs interrupted a meeting, or when the Wi-Fi went out unexpectedly. But we were desperate to keep our students and our classes progressing and desperate for some level of human interaction.

You are both longtime educators, and you argue that surveillance technology undermines the education process. How so?

Higdon: Education needs privacy. Learning is a messy process where people need the freedom to think out loud and make mistakes. Teachers and students both need a space where they can speak freely without fear. But when they know that what they say or do in the classroom could be taken out of context and used against them, they’re less likely to take risks. Instead of real learning, we end up with students just repeating what’s popular, which goes against the whole point of education.

Butler: Education needs trust; students and teachers deserve to build a relationship grounded in trust and mutual respect. This cannot happen when surveillance technologies, especially those that promise to catch plagiarism, are a third party in that relationship. When teachers ask students to submit work via digital platforms, such as Turnitin, this signals a lack of trust and sends the message that teachers do not believe their students will complete their work honestly. I’m not so naïve as to think that students don’t cheat; students found ways to cheat well before there were digital technologies built to stop them, and students who feel they need to complete their work dishonestly will find a way. Submitting high-stakes coursework can be stressful; add to that the burden of a teacher who does not even think the work is yours to begin with. This undermines the entire teacher-student relationship.

I don’t blame individual teachers; they are often sold a message, particularly one about convenience and time-saving, that makes the use of these technologies tempting. The pressure of trying to come up with creative, impactful coursework and getting it graded in a timely manner is real. What teachers deserve to know is that companies like Turnitin do at least two things with the work submitted: they sell or share it with advertisers and marketers, who use student language to tailor their campaigns, and they sell or share it with generative AI systems, so that the AI’s output becomes stronger. In this way, teachers and students are working for Turnitin but definitely are not getting paid!

So much of our current technology works to divide and isolate, yes. But what if we countered that by working and learning together?

Much of this technology is owned and promoted by some of the most powerful corporations in the world. How can students, parents, teachers, and anyone else who is concerned about its use fight back?

Higdon: There’s a lot we can do to address this. First off, we all need to reject the idea that these tools are “free” and start explaining the real costs to people. We also need to push back against the common argument, “I’m not doing anything wrong, so why should I care if they’re watching me?” There are plenty of reasons to care. If you’re part of a vulnerable community — like those with contested immigration status — or if you’re a victim of stalking, you absolutely need privacy. Your data can be used against you, for example to raise your insurance premiums, or taken out of context, as when sarcasm is used to deny you a job or admission to a school. And let’s not forget that data breaches happen all the time, which could lead to identity theft or other personal information becoming public.

Teachers should use their collective power, like through union bargaining, to work with students and parents to demand privacy and even seek compensation for the data that’s taken from them. For example, if a platform like Canvas is making billions off of a teacher’s data, that teacher should receive a quarterly stipend. It’s also crucial for teachers to educate students about these issues so they understand how third parties are tracking their lives. Students should feel empowered to ask their teachers why they’re using these tools and to protest their school’s use of them. Schools should also explore proprietary software options for situations where technology can genuinely improve learning without compromising privacy.

Butler: Students and teachers can work together on this. They can ask some pretty simple questions of some of the technologies that they are mandated or strongly encouraged to use: Who owns the technology? What is their business model? What do they say they will do with the data? Why might this be valuable? In other words, don’t just accept the technology as-is, and if an administrator says that it’s mandated, ask why.

Oftentimes, administrators in K-12 and higher education are presented with a narrative that some particular technology, hardware or software, will solve a known problem, or solve a problem the administrators did not even know they had. If administrators, teachers, students and, of course, librarians can work together, the school can function as a whole community, thoroughly informed about what is in its classrooms. So much of our current technology works to divide and isolate, yes. But what if we countered that by working and learning together?