In case after case, students have reached out for help through school-issued devices, prompting AI-powered surveillance software to alert Vancouver Public Schools staff in Washington state. Districts across the country use this technology to monitor students around the clock for potential threats amid a student mental health crisis and fears of school shootings. While the goal is student safety, these tools raise significant privacy and security concerns. Seattle Times and Associated Press reporters recently obtained nearly 3,500 unredacted student documents revealing personal struggles, including depression, suicide, addiction, and bullying.
The Education Reporting Collaborative, consisting of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance in schools. Member newsrooms include AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
The documents show that students use school-issued devices for far more than schoolwork, underscoring concerns about their privacy. While the software has helped counselors identify students in need of support, it has at times compromised students’ privacy and trust. Gaggle Safety Management, GoGuardian, and Securly are among the companies offering AI-assisted web surveillance tools to help schools keep students safe, and more than 1,500 districts nationwide use such software.
Vancouver schools faced scrutiny after inadvertently releasing sensitive student documents, and the district apologized. Still, school officials stress that the surveillance tools are essential to student well-being and safety. Andy Meyer, principal of Vancouver’s Skyview High School, pointed to the positive impact of intervening when students are in need and underscored the district’s commitment to protecting students above all else.
Foster, a parent in the district, voiced concerns about privacy violations while emphasizing the importance of keeping students safe in schools. After learning that the district had inadvertently released records, she questioned her options and worried that her daughter’s private information had been compromised. Despite those concerns, Foster acknowledged the need for safety measures to prevent school shootings or suicides.
Gaggle monitors students’ online activity with a machine-learning algorithm that scans school-issued devices, as well as personal devices when students are logged into their school accounts. The service costs the district approximately $328,036 over three school years, roughly the expense of hiring one additional counselor. The algorithm scans for indicators of bullying, self-harm, suicide, or violence and flags potential concerns for human reviewers. For serious threats, the school is notified, and law enforcement may be contacted for immediate intervention.
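The flow described above, an automated scan followed by human review and escalation, can be sketched as a simple pipeline. This is purely illustrative: Gaggle’s actual system is proprietary, and every keyword list, category, and function name below is a hypothetical placeholder.

```python
# Illustrative sketch of a scan -> human review -> escalation pipeline.
# NOT Gaggle's implementation; all categories and phrases are hypothetical.

CONCERN_KEYWORDS = {
    "self_harm": ["hurt myself", "cutting"],
    "suicide": ["kill myself", "end it all"],
    "violence": ["bring a gun", "hurt them"],
    "bullying": ["everyone hates you"],
}

def scan_document(text: str) -> list[str]:
    """Return the concern categories whose keywords appear in the text."""
    lowered = text.lower()
    return [cat for cat, phrases in CONCERN_KEYWORDS.items()
            if any(phrase in lowered for phrase in phrases)]

def route_alert(categories: list[str], human_confirms_threat: bool) -> str:
    """Decide what happens after the automated scan flags a document."""
    if not categories:
        return "no_action"
    if not human_confirms_threat:
        return "dismissed_by_reviewer"           # false alarm filtered out
    if "suicide" in categories or "violence" in categories:
        return "notify_school_and_authorities"   # imminent-risk path
    return "notify_school_staff"

print(route_alert(scan_document("I want to end it all"), True))
# -> notify_school_and_authorities
```

A real system would rely on a trained classifier rather than keyword matching, but the two-stage shape, with a cheap automated filter in front of human judgment, matches how the reporting describes the process.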
While the system has identified students in genuine crisis, it has also triggered false alarms over harmless content, such as student essays or casual conversations between friends. Even so, school officials argue that prompt follow-up on every alert is essential to address potential issues and build relationships with students.
Following inadvertent disclosures of flagged content, Gaggle updated its system so that screenshots expire after 72 hours, after which a login is required to view them. The change aims to balance privacy with the need for emergency contacts to respond quickly. Even with this safeguard, questions remain about the trade-off between safety and privacy in student surveillance programs.
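The 72-hour rule amounts to a simple time-based access check. The sketch below is a hypothetical illustration of that policy, not Gaggle’s code; the function and variable names are assumptions.

```python
# Illustrative sketch of time-limited, login-free screenshot access
# (hypothetical; not Gaggle's actual code). Within 72 hours of creation
# the link opens directly; afterward a login is required.

from datetime import datetime, timedelta, timezone

UNAUTH_WINDOW = timedelta(hours=72)

def can_view_without_login(created_at: datetime, now: datetime) -> bool:
    """Emergency contacts may open the screenshot link without logging in
    only during the first 72 hours; after that, a login is required."""
    return now - created_at < UNAUTH_WINDOW

created = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(can_view_without_login(created, created + timedelta(hours=24)))  # True
print(can_view_without_login(created, created + timedelta(hours=96)))  # False
```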
Ryn, a student, was part of a group in which roughly 1 in 4 students had communications that triggered a Gaggle alert. Even as schools continue to adopt surveillance technology, its long-term effect on student safety remains uncertain: no independent research has demonstrated a measurable reduction in student suicides or violence. A 2023 RAND study found minimal evidence of either benefits or risks from AI surveillance, concluding that no comprehensive research has examined how these programs affect youth suicide prevention.
Benjamin Boudreaux, an AI ethics researcher and co-author of the RAND report, noted that issuing more alerts without adequate mental health support is unlikely to improve suicide prevention. LGBTQ+ students are particularly at risk: in Vancouver schools, students writing about their LGBTQ+ identities were potentially exposed to school officials, compromising their privacy.
These students, who are more prone to depression and suicidal thoughts, often rely on online spaces as a vital source of support. Concerns about breached confidentiality arose when a Vancouver high school student was exposed after sharing personal struggles. Similar problems emerged in North Carolina’s Durham Public Schools during a pilot of Gaggle, where alerts led to unintended disclosures and breaches of trust.
After students were outed and their privacy violated, the Durham Board of Education discontinued Gaggle’s use in 2023, citing the potential harm it posed to students. The broader debate over privacy and security in schools leaves many parents uninformed, with limited avenues to opt out even when they learn of the surveillance. These complexities underscore the importance of transparency and informed consent in school surveillance programs.
The school district denied Reiland’s request to opt out, causing concern. When Reiland’s daughter, Zoe, learned about Gaggle, she was so unnerved that she stopped searching for personal information on her Chromebook, including questions about her menstrual cycle, for fear of repercussions. She said she felt too afraid to be curious.
Although school officials believe Gaggle has saved lives, they do not track metrics to measure its effectiveness, and they acknowledge that technology alone cannot keep every student safe. Tragically, in 2024, Nex Benedict, a nonbinary teenager, died by suicide after relentless bullying at Owasso High School. An investigation by the U.S. Department of Education Office for Civil Rights found that the district had shown “deliberate indifference” to reports of sexual harassment and bullying.
Despite nearly 1,000 Gaggle alerts during the 2023-24 school year, including reports of harassment and suicide, the bullying persisted. Asked about the ongoing problem, Russell Thornton, the district’s executive director of technology, said surveillance technology is just one tool and cannot solve such complex issues on its own.
While surveillance technology can enable timely intervention and help prevent tragedies, it also raises privacy concerns and narrows the safe spaces teenagers need online for self-exploration. Boudreaux, the AI ethics researcher, stressed the importance of letting kids develop a private life and work through challenges without constant adult surveillance.
Gaggle founder Patterson cautioned against unlimited self-exploration on school-issued devices, noting that the school could be held accountable if a student makes a threatening statement. School systems, he said, have limits as a platform for unrestricted expression.
The Associated Press’ education coverage is supported by various private foundations, with AP maintaining full editorial control. Additional information on AP’s standards for collaboration with philanthropies and a list of supporters can be found on AP.org.