Washington — For New York teacher Michael Flanagan, the pandemic was a crash course in new technology — rushing out laptops to stay-at-home students and shifting hectic school life online.
Students are long since back in school, but the technology has lived on, and with it has come a new generation of apps that monitor pupils online, sometimes around the clock, even on days off spent with family and friends at home.
The programs scan students’ online activity, social media posts and more — aiming to keep them focused, detect mental health problems and flag up any potential for violence.
‘You can’t unring the bell,’ said Flanagan, who teaches social studies and economics. ‘Everybody has a device.’
The new trend for tracking, however, has raised fears that some of the apps may target minority pupils, while others have outed LGBT+ students without their consent, and many are used to instill discipline as much as deliver care.
On this, Flanagan has parted ways with many of his colleagues: he won’t use such apps to monitor his students online.
He recalled seeing a demo of one such program, GoGuardian, in which a teacher showed — in real time — what one student was doing on his computer. The child was at home, on a day off.
Such scrutiny raised a big red flag for Flanagan.
‘I have a school-issued device, and I know that there’s no expectation of privacy. But I’m a grown man — these kids don’t know that,’ he said.
A New York City Department of Education spokesperson said that the use of GoGuardian Teacher ‘is only for teachers to see what’s on the student’s screen in the moment, provide refocusing prompts, and limit access to inappropriate content.’
Valued at more than $1 billion, GoGuardian — one of a handful of high-profile apps in the market — is now monitoring more than 22 million students, including in the New York City, Chicago and Los Angeles public systems.
Globally, the education technology sector is expected to grow by $133 billion from 2021 to 2026, market researcher Technavio said last year.
Parents expect schools to keep children safe in classrooms or on field trips, and schools also ‘have a responsibility to keep students safe in digital spaces and on school-issued devices,’ GoGuardian said in a statement.
The company says it ‘provides educators with the ability to protect students from harmful or explicit content’.
Nowadays, online monitoring ‘is just part of the school environment,’ said Jamie Gorosh, policy counsel with the Future of Privacy Forum, a watchdog group.
And even as schools move beyond the pandemic, ‘it doesn’t look like we’re going back,’ she said.
Guns and depression
A key priority for monitoring is to keep students engaged in their academic work, but it also taps into fast-rising concerns over school violence and children’s mental health, which medical groups in 2021 termed a national emergency.
According to federal data released this month, 82% of schools now train staff on how to spot mental health problems, up from 60% in 2018; 65% have confidential threat-reporting systems, up 15% in the same period.
In a survey last year by the nonprofit Center for Democracy and Technology (CDT), 89% of teachers reported their schools were monitoring student online activity.
Yet it is not clear that the software creates safer schools.
Gorosh cited May’s shooting in Uvalde, Texas, which left 21 dead at a school that had invested heavily in monitoring technology.
Some worry the tracking apps could actively cause harm.
The CDT report, for instance, found that while administrators overwhelmingly say the purpose of monitoring software is student safety, ‘it’s being used far more commonly for disciplinary purposes ... and we’re seeing a discrepancy falling along racial lines,’ said Elizabeth Laird, director of CDT’s Equity in Civic Technology program.
The programs’ use of artificial intelligence to scan for keywords has also outed LGBT+ students without their consent, she said, noting that 29% of students who identify as LGBT+ said they or someone they knew had experienced this.
And more than a third of teachers said their schools send alerts automatically to law enforcement outside school hours.
‘The stated purpose is to keep students safe, and here we have set up a system that is routinizing law enforcement access to this information and finding reasons for them to go into students’ homes,’ Laird said.
A report by federal lawmakers last year into four companies making student monitoring software found that none had made efforts to see if the programs disproportionately targeted marginalized students.
‘Students should not be surveilled on the same platforms they use for their schooling,’ Senator Ed Markey of Massachusetts, one of the report’s co-authors, told the Thomson Reuters Foundation in a statement.
‘As school districts work to incorporate technology in the classroom, we must ensure children and teenagers are not preyed upon by a web of targeted advertising or intrusive monitoring of any kind.’
The Department of Education has committed to releasing guidelines around the use of AI early this year.
A spokesperson said the agency was ‘committed to protecting the civil rights of all students.’
Aside from the ethical questions around spying on children, many parents are frustrated by the lack of transparency.
‘We need more clarity on whether data is being collected, especially sensitive data. You should have at least notification, and probably consent,’ said Cassie Creswell, head of Illinois Families for Public Schools, an advocacy group.
Creswell, who has a daughter in a Chicago public school, said several parents have been sent alerts about their children’s online searches, despite not having been asked or told about the monitoring in the first place.
Another child had faced repeated warnings not to play a particular game — even though the student was playing it at home on the family computer, she said.
Creswell and others acknowledge that the issues monitoring aims to address — bullying, depression, violence — are real and need tackling, but question whether technology is the answer.
‘If we’re talking about self-harm monitoring, is this the best way to approach the issue?’ said Gorosh.
Pointing to evidence suggesting AI is imperfect at capturing warning signs, she said increased funding for school counselors could be a response more narrowly tailored to the problem.
‘There are huge concerns,’ she said. ‘But maybe technology isn’t the first step to answer some of those issues.’