Tim Cushing | TechDirt
Schools are spending more and more money on safety. Or, at least, that's what they're saying they're spending the money on. But simply adding more layers of surveillance -- on campus and off -- isn't really doing much to make schools safer.
Preventing violence at schools is a noble goal, but many of the solutions are just repackaged law enforcement products that treat campuses as high-crime areas. The latest developments being pitched to schools aren't innovative. They're just another way to help administrators view students as persons of interest in crimes yet to be determined.
This might not be as problematic if the tools worked well. But they simply don't. ProPublica took some repurposed ShotSpotter-esque tech for a spin, only to find out it doesn't work as advertised. The idea is the microphones will pick up aggressive… um… noises and head off the next school shooter. Great in theory. In practice?
Ariella Russcol specializes in drama at the Frank Sinatra School of the Arts in Queens, New York, and the senior’s performance on this April afternoon didn’t disappoint. While the library is normally the quietest room in the school, her ear-piercing screams sounded more like a horror movie than study hall. But they weren’t enough to set off a small microphone in the ceiling that was supposed to detect aggression.
A few days later, at the Staples Pathways Academy in Westport, Connecticut, junior Sami D’Anna inadvertently triggered the same device with a less spooky sound — a coughing fit from a lingering chest cold. As she hacked and rasped, a message popped up on its web interface: “StressedVoice detected.”
“There we go,” D’Anna said with amusement, looking at the screen. “There’s my coughs.”
Good to know a student's coughing fit may find them staring down the business end of whatever weapon the school resource officer chooses to "reasonably" deploy in the face of certain danger. AggressionSpotter doesn't seem to be up to the task of providing school staff with a heads-up. The company behind the tech -- Louroe Electronics -- pitches this to law enforcement and school administrators as the tool they need to head off "antagonistic individuals" before their aggression escalates into something that injures or kills students.
The problem is that none of the products ProPublica tested can actually do what it says on the tin. When lives are on the line, aggression detectors are being set off by recordings of famous people with grating voices.
Our research found that [Sound Intelligence's aggression detector] tends to equate aggression with rough, strained noises in a relatively high pitch, like D’Anna’s coughing. A 1994 YouTube clip of abrasive-sounding comedian Gilbert Gottfried (“Is it hot in here or am I crazy?”) set off the detector, which analyzes sound but doesn’t take words or meaning into account.
So much for the future. This is the AI used by Louroe, which can be set off by Gilbert Gottfried bits while ignoring students screaming.
Louroe's attempt to win back some points by touting its privacy protections falls flat as well. The company claims the software only captures aggressive "sound patterns." What the company calls "sound patterns" is actually students speaking. And those recordings can be stored indefinitely by administrators, allowing them to replay conversations overheard by Louroe's mics.
The company behind the software at least seems a bit more pragmatic. Sound Intelligence CEO Derek van der Vorst acknowledged that the tech will generate lots of false positives and probably won't prevent a "crazy loony" from shooting up a school. Still, he insists it's better safe than sorry, even when "sorry" seems the more likely outcome.
As is to be expected, this growth industry began as cop tech. Sound Intelligence's first deployments were by European law enforcement agencies.
It tested an early model in a Dutch “pub district,” according to a 2007 study co-authored by a company researcher. Microphones were placed in 11 locations in inner-city Groningen, and the detector’s findings were compared with police reports of aggressive behavior. The results were “so impressive,” the study reported, that the device was considered “indispensable” by several Dutch police departments, the Dutch railway company and two prisons.
If you're looking for analogies, Sound Intelligence has provided one for you. The difference between schools and prisons is the bell at the end of the day. Schools are loading up with surveillance tech to keep an eye on troublesome inmates, using toys battle-tested on literal prisoners.
Settings can be tweaked, but plenty of early adopters are finding the presets useless. ProPublica reports schools have had the aggression detector triggered by slammed locker doors and loud birthday wishes.
Deployment in other venues has been just as error-prone, making one wonder how low Dutch law enforcement sets the bar for "indispensable."
The software has been less effective at The Valley Hospital in Ridgewood, New Jersey. Daniel Coss, security chief for the hospital’s health system, said he’s phasing out the detector after a three-year, $22,000 pilot program. The devices — placed in public, “high risk” areas — had been set off by patients’ loud voices and cafeteria workers slamming cash registers closed. Once the detector was tweaked to be less sensitive, it ignored an agitated man who was screaming and pounding on a desk. The situation escalated until six security officers responded.
This is what educators are relying on to keep kids safe and, hopefully, head off the next mass shooting. The tech simply isn't up to it. While schools continue to beta test software and hardware using tax dollars, students are being inured to round-the-clock surveillance. The excuse is "safety," as it always is when surveillance programs ramp up. But there's no trade-off to be made here, not when the tech can't pinpoint aggression and fails to recognize clear signals like screaming as signs of imminent danger.
I understand no school administrator wants to feel like they didn't try everything they could when the worst case scenario occurs, but using students as guinea pigs for unproven tech originally developed for prisons isn't the answer.