Mac Slavo | SHTFPlan
Several California police departments have stopped using “predictive policing” software but not because they want to protect your basic human rights and dignity. The reason the software won’t be used is that it isn’t effective enough…yet.
Several police departments were disappointed that they won’t be able to punish thought crimes. Palo Alto police spokeswoman Janine De la Vega told the Los Angeles Times: “We didn’t get any value out of it,” after the department used the software for three years. Mountain View, California’s police department spent over $60,000 of taxpayer money on software designed to violate the rights of the very taxpayers who funded it, only to dub it a “disappointment.” That department used the software for over five years before dropping the program last June. Rio Rancho, New Mexico Police Captain Andrew Rodriguez said: “It wasn’t telling us anything we didn’t know.”
The software is a bold violation of the basic human right to free thought, and those who use it are nothing more than freedom-trampling tyrants. The Los Angeles Police Department took an authoritarian leap in 2010 when it became one of the first to employ data technology and information about past crimes to predict future unlawful activity. The software, called PredPol, is a “predictive policing” tool developed by a University of California, Los Angeles professor in partnership with the Los Angeles Police Department.
The good news is that it doesn’t work, so we’ve bought a few more years. The bad news is that there will likely be an ongoing effort to make Big Brother the Orwellian reality for all of us…as if we aren’t tracked, surveilled, and monitored enough.
The LAPD itself was forced to admit following an internal audit that after eight years, there was “insufficient data” to show PredPol to be effective in reducing crime. This was largely due to massive inconsistencies in oversight, criteria, and program implementation.
In April, the department shelved another Orwellian program, which was found to be using “inconsistent criteria” to label people as future violent criminals. Last August, after a lawsuit from privacy and civil liberties groups forced the department to cough up its PredPol records, the LAPD discontinued another dystopian part of the program that picked out a list of “chronic offenders” every shift based on alleged gang membership, previous arrests, and one “point” for every “quality police contact.” –RT
Data or no data, LAPD Chief Michel Moore doesn’t want to let PredPol go, claiming it is more accurate than human analysts at predicting where criminals will strike next. But even his defense of the program is a far cry from early publicity materials that trumpeted “‘cliff-like’ drops in crime often within months of deployment” among PredPol’s early adopters.