Police, social services, and health workers in Canada are using shared databases to track the behaviour of vulnerable people—including minors and people experiencing homelessness—with little oversight and often without consent.
Documents obtained by Motherboard from Ontario’s Ministry of Community Safety and Correctional Services (MCSCS) through an access to information request show that at least two provinces—Ontario and Saskatchewan—maintain a “Risk-driven Tracking Database” that is used to amass highly sensitive information about people’s lives. Information in the database includes whether a person uses drugs, has been the victim of an assault, or lives in a “negative neighborhood.”
The Risk-driven Tracking Database (RTD) is part of a collaborative approach to policing called the Hub model that partners cops, school staff, social workers, health care workers, and the provincial government.
Information about people believed to be “at risk” of becoming criminals or victims of harm is shared between civilian agencies and police and is added to the database when a person is being evaluated for a rapid intervention intended to lower their risk levels. Interventions can range from a door knock and a chat to forced hospitalization or arrest.
Data from the RTD is analyzed to identify trends—for example, a spike in drug use in a particular area—with the goal of producing planning data to deploy resources effectively, and create “community profiles” that could accelerate interventions under the Hub model, according to a 2015 Public Safety Canada report.
Saskatchewan and Ontario officials say data in the RTD (sometimes called the “Hub database” in Saskatchewan) is “de-identified” by removing details such as people’s names and birthdates, though experts Motherboard spoke to said that scrubbing data so thoroughly that it can never be used to identify an individual is difficult, if not impossible.
A Motherboard investigation—which involved combing through MCSCS, police, and city documents—found that in 2017, children aged 12 to 17 were the most prevalent age group added to the database in several Ontario regions, and that some interventions were performed without consent. In some cases, children as young as six years old have been subject to intervention.
How does people’s information get added to the database?
The Hub model seeks to connect police with community members in order to evaluate potentially at-risk people for interventions.
For example, a police officer may be called to respond to someone’s disruptive but non-criminal behaviour time and again. Under the Hub model, the officer can bring the person’s situation to the Hub—which may include staff from child welfare, addictions, or housing assistance agencies—and ask if other agencies can intervene.
During the ensuing evaluation, information about the person is shared between the participants and entered into the RTD. The person’s identity can be known to local law enforcement, social workers, and health workers, but when their information is added to the RTD, details that might identify the person are not supposed to be included. If agencies collectively decide the person is at an “acutely elevated” level of risk, an intervention is deployed. Interventions can occur without consent if Hub practitioners feel a person is at a high risk of harm.
More than 100 Hubs are now operating in cities and towns across Canada and the US, with 37 in Ontario (where Hubs are usually called Situation Tables) contributing to the Risk-driven Tracking Database as of April 2018, according to MCSCS documents. In total, 55 are expected to be contributing by the end of this year.
Dr. Chad Nilson, an academic researcher with the University of Saskatchewan and the lead developer of the RTD, did not respond to questions about the database. His professional bio says the RTD is in use “across Canada.”
Lisa Longworth, an Ontario Provincial Police (OPP) program analyst who trains police in the Hub model, told Motherboard that the interventions are a “tool” for police that allows them to take action in situations where they previously would have been powerless to intervene.
“We can knock on someone’s door and say, ‘We’re so worried about you, can we come in and chat?’” Longworth told Motherboard in a phone call.
Reviews of Situation Tables in two Ontario cities commissioned by police and community partners show that some interventions have ended in forced hospitalization or arrest. A survey of people’s experiences of Hub intervention in Barrie, for example, found that one intervention ended with the individual going to jail. An assessment of the Situation Table in Waterloo noted an instance of a person being subject to involuntary hospitalization following an intervention.
According to MCSCS documents, in 2017 more than 300 RTD-related discussions for kids between the ages of 12 and 17 took place, and 30 for kids aged six to 11. Ontario’s annual RTD report for 2017 notes that kids aged 12 to 17 are the “most vulnerable” age group in the database, and the most prevalent in the RTD in several regions.
“It does happen with children that young,” said Longworth, noting that police have had to come up with ideas for “creative interventions for six-year-old[s].”
In response to questions about the high number of minors being evaluated by Situation Tables, MCSCS provided a statement by email, saying, “Situation Tables are locally driven and there is no requirement for communities to report to the ministry” about their activities.
Longworth said that Situation Tables “aren’t formalized” and are “not perfect,” a characterization she says reflects the haphazard way in which the model was rolled out in the province.
“Situation Tables spread in Ontario really quickly,” said Longworth. “[MCSCS] has been playing catch-up.”
What’s in the database?
Hub interventions require cops, educators, doctors, and social workers to share extremely sensitive information about vulnerable people—and add it to the provincial database—in a process fraught with potential for privacy violations.
Data in the RTD may include a person’s age group, sex, location, and more than 100 “risk factors” used to describe individual circumstances. Standardized risk factors allow for national comparisons, according to provincial authorities.
According to MCSCS documents, the most common risk factors ascribed to people in the database in 2017 were mental health (including “suspected” mental health issues), criminal involvement, drug use, and “antisocial/negative behaviour,” defined as “obnoxious [or] disruptive” behaviour.
Asked about Situation Tables’ use of sometimes-vague risk factors to justify intervention, MCSCS reiterated that individuals deemed to be at high risk of harm exhibit “multiple risk factors” and require a collaborative approach to intervention.
Because some of the information-sharing process before intervention is done verbally, Ontario’s Information and Privacy Commissioner (IPC) developed additional guidelines for Hubs intended to ensure people’s privacy rights are respected—getting individuals’ consent to share information whenever possible, for example, and acting in a way that “more positively than negatively” affects them.
But in Ontario, police, health care professionals, and social workers participating in the Hub model are under no obligation to follow IPC’s guidelines, though they are still beholden to provincial privacy laws.
“Communities are not required to apply the best practices included in the [IPC] guidance document,” stated an MCSCS briefing from an internal meeting held in June 2018.
However, Brian Beamish, Ontario’s IPC commissioner, told Motherboard in an email that “Situation Tables that deviate from the IPC’s guidance risk breaching [individuals’] privacy.”
When asked about the briefing by Motherboard, MCSCS did not comment, but a spokesperson said in an email that the ministry “strongly encourages” Situation Table participants to follow the IPC guidelines.
Brenda McPhail, director of the Privacy, Technology, and Surveillance Project for the Canadian Civil Liberties Association (CCLA), told Motherboard that MCSCS saying Situation Tables are not required to follow IPC information-sharing guidelines is “shocking.”
“There are concerns about this kind of surveillance that go beyond privacy, [affecting] people’s basic rights to liberty and security of the person,” McPhail told Motherboard in a phone call, adding that authorities may need to re-examine the use of the Hub model.
In 2014, Saskatchewan’s IPC completed an investigation into potential privacy violations related to Hubs in that province. It found “deficiencies” in the model’s privacy protections, noting that some Hub databases contained personally identifiable information.
The report also found that people targeted by Hubs were not informed of how to file privacy complaints, and that Hub agencies had scoured Facebook posts as a source of information regarding a person’s risk levels on at least two occasions.
The Saskatchewan IPC told Motherboard in 2017 that corrective measures had been taken, such as destroying personally identifiable data and ensuring that Hubs no longer use Facebook data to assess individuals’ risk levels, and that the IPC was satisfied with these steps.
One Ontario social worker, who participates in a Situation Table and spoke to Motherboard on condition of anonymity because they were not authorized to speak to the media, expressed worry about the high number of minors being evaluated for intervention and having their information added to the RTD, given “how often [interventions] are done without consent.”
While the worker has seen some individuals benefit from Situation Table interventions (“usually homeless people with severe addictions”), they believe that consent is vital.
“Without consent, intervention won’t be successful,” the worker said, noting that “around 50 percent” of the Situation Table interventions they’ve been involved in were undertaken without the person’s consent, and that many non-consensual interventions involved people with addictions.
Though authorities claim that the data is “de-identified,” the 100-plus risk factors used in the RTD may nonetheless describe a person’s life in intimate detail, noting if they drink alcohol, have trouble finding stable housing, skip school, are unemployed, or associate with “negative peers.”
The database also includes 51 options for “protective factors”—such as if a person has a loving family, or a positive relationship with law enforcement—that are used to counterbalance risk factors.
Tamir Israel, a lawyer with the Canadian Internet Policy and Public Interest Clinic (CIPPIC), told Motherboard that depending on how authorities use the database, there may be a risk that vulnerable people could be identified from the data held in the RTD.
Even without individuals’ names and addresses in the database, if authorities keep a lot of detailed information about risk factors, there’s a “chance that someone who grabs the database without authorization [could] cross reference the profiles in the database with other information sources (i.e., the news) in order to re-identify people,” Israel said in an email.
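Israel’s point can be illustrated with a toy sketch. Assuming a minimal record layout (the field names and values below are hypothetical stand-ins, not the RTD’s actual schema), even three coarse attributes plus a single risk factor gleaned from an outside source can narrow a “de-identified” dataset down to one record:

```python
# Illustrative sketch only: the schema (age_group, sex, region, risk_factors)
# is invented for this example; the RTD's real structure is not public.

# A toy "de-identified" extract: names removed, quasi-identifiers retained.
deidentified_records = [
    {"record_id": 101, "age_group": "12-17", "sex": "F", "region": "Barrie",
     "risk_factors": {"housing", "truancy"}},
    {"record_id": 102, "age_group": "18-24", "sex": "M", "region": "Waterloo",
     "risk_factors": {"drug use"}},
    {"record_id": 103, "age_group": "12-17", "sex": "F", "region": "Barrie",
     "risk_factors": {"drug use", "negative peers"}},
]

# Details learned about one person from an outside source (e.g. a news story).
known_from_news = {"age_group": "12-17", "sex": "F", "region": "Barrie",
                   "known_factors": {"housing"}}

# Cross-reference: keep only records consistent with the outside details.
matches = [
    r for r in deidentified_records
    if r["age_group"] == known_from_news["age_group"]
    and r["sex"] == known_from_news["sex"]
    and r["region"] == known_from_news["region"]
    and known_from_news["known_factors"] <= r["risk_factors"]  # subset test
]

if len(matches) == 1:
    # A unique match ties the "anonymous" record to a real person.
    print(f"Re-identified record: {matches[0]['record_id']}")
```

With real data, the 100-plus risk factors would make records far more distinctive than in this three-record toy, which is precisely why detailed attribute sets undermine de-identification.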
Predictive policing concerns
Particularly concerning for privacy advocates is the possibility that the RTD is being used for the purposes of predictive policing—a controversial strategy that employs data analysis to identify hot spots for crime.
A report produced by Public Safety Canada in 2015 notes that data gathered during Hub discussions can be used to help “identify and plan predictive risk patterns at local, regional, and provincial levels.” The information in the RTD can help to accelerate interventions in some communities, the report states.
“Situation Tables are a form of predictive policing, and are being used to feed databases that may facilitate other kinds of predictive policing,” McPhail told Motherboard.
According to Israel, there is a risk that if authorities don’t refine the data in the RTD over time to determine if a specific risk factor actually leads to increased crime or harm, flawed predictions could influence policing decisions.
Israel said that if there are “geographical indicators” in the RTD, they could allow people’s risk factors to be tied to areas that could then be classified as high or low crime, “with an eye to more efficiently allocating police resources.” If the underlying data is faulty, Israel said it could lead to a “higher and more suspicious police presence in neighbourhoods that are predominantly populated by visible minorities.”
It’s unclear what algorithm, if any, Canadian authorities use to analyze the RTD, but automated tools come with risks as well. Experts have pointed out that the algorithm underpinning PredPol, one of the most widely used predictive policing technologies, is fundamentally flawed in a way that can contribute to over-policing, particularly for marginalized communities.
The RTD has already inspired at least one predictive policing initiative, the Saskatoon Police Predictive Analytics Lab (SPPAL), considered to be Canada’s most advanced predictive policing program. One report states that the SPPAL “expands on the Hub model’s risk tracking system.”
Academic research has explored how the Hub model represents an “extension of police control” in society. A research paper published last month by Carrie B. Sanders, a criminologist at Wilfrid Laurier University, concluded that the Hub model is deeply influenced by “traditional policing practices” and that without sufficient oversight, Hubs “can evade democratic accountability.”
Police services that send data to the RTD must sign an agreement with MCSCS that bars them from speaking with the media about the database without the Ministry’s written consent, according to a copy of a 2016 agreement obtained by Motherboard.
McPhail said that without increased transparency, Hubs will continue to present risks to the privacy rights of the vulnerable people whose data they compile in the RTD.
“We can’t have processes that are based on using exceptions to privacy law, with no transparency or accountability as to how [Hubs] are interpreting those exceptions,” she said. “Privacy is a human right. It can’t be eroded just to make someone’s job easier.”