On The Flaws In National’s Pet Solution To Gangs And Poverty

In the 50 years since Norm Kirk first promised to take the bikes off the bikies, our politicians have tried again and again to win votes by promising to crack down on gangs. Canterbury University academic Jarrod Gilbert (an expert on New Zealand’s gang culture) recently gave chapter and verse on the decades of political posturing about gangs – led by the likes of Mike Moore and others – and on the consistently paltry outcomes. In today’s political climate, Gilbert’s research into the 1990s political panic about gangs remains highly relevant.

Case in point… On the weekend, National Party leader Christopher Luxon was once again beating the “get the gangs” drum at a regional gathering of the party faithful. Some of Luxon’s key proposals – e.g. to outlaw the wearing of gang patches in public, and to criminalise the display of gang insignia and/or the writing of “pro-gang” thoughts online – were promptly rubbished by former National Party Cabinet Minister Chester Borrows, who had tried to impose a similar raft of measures at local government level in Whanganui 13 years ago. The approach didn’t work then, Borrows explained, and “it won’t work” now, either.


His bill was ultimately a futile measure, the former Courts Minister says, 13 years after the Wanganui District Council (Prohibition of Gang Insignia) Bill was passed into law. Borrows, who is also a former police officer and a current member of the Parole Board, is now criticising his former party – saying its latest proposal to curb gang crime was designed for “big headlines” and would be mostly ineffectual in practice: “Police will either spend a lot of time trying to enforce this, and it may not be all that helpful, or the police will ignore it and the public will be upset it isn’t being enforced.”

According to Borrows, an effective policy needs to address why people join gangs in the first place – with a focus on prison reform, intergenerational gang membership, and the socio-economic reasons why joining a gang is seen as such a compelling option. No sign of that from Luxon.

Simple (minded) solutions

Who would have thought the remedy for gang violence and drug dealing merely required gang members to take off their jackets? In his conference speech, Luxon did concede that the causes of gang recruitment were “complex.” His solution? Ever since he became party leader, Luxon has been touting the Bill English/Paula Bennett policy of “social investment” as his Big Idea on social reform. This has got the right-wing commentariat very excited. As Luxon repeated at the weekend:

National will bring back the long-term, social investment approach so that resources are directed where they can do the most good. That means developing targeted interventions to steer at-risk young people in a direction that gives them the chance of a positive and productive life.

Right. Yet National lost office in 2017, before English could enact the social investment policy. However, the Auckland University academics who devised the basic predictive analytical tools and the core approach have seen it adopted over the past five years by several US jurisdictions – e.g. Allegheny County in Pennsylvania, and parts of Florida, Oregon, Maine and southern California.

Many of the US news reports on this issue have openly acknowledged New Zealand’s pioneering work in developing the approach. As long ago as 2017, however, Illinois dropped the predictive analytics approach, because it was found to be ineffective at predicting the worst cases.

In line with what National is proposing here, the algorithm used in Allegheny County taps into personal data from government data sets – including Medicaid, mental health, and jail and probation records – to calculate numerical risk scores for criminal behaviour and/or the likelihood of welfare dependency and abuse.
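For readers curious about the mechanics, here is a minimal sketch (in Python) of how a risk-scoring step of this general kind works. To be clear, the feature names, weights and numbers below are invented for illustration – this is not the Allegheny County model itself:

```python
# Hypothetical sketch of a predictive risk-scoring step, NOT the actual
# Allegheny County tool. All feature names and weights are invented.
from math import exp

# Illustrative features drawn from the kinds of administrative data the
# article describes (Medicaid, mental health, jail and probation records).
example_family = {
    "prior_agency_contacts": 4,  # count of past welfare-agency contacts
    "jail_or_probation": 1,      # 1 if any jail/probation record exists
    "medicaid_enrolled": 1,      # crude proxy for low income
    "mental_health_record": 0,
}

# Invented weights; a real tool would fit these on historical case files.
WEIGHTS = {
    "prior_agency_contacts": 0.45,
    "jail_or_probation": 0.80,
    "medicaid_enrolled": 0.30,
    "mental_health_record": 0.25,
}
INTERCEPT = -2.0

def risk_score(features: dict) -> float:
    """Logistic score in [0, 1]: higher means flagged as higher risk."""
    z = INTERCEPT + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + exp(-z))

print(f"risk score: {risk_score(example_family):.2f}")  # prints ~0.71
```

Note what gets measured: nothing about the family’s actual conduct, only the length of its paper trail with the state.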

However… In recent weeks, Oregon (which had followed the Allegheny County precedent) has also dropped the social investment approach of predictive analytics and targeted responses, after research findings that it led to the racial stereotyping of the very people and communities it was supposedly trying to help.

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Penn., an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

The problem goes beyond Oregon and Illinois. The AP-reported research showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation when compared with white children. The independent researchers also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

Right now, New Zealand is emerging from the trauma caused by the dysfunctional actions of Oranga Tamariki. Under Luxon, National is proposing a social investment approach fraught with similar risks of negative racial, class and cultural stereotyping. The AP investigation found that the tool, given comparable rates of calls, “would have recommended that two-thirds of Black children be investigated, compared with about half of all other children.”

Even the dwindling number of Oregon defenders of the social investment approach have conceded that the core algorithms require a “fairness corrective” to be installed, in order to compensate for the racial biases inherent in the model.

According to Jake Sunderland, press secretary for the Oregon Department of Human Services, the DHS made adjustments to its algorithm to account for racial bias: “Knowing that algorithms are at risk of perpetuating racial biases and structural inequities, the ODHS Safety at Screening Tool was developed with an algorithmic ‘fairness correction’ to correct for the biases in the data,” Sunderland said.
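What might such a “fairness correction” look like in practice? One common post-hoc technique – and this is only a sketch of the general idea, since neither the article nor Oregon spells out the actual method – is to calibrate decision thresholds separately per group, so that flag rates equalise:

```python
# Hedged sketch of one common style of post-hoc "fairness correction":
# per-group decision thresholds that equalise flag rates. Illustrative
# only; the article does not describe Oregon's actual implementation.

def group_cutoff(scores: list, target_flag_rate: float) -> float:
    """Score cutoff that flags roughly the target share of a group."""
    ranked = sorted(scores, reverse=True)
    k = max(1, round(target_flag_rate * len(ranked)))
    return ranked[k - 1]

# Invented score distributions: the model systematically scores group B higher.
scores_a = [0.21, 0.30, 0.35, 0.42, 0.55, 0.61]
scores_b = [0.33, 0.47, 0.52, 0.66, 0.71, 0.84]

# With one global cutoff, group B gets flagged twice as often as group A.
GLOBAL_CUTOFF = 0.5
print(sum(s >= GLOBAL_CUTOFF for s in scores_a))  # 2 of 6 flagged
print(sum(s >= GLOBAL_CUTOFF for s in scores_b))  # 4 of 6 flagged

# "Corrected": each group gets its own cutoff for an equal one-in-three rate.
for name, scores in (("A", scores_a), ("B", scores_b)):
    cutoff = group_cutoff(scores, target_flag_rate=1 / 3)
    flagged = sum(s >= cutoff for s in scores)
    print(f"group {name}: cutoff {cutoff:.2f}, flagged {flagged}/{len(scores)}")
```

Even then, the correction only adjusts the scores at the output end; the biased data underneath stays exactly as biased as before – which is the critics’ point.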

Does National have a “fairness corrective” in mind, and what would it look like? In short, the social investment approach is deeply flawed – mainly because of its obsessive focus on the rear-view mirror, whereby past policing/social worker contact is treated as the strongest predictor of future behaviours, such that pre-emptive intervention becomes more likely. (Poverty becomes a flat circle. So does crime.)

Social Investment, Anything But

In an Orwellian twist, the “social investment” approach does not identify – let alone invest in correcting – the structural causes of poverty, criminal behaviours or welfare dependency. Instead, it entails a targeting tool aimed at the underclass – and in particular, at individuals, families and communities based on their previous contact with Police, welfare workers and the courts.

The algorithms calculate risk scores for re-offending based on the recorded outcomes for people with similar family, community, income or experiential profiles – such that interventions by Police and social workers can then be targeted accordingly. For those on the receiving end, it can look very much like a harassment device.

Many critics have pointed out the similarity to the Tom Cruise film Minority Report – in that action by the authorities is likely to be triggered less by actual behaviours than by AI tools designed to predict and pre-empt potential anti-social behaviours before they happen, largely based on a risk calculus model. It’s not what you did; it’s what Big Brother says you might do in future.

Obviously though, and as the American Civil Liberties Union has pointed out… if the authorities want to devise a predictive tool, they first need to create a list of variables that will then be checked against the jurisdiction’s historical case files, to see which ones are associated with different types of outcomes (e.g., future removal of children from their homes).

But who decides what variables should be looked at and what outcomes should be considered? Furthermore, similar to the tools we have seen in the criminal legal system, any tool built from a jurisdiction’s historical data runs the risk of perpetuating and exacerbating, rather than ameliorating, the biases that are embedded in that data.
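That risk is easy to make concrete. Below is a toy version of the screening step the ACLU describes, with entirely invented case files – and note the trap built into the data: the outcome (“removed_later”) can only have been recorded for families the agency was already looking at.

```python
# Toy version of the variable-screening step: candidate variables are
# checked against (invented) historical case files for association with
# an outcome - here, later removal of a child from the home.
cases = [
    {"prior_contact": 1, "single_parent": 1, "removed_later": 1},
    {"prior_contact": 1, "single_parent": 0, "removed_later": 1},
    {"prior_contact": 0, "single_parent": 1, "removed_later": 0},
    {"prior_contact": 0, "single_parent": 0, "removed_later": 0},
    {"prior_contact": 1, "single_parent": 1, "removed_later": 0},
    {"prior_contact": 0, "single_parent": 1, "removed_later": 1},
]

def outcome_rates(variable: str) -> tuple:
    """Outcome rate among cases where the variable is present vs absent."""
    present = [c["removed_later"] for c in cases if c[variable] == 1]
    absent = [c["removed_later"] for c in cases if c[variable] == 0]
    return sum(present) / len(present), sum(absent) / len(absent)

for var in ("prior_contact", "single_parent"):
    with_rate, without_rate = outcome_rates(var)
    print(f"{var}: {with_rate:.2f} with vs {without_rate:.2f} without")
```

On data like this, “prior contact” emerges as the strongest predictor – not because it measures abuse, but because it measures where the system was already looking.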

Exactly. And as the ACLU indicated:

Historically over-regulated and over-separated communities may get caught in a feedback loop that quickly magnifies the biases in these systems. Even with fancy — and expensive — predictive analytics, the family regulation system risks surveilling certain communities simply because they have surveilled people like them before. Or, as one legal scholar memorably framed it, “bias in, bias out.”
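That feedback loop is simple enough to simulate. In the toy model below, two communities have exactly the same underlying rate of genuine need; the only difference is how closely each one is watched at the outset. Every number is invented:

```python
# Toy simulation of "bias in, bias out": the community watched more
# closely accumulates more recorded contacts, which attracts yet more
# watching. Both groups share the SAME true rate of underlying need.
import random

random.seed(1)

TRUE_RATE = 0.10                     # identical for both communities
surveillance = {"A": 0.2, "B": 0.6}  # B starts out more heavily watched
recorded = {"A": 0, "B": 0}

for year in range(10):
    for group in ("A", "B"):
        observed = 0
        for _ in range(1000):        # 1,000 families per community
            incident = random.random() < TRUE_RATE
            if incident and random.random() < surveillance[group]:
                observed += 1
        recorded[group] += observed
        # Feedback: every recorded contact nudges next year's scrutiny up.
        surveillance[group] = min(0.9, surveillance[group] + observed / 1000)

print(recorded)  # B's paper trail ends up several times the size of A's
```

Feed those records back into a predictive model, and community B now “objectively” scores as the riskier one – despite identical underlying behaviour.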

To put it mildly, the social investment approach is not culturally or racially sensitive. It does not bother itself with the structural and historical causes of institutional racism that tend to underpin many of the actions of the criminal underclass. The AI instruments involved are too blunt for that sort of analysis and – in practice – they leave less room for discretion by the social workers operating at the coal face.

Theoretically of course, the social worker would still be able to override what the algorithms say should be done. In reality though, the ubiquity of the tool will tend to put the social worker on the back foot, having to justify why they are choosing to ignore the predictive risk scores in a particular case. Very quickly, the tool will rule.

As the ACLU concluded, “Information about the data used to create a predictive algorithm, the policy choices embedded in the tool, and the tool’s impact system-wide and in individual cases are some of the things that should be disclosed to the public both before a tool is adopted and throughout its use.”

Would that happen here if National wins the power to implement its pet approach? Fat chance. In passing, one might feel somewhat sorry for the police officers and social workers who would then have to enforce it. As the British academic Joanne Warner pointed out last year, police officers and social workers function as “street-level bureaucrats” when it comes to how they interpret and enforce social policy and the law. As Warner added:

“With parallels in social work, police work is described as being a ‘tragic necessity’ involving great power, practised in low visibility environments, and (mainly) encounters with groups who have relatively low social status and little power…”

Exactly: power tends to be wielded out of sight, and out of mind for those more comfortable. Yet before we give the Police (and social workers) sweeping new powers and marching orders to target organisations, communities, families and individuals deemed to pose a future risk to our sense of well-being, surely we need to ask beforehand what limits to state power – if any – we are happy to entertain. The Nanny State has its problems, but the authoritarian Daddy State tends to be far, far worse.

I hate to be rashly predictive, but we’re in trouble if we’re expecting Christopher Luxon to lead a reasoned debate on such matters before a social investment model that’s being scrapped elsewhere gets put into practice here regardless.

Footnote: In 2015, Werewolf published a backgrounder on how predictive analytics had been developed, and on the inherent risks of racial and class profiling involved. Here’s an excerpt from that article that still seems relevant:

The value of predictive modelling work lies in its alleged ability to accurately predict those cases where child abuse will later be substantiated. Circular reasoning can readily creep into this process, and raise the spectre of racial profiling and stigmatisation.

To critics…that’s a problem with the approach [in that] families already known to CYF are subject to more surveillance and monitoring, therefore children already known to them are likely to have a flag in the system for subsequent babies born to those parents. “This,” as one social work academic interviewed told me, “would partly explain why contact with CYF for older children and parents’ own CYF histories are such strong predictors.”

This isn’t to say that abuse isn’t occurring among such families – but if abuse is occurring elsewhere, amid other families, it is less likely to be picked up, due to less monitoring; plus, if abuse involving older children has been substantiated, that too is likely to add to the risk picture, and will lead to the substantiating of other children from the same families. “Therefore CYF contact becomes self-referential. It predicts itself.”
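That self-referential effect is easy to reproduce. In the invented example below, actual abuse occurs at exactly the same rate everywhere, but substantiation is only possible for families already known to CYF – so “prior contact” emerges as a flawless-looking predictor of nothing but itself:

```python
# Toy demonstration of a predictor that "predicts itself": substantiation
# requires prior CYF contact, so contact looks highly predictive even
# though the true abuse rate is identical everywhere. Numbers invented.
import random

random.seed(7)

ABUSE_RATE = 0.05                        # same true rate for every family
results = []
for _ in range(10_000):
    known_to_cyf = random.random() < 0.3     # 30% already on the books
    abuse = random.random() < ABUSE_RATE     # independent of CYF contact
    substantiated = abuse and known_to_cyf   # only found where they look
    results.append((known_to_cyf, substantiated))

known = [s for k, s in results if k]
unknown = [s for k, s in results if not k]
print(f"substantiation rate, known to CYF: {sum(known) / len(known):.3f}")
print(f"substantiation rate, not known:    {sum(unknown) / len(unknown):.3f}")
```

A model trained on records like these will rate CYF contact as a powerful risk factor, while identical abuse in unmonitored families stays invisible.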

Chester Borrows is not the only person who should be feeling sceptical (and alarmed) about us implementing a so-called ‘social investment’ strategy that’s likely to do far more harm than good.

© Scoop Media
