>>16812
>There are always surprises in the data, you just need a competent statistician to start finding them. If you hire the right people and have collected reasonably clean data, you can get actionable insights in a matter of days with a hugely positive RoI.
There are a lot of "if"s involved right there that can render even the most well-intended technology useless. I'm still not convinced that turning your prevention and social work duties over to a machine is going to produce better results, either in terms of staff cost or situation assessment.
It appears you have never spent much time programming computers. AI may be all the rage in the computing world right now, but whether you're programming an Arduino to blink an LED or building an AI system on top of millions of lines of code, there are countless ways to implement pretty shit code. And the more complex your code, the easier it is for bits of shit code to be overlooked and remain undetected for a long time, all the while producing wrong results. So maybe somebody who knocked over his elderly neighbour's wheelie bin gets assessed as 80 percent likely to mug somebody at knifepoint, while somebody who stole the rims off a car in the street at night gets classed as medium-risk.
And even the best AI system is only as good as the data you feed it: garbage in still invariably produces garbage out. More than that, if you ask actual computer scientists, the majority of them will be wary of recommending it for the kind of applications that law-and-order politicians daydream about, because computer scientists know what can go wrong. It's usually private companies that put their resources towards developing these AI systems, because they know there are shedloads of money to be made by selling them to politicians and governments. That does not mean their product is either ethically or technologically sound.
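To make the garbage-in/garbage-out point concrete, here's a minimal toy sketch (all group names and numbers invented for illustration, nothing to do with any real system): a "risk model" that simply learns how often each group was labelled high-risk in its training data. If one group was over-policed, so that its minor offences were disproportionately recorded as high-risk, the model faithfully reproduces that bias in its predictions.

```python
# Toy garbage-in/garbage-out demo: a "risk model" that just learns
# per-group label frequencies from its training records.
# Groups "A"/"B" and all counts below are made up for illustration.

from collections import defaultdict

def train(records):
    """records: list of (group, labelled_high_risk) pairs.
    Returns {group: fraction of records labelled high-risk}."""
    counts = defaultdict(lambda: [0, 0])  # group -> [high_count, total]
    for group, high in records:
        counts[group][0] += int(high)
        counts[group][1] += 1
    return {g: high / total for g, (high, total) in counts.items()}

# Biased input: group "A" was over-policed, so its members were
# recorded as "high risk" four times as often as group "B".
biased_training = (
    [("A", True)] * 80 + [("A", False)] * 20
    + [("B", True)] * 20 + [("B", False)] * 80
)

model = train(biased_training)
# The model reproduces the bias in its inputs exactly:
# model["A"] == 0.8, model["B"] == 0.2
```

A real system would be far more complex than this frequency counter, but the failure mode is the same: nothing in the training step can tell a genuine pattern of behaviour apart from a pattern of biased record-keeping.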
>A classic example is Scared Straight. In the US, it was quite popular to take naughty kids into prison to show them where they could end up if they kept acting like a prick. Intuitively, this makes a reasonable amount of sense - if you're a delinquent teenager who isn't scared of a bollocking, you might be scared of the possibility of ending up in prison. Nobody bothered to evaluate these programmes for years; when we did, we found that they increased the crime rate, probably because it took away the fear of the unknown.
There was a Beavis and Butthead episode back in the day that poked fun at Scared Straight, with Beavis and Butthead essentially concluding that prison was so cool a place that it was no deterrent but something to aspire to, because that's where all the tough guys were who didn't take shit from anybody.
Knowing a fair bit about life in the U.S., I think these programmes are usually the result of scared old white people institutionalising their hatred of the young. You see it in judges operating prison pipelines, in the way youngsters are punished even for ridiculously tiny amounts of drugs for personal use (not to mention that you can die for your country at 17 but can't legally drink a beer in a bar until four years later), in the godawful boot camps that scar kids' souls for life and do nothing to combat recidivism, and in the way teenagers are punished for sex with their peers in some states. And in the way successive Republican administrations have cut funding for non-abstinence sex ed, only for their Democratic successors to restore it, back and forth.
Juvenile delinquency in nearly every European country is lower than in the U.S. And my long-standing theory is that this is because parents, and adults in general, in Europe don't see young people as a potential problem that needs to be dealt with or as a threat to society, but as valuable future members of society who most of all need positive reinforcement.