AI Policing’s Effect on Race

There are two main reasons why predictive policing can lead to such an outcome.

1. Feedback Loops

Machines analyse existing data to estimate the probability of an outcome and generate new predictions. When discrimination has been entrenched in society for a long time, the data fed into the system is likely to carry the racial bias of the people who collected it in the first place. Those collectors tend to hold positions of power over the communities being policed, resulting in a classic case of selection bias.

Given that the data provided to the algorithm is heavily skewed towards a certain demographic, it is inevitable that the algorithm will churn out predictions that send police forces back to the same individuals or areas that already have a high police presence. This can lead to a higher number of arrests, simply because of the disproportionate level of policing in a specific location. It can also cause friction between the police and locals, potentially resulting in violence. As more of the same data is fed into the algorithm, the vicious cycle of discrimination continues.
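To make the feedback loop concrete, here is a minimal, purely illustrative simulation. It assumes two neighbourhoods with identical underlying crime rates, a patrol-allocation rule that follows past records, and the simplifying assumption that crime is only recorded where officers are present. All numbers and rules are hypothetical, not drawn from any real policing system.

```python
import numpy as np

# Hypothetical sketch: two neighbourhoods with the SAME true crime rate,
# but neighbourhood 0 starts with more recorded incidents because it was
# historically over-policed (selection bias in the training data).
rng = np.random.default_rng(0)
true_crime_rate = np.array([0.05, 0.05])    # identical underlying rates
recorded_crimes = np.array([120.0, 40.0])   # biased historical records

for year in range(10):
    # The "predictive" step: allocate patrols in proportion to past records.
    patrol_share = recorded_crimes / recorded_crimes.sum()

    # Crimes are only recorded where police are present to observe them,
    # so the recording rate scales with patrol share, not with actual crime.
    observed = rng.binomial(n=1000, p=true_crime_rate * patrol_share)

    # New observations are fed straight back into next year's records.
    recorded_crimes += observed

# The initial 75/25 disparity never corrects itself, even though the
# underlying crime rates in the two neighbourhoods are identical.
print(recorded_crimes / recorded_crimes.sum())
```

The point of the sketch is that nothing in the loop ever re-examines the original records: the allocation rule simply reproduces whatever bias the historical data started with.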

2. Specific Inferences from Generalised Data

In calculating a person’s propensity to re-offend, algorithms produce a score based on his or her background. Details such as income level, gender, education level, family history and personality traits are taken into account when separating high-risk individuals from low-risk ones.
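As a rough illustration of how such a score might be produced, the sketch below maps a handful of background attributes to a single number with a logistic function. The feature names, weights and inputs are invented for this example and do not describe any actual risk-assessment tool.

```python
import numpy as np

# Assumed, illustrative coefficients for: income band, education level,
# prior arrests, age. Not taken from any real risk-assessment instrument.
weights = np.array([-0.8, -0.5, 1.2, -0.03])
bias = 0.1

def recidivism_score(income_band, education_level, prior_arrests, age):
    """Map a person's background to a 0-1 'risk' score.

    Every input is a proxy for group membership rather than a measurement
    of intent: the model draws a specific prediction about one person from
    generalised, historical data about people who resemble them.
    """
    x = np.array([income_band, education_level, prior_arrests, age])
    return 1.0 / (1.0 + np.exp(-(weights @ x + bias)))

# Two people with the same prior record but different backgrounds receive
# very different scores, even though nothing about their future behaviour
# has been observed.
print(recidivism_score(income_band=1, education_level=1, prior_arrests=2, age=22))
print(recidivism_score(income_band=4, education_level=3, prior_arrests=2, age=45))
```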

While risk assessment tools have been around for quite some time, algorithms give the false impression that their outcomes are accurate and free of human error, when they in fact rely on old, largely inaccurate data compiled by humans. Furthermore, treating these analyses as guaranteed to be correct encourages people to conflate correlation with causation.

Making use of someone’s past mistakes or social background to judge his or her level of criminality is bad enough. When the data underlying that judgement is flawed in the first place, the criminal justice system fails to protect its citizens, much less Black, Asian and minority ethnic (BAME) groups.