Fourteen police forces are using, have previously used, or are planning to use shady algorithms which ‘map’ future crime or predict who will commit a crime or become a victim of one, all based on biased police data.
This is according to Liberty’s February 2019 report (pdf) on police algorithms.
The report finds:
- police algorithms entrenching pre-existing discrimination by directing officers to patrol areas which are already disproportionately over-policed – a self-reinforcing feedback loop (sketched in the toy model after this list)
- predictive policing programs which assess a person’s likelihood of victimisation or vulnerability, or of being reported missing or subjected to domestic violence or a sexual offence, based on offensive profiling
- a severe lack of transparency, with the public given very little information as to how predictive algorithms reach their decisions – and even the police themselves do not understand how the machines come to their conclusions
- the significant risk of ‘automation bias’ – a human decision maker simply deferring to the machine and accepting its indecipherable recommendation as correct.
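To see how the first of these mechanisms works, here is a minimal toy simulation (in Python) of a hotspot-style feedback loop. The area names, rates, and patrol numbers are all hypothetical, invented purely for illustration; they do not reflect any specific force’s actual system, which, as the report notes, remain largely opaque.

```python
# A toy model of the feedback loop: patrols follow recorded crime, and
# recorded crime follows patrols. All figures below are hypothetical.

AREAS = ["A", "B", "C", "D"]
TRUE_RATE = 0.10       # identical underlying offending rate in every area
BASE_PATROLS = 20      # officers routinely assigned to each area
HOTSPOT_BONUS = 20     # extra officers sent to the current 'hotspot'

# Area A starts with more recorded crime only because it was historically
# over-policed, not because more crime actually happens there.
recorded = {"A": 5.0, "B": 1.0, "C": 1.0, "D": 1.0}

hotspots = []
for year in range(10):
    # The 'predictive' step: designate the area with the most recorded
    # crime as this year's hotspot and send it extra patrols.
    hotspot = max(recorded, key=recorded.get)
    hotspots.append(hotspot)
    for area in AREAS:
        patrols = BASE_PATROLS + (HOTSPOT_BONUS if area == hotspot else 0)
        # Officers can only record what they are present to observe, so
        # recorded crime scales with patrol presence.
        recorded[area] += TRUE_RATE * patrols

share_a = recorded["A"] / sum(recorded.values())
print("Hotspot chosen each year:", hotspots)
print(f"Area A ends with {share_a:.0%} of all recorded crime, despite "
      "identical true crime rates in every area.")
```

Because the hotspot label is derived from patrol-dependent records rather than from underlying crime, the loop can never discover that the true rates are equal: each year’s extra patrols generate the very figures that justify the next year’s extra patrols. And since a human reviewer will tend to defer to those figures, this is exactly where the report’s warning about automation bias bites.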