Can computerized policing systems be biased?

Jul 14, 2023 | Criminal Defense

The idea of predictive policing sounds like something created by science fiction writers. Yet this technology does exist. The police can use a computer model that predicts where crime is likely to take place. It does this by studying past data and making judgments about what is most likely to happen in the future.

The advantage of this approach is that the police could increase their presence in certain areas to better ensure public safety. There are those who say that predictive policing can help to deter crime simply because officers can respond before anything happens. Their presence alone may stop criminal activity in the area.

However, there are also those who say that the system is racially biased. In the MIT Technology Review, one researcher went so far as to conclude that “predictive policing algorithms are racist” and claimed they should no longer be used. But if these systems are run by computers, and not people, how could they possibly be biased? Shouldn’t one of the advantages of a computer system be that it would take any personal racial bias out of whatever process is in question?

Training computer models

The issue is that algorithms need to be “trained” with data that a police department inputs. A computer doesn’t actually have any idea where crime will occur. It simply analyzes the data that is given to it and then makes a prediction based on that data.

What if the model's training data comes from a biased source? Say that a police officer who is biased against African-American individuals arrests 10 African-Americans for every white individual. That arrest record would suggest to the computer system that African-Americans are vastly more likely to commit crimes, and the system would then recommend dispatching more officers to neighborhoods with larger African-American populations.

In some cases, this kind of indirect bias creates a feedback loop that amplifies whatever bias is already present in the model. When officers are sent to those locations and make more arrests, the new arrests appear to confirm the algorithm's predictions, and the racial bias only gets worse.
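To make that feedback loop concrete, here is a toy Python sketch of the dynamic described above. The neighborhoods, arrest counts, crime rates, and patrol rules are all invented for illustration; they are not drawn from any real police department or any actual predictive-policing product.

```python
import random

random.seed(0)

# Two neighborhoods assumed to have the SAME underlying rate of crime.
TRUE_CRIME_RATE = {"A": 0.05, "B": 0.05}

# Historical arrest data skewed by biased enforcement: far more arrests
# recorded in neighborhood A than its true crime rate would justify.
arrest_history = {"A": 100, "B": 10}


def predicted_risk(history):
    """The 'model': predicted risk is simply each area's share of past arrests."""
    total = sum(history.values())
    return {area: count / total for area, count in history.items()}


def patrol_and_arrest(history, patrols_per_round=20):
    """Send patrols in proportion to predicted risk; more patrols in an area
    mean more of that area's (equal) crime gets observed and recorded."""
    risk = predicted_risk(history)
    for area, share in risk.items():
        patrols = round(patrols_per_round * share)
        for _ in range(patrols):
            if random.random() < TRUE_CRIME_RATE[area]:
                history[area] += 1  # each new arrest feeds back into the training data


print("Starting predicted risk:", predicted_risk(arrest_history))
for _ in range(200):
    patrol_and_arrest(arrest_history)
print("Predicted risk after 200 rounds:", predicted_risk(arrest_history))
# Both areas have identical true crime rates, yet area A's predicted risk
# never corrects toward 50/50: the extra patrols it triggers keep generating
# the very arrest data that appears to confirm the prediction.
```

In the sketch, both neighborhoods commit crime at exactly the same rate, but the skewed arrest history keeps directing patrols to neighborhood A, and the resulting arrests keep the prediction skewed.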

Legal options moving forward

Bias may be an unintended consequence of predictive models, but it is exactly the outcome that the researchers cited in the MIT Technology Review say is happening and must be addressed. Those who believe they have been racially profiled or arrested by biased police officers also need to know about all of their defense options. Seeking legal guidance is a good place to start.
