Implementing AI to predict crime: a social debate around discrimination

The year is 2054. John Anderton, chief of Washington, D.C.'s PreCrime police unit, leads a team capable of anticipating the future with mental faculties beyond ordinary human capability. Thanks to them, murder has been eradicated from the territory. This fictional world was created by Steven Spielberg in his movie Minority Report. Today, in 2020, several cities around the world have anticipated Spielberg by using models to predict crimes.

Technology, the best ally for citizen security

Andrés Barrantes, CEO of Nuvu, explained in his speech at ANDICOM 2020 how technology has made it possible to move towards an anticipatory government that offers citizens better services. By analyzing data, predictive software can draw conclusions about when, how, and where a crime may occur.

Different cities around the world, such as Los Angeles, have used these platforms to reduce crime rates, with successful results. The applications draw on data such as crime rates, reports to emergency lines, and characteristics of known offenders, among other sources. Once this data is collected, algorithms predict the place and time at which a crime is likely to occur.

For example, characteristics such as poor lighting, broken glass, or population density are correlated with the number of thefts in an area.
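As a rough illustration of how such a correlation can be turned into a prediction, here is a minimal sketch of a count-regression model relating area characteristics to theft counts. It is not any city's or vendor's actual system; the features, coefficients, and data below are invented for illustration.

```python
# Hypothetical sketch (not any real platform's code): a count-regression
# model relating invented area features to synthetic theft counts.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Invented per-area features: poor-lighting score, broken-glass reports,
# population density (all scaled to 0..1 for simplicity).
X = rng.uniform(0.0, 1.0, size=(200, 3))

# Synthetic theft counts drawn from an assumed Poisson process.
true_rate = np.exp(0.8 * X[:, 0] + 0.5 * X[:, 1] + 1.2 * X[:, 2])
y = rng.poisson(true_rate)

# Poisson regression suits non-negative count targets like theft counts.
model = PoissonRegressor(alpha=1e-3)
model.fit(X, y)

# Predicted theft rate for a new, hypothetical area.
new_area = np.array([[0.9, 0.4, 0.7]])
print(model.predict(new_area))
```

A Poisson model is a natural starting point here because theft counts are non-negative integers, but real systems are far more elaborate and, as discussed below, inherit the biases of the data they are trained on.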

This predictive analysis makes it possible to anticipate such events, making the job of police forces easier: they know where to place their patrols, optimize their resources, and can arrive before a crime occurs (a simple version of this prioritization is sketched below). Other software analyzes body language and micro-expressions to identify who may be about to commit a crime.
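To make the patrol-allocation step concrete, here is a minimal, hypothetical sketch of ranking areas by predicted risk and choosing the top ones for patrol coverage. The grid-cell names and scores are invented for illustration.

```python
# Hypothetical sketch: rank grid cells by predicted risk and pick the
# top k for patrol coverage; names and scores are invented.
from typing import Dict, List, Tuple

def top_patrol_cells(risk_by_cell: Dict[str, float], k: int = 3) -> List[Tuple[str, float]]:
    """Return the k cells with the highest predicted risk scores."""
    return sorted(risk_by_cell.items(), key=lambda kv: kv[1], reverse=True)[:k]

# Scores as they might come out of a model like the one sketched above.
scores = {"cell_A": 0.12, "cell_B": 0.87, "cell_C": 0.45, "cell_D": 0.63}
print(top_patrol_cells(scores))
# -> [('cell_B', 0.87), ('cell_D', 0.63), ('cell_C', 0.45)]
```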

Social discrimination, a debate that must be held

The world learned of a letter signed by more than a thousand data experts, sociologists, historians, and scientists criticizing these models. Their main objection is that using past data to predict future events stigmatizes certain populations: according to the experts, justice systems can be biased and police sometimes make wrong decisions, for example ones driven by racist bias. By using this kind of data to generate their predictions, the platforms can perpetuate those biases.

A Liberty Association report reinforces this argument, stating that "these programs adopt a security management approach based on discriminatory profiling". This is a debate that the technology sector should not shy away from.

Although predictive software relies on data from the justice system, platforms should be structured around a diversity of information that reduces the risk of discrimination and bias in their use. There is evidence that crime-prediction software contributes to reducing crime, so the use of technology as an ally should not itself be questioned.

Diverse data in predictive models can help optimize police resources, reinforcing officer presence in the specific sectors and at the specific times when crimes may be more frequent. The prudent and safe use of figures that include social or physical characteristics will be key to moving towards less biased and more effective models for predicting crimes.

