Predictive policing strategies raise Fourth Amendment concerns when algorithmic predictions, rather than individualized reasonable suspicion, are used to justify police stops and searches.
Does Predictive Policing Violate Civil Rights?
Predictive policing uses artificial intelligence to identify potential crime locations. Computer software applies algorithms to historical crime data to flag possible crime locations in geographic clusters with high crime rates. (The widely discussed COMPAS algorithm is a related but distinct tool: it scores individual defendants' recidivism risk rather than predicting locations.) The software looks at short-term and long-term crime rates, as well as daily, weekly, and seasonal trends. Predictive policing programs examine the potential for minor crimes like disorderly conduct and vandalism, serious crimes like violent assaults, and escalating patterns of gang violence.
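To make the description above concrete, here is a deliberately simplified sketch of how a location-scoring system might blend short-term and long-term incident counts. The grid cells, the 30-day recency window, and the weighting are illustrative assumptions, not the proprietary logic of any vendor's product.

```python
from collections import defaultdict

def hotspot_scores(incidents, recent_weight=0.7):
    """Score grid cells from a list of (cell_id, days_ago) incident records.

    Blends a recent-incident count (last 30 days) with the long-term total,
    mirroring the article's short-term vs. long-term distinction.
    """
    recent = defaultdict(int)    # incidents in the last 30 days
    longterm = defaultdict(int)  # all incidents on record
    for cell, days_ago in incidents:
        longterm[cell] += 1
        if days_ago <= 30:
            recent[cell] += 1
    return {
        cell: recent_weight * recent[cell] + (1 - recent_weight) * longterm[cell]
        for cell in longterm
    }

# Hypothetical data: cell "A" has more incidents, and more recent ones.
scores = hotspot_scores([("A", 3), ("A", 10), ("A", 200), ("B", 150), ("B", 300)])
```

Note that a system like this only ever sees *recorded* incidents, which is exactly the limitation civil rights groups highlight.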
Predictive policing software is currently used by more than 60 police departments around the country. Programs like CrimeScan and PredPol were developed to make law enforcement's job easier by identifying patterns of criminal behavior in certain areas. Although CrimeScan and PredPol limit their projections to geographic locations, some police departments use the software to identify individuals who may commit crimes. In major cities like New York, Los Angeles, and Chicago, police departments take a more controversial approach, using predictive policing programs to identify the individuals within high-crime areas considered most likely to become perpetrators or victims of crime.
Various civil rights organizations, including the American Civil Liberties Union (ACLU), are raising concerns about bias and prejudice built into predictive policing programs. They say algorithms that target geographic crime areas label neighborhoods as good or bad based on arrest data rather than predictions of crime. Does this mean that police who patrol "bad areas" are more likely to use aggressive tactics and make more arrests than police who patrol "good areas"?
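The feedback loop critics describe can be sketched in a few lines. This is a toy model, not any department's actual policy: two neighborhoods have identical underlying crime, patrols are sent greedily to wherever past arrests are highest, and police presence is what converts crime into recorded arrests. The starting counts and the greedy patrol rule are illustrative assumptions.

```python
def simulate_feedback(rounds=20):
    """Toy model of arrest-data feedback between two identical neighborhoods."""
    true_crimes = {"A": 10, "B": 10}  # identical underlying crime per round
    recorded = {"A": 11, "B": 10}     # neighborhood A starts one arrest ahead
    for _ in range(rounds):
        # Patrols follow the data: go where recorded arrests are highest.
        target = max(recorded, key=recorded.get)
        # Police presence converts underlying crime into recorded arrests;
        # crime in the unpatrolled area goes unrecorded.
        recorded[target] += true_crimes[target]
    return recorded

result = simulate_feedback()
# → {'A': 211, 'B': 10}: a one-arrest head start compounds into a huge
# recorded disparity, even though both neighborhoods have the same crime.
```

The point of the sketch is that the divergence comes entirely from the data-collection loop, not from any difference in the neighborhoods themselves.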
A recent investigative report shows that 13 jurisdictions using predictive policing programs have been inputting data derived from unconstitutional police stops, searches, and arrests. All of these jurisdictions had entered into court-monitored settlements with the Department of Justice over corrupt or illegal policing practices. The report concludes that feeding this "dirty data" into predictive systems perpetuates civil rights violations: biased police data leads to the targeting of innocent individuals and to unjustified, aggressive police tactics. The systems also focus primarily on street and violent crime, while serious white-collar crimes go largely uninvestigated.
Currently, civil rights organizations and lawmakers in Washington are working to raise awareness about the problems with the algorithms used in predictive policing systems. Legislators are considering an algorithmic accountability bill that would establish transparency guidelines for police departments and for other institutions, including healthcare and education providers, that use these systems.