Without Fears

March 20th, 2015

By Guest Blogger, Roberto Obando

Photo courtesy of Wikimedia Commons; author Olek Remesz, user Scott_Sanchez

In my previous blog post, I explained in general terms how predictive policing works. Now I will explain its implementation in more detail.

Studies in the United States have shown that half the crimes in Seattle take place in just 4.5 percent of the city’s streets; that a bit more than 3 percent of the streets and intersections in Minneapolis account for half the crime there; and that just 8 percent of the city blocks in Boston account for 66 percent of the city’s robberies.

Similar statistics have been gathered in Latin American cities. For instance, in Cali, Colombia, 100 percent of all crimes reported in 2012 took place in 17 percent of its streets. With these kinds of statistics, we can start to build the algorithms for a predictive policing model.
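The statistics above all describe the same pattern: a small share of street segments accounts for a large share of crime. A first step toward a predictive model is simply measuring that concentration. The sketch below, with made-up counts for illustration, computes what fraction of streets is needed to cover a target share of reported crime:

```python
# Hypothetical per-street crime counts (illustrative numbers, not real data).
counts = [120, 80, 40, 20, 10, 5, 3, 2, 0, 0]

def hotspot_share(counts, target=0.5):
    """Return the fraction of streets needed to cover `target` of all crime."""
    ordered = sorted(counts, reverse=True)  # busiest streets first
    total = sum(ordered)
    covered = 0
    for i, c in enumerate(ordered, start=1):
        covered += c
        if covered / total >= target:
            return i / len(ordered)
    return 1.0

print(hotspot_share(counts))  # → 0.2: 20% of streets carry half the crime
```

In a real deployment the counts would come from geocoded incident reports, and the top-ranked segments would become the hotspots assigned to patrols.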

In the city of Santa Cruz, California, crime dropped by 11 percent in the first six months of a predictive policing program. The success rate continued to rise in the next six months, achieving a 19 percent cut in crime. It’s important to note that the Santa Cruz police did not change any other part of its work. No additional police were hired, no additional hours were worked and no additional patrols were deployed. The only new action was the use of a predictive policing model to send officers on patrols to hotspots for criminal activity determined by the algorithm.

The Los Angeles Police Department tried an experiment with one model of predictive policing, and the results provided strong evidence of its effectiveness. What was the experiment? Investigators established a model for the department's Foothill Division, which serves a population of about 300,000. Each day, the department distributed maps to officers for their patrols. On some days, the maps and patrol routes were based on traditional policing methods; on others, they were based on the algorithms of predictive policing. The officers did not know which kind of map they had received.

The results were surprising. The algorithm proved to be twice as precise as traditional policing models. On the days when the division used the maps generated by the algorithm, crimes against property dropped by 12 percent. There was no change on the other days. Despite initial resistance to the implementation of this new model, even the worst skeptics realized after a while that the model was useful.

The municipality of Madrid, Spain, also adopted some aspects of predictive policing, using location data and analysis systems to manage more efficiently the human resources assigned to security in the municipality.

In Singapore, a pilot program called Safe City combines video monitoring technologies and tools for predictive analysis to detect which places or situations may generate concerns for public security. When the system detects incidents with a high probability of generating concerns, it automatically alerts the police and other appropriate authorities.

Police work in Latin America and the Caribbean faces many serious challenges. Low salaries, obsolete or broken organizational structures, problems with equipment, insufficient infrastructure to meet citizen needs and corruption are just some of the problems. But that reality should not erase the hope for new models of police work.

Police officers can indeed become soothsayers. Don't be surprised. Amazon's website is full of predictions of the future. From the moment you buy something, Amazon's computer system begins to send you information on other products that an algorithm predicts you may also want to buy.
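The idea behind those product suggestions is simple co-occurrence: items that often appear in the same purchases are likely to interest the same buyer. This sketch uses invented item names and is only a toy illustration of that principle, not Amazon's actual system:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories (item names are made up for illustration).
orders = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"camera", "tripod"},
    {"sd_card", "case"},
]

# Count how often each pair of items appears together in the same order.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def recommend(item, k=2):
    """Return the items most frequently bought together with `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [i for i, _ in scores.most_common(k)]

print(recommend("camera"))  # items most often co-purchased with the camera
```

Predictive policing applies the same logic to a different data set: instead of purchases, the inputs are past incidents, and the "recommendation" is where to patrol next.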

Every day, police departments in Latin America and the Caribbean gather large amounts of data and relevant information. The challenge is to organize that information and put it to work in a system that will help produce improvements like those described here.


The opinions on this page do not necessarily reflect the views of The Gleaner.