Delhi Police to go high-tech with predictive policing: Here's why it's a bad idea

Saurav Datta | Updated on: 10 February 2017, 1:49 IST
QUICK PILL
What is predictive policing?
  • Predictive policing uses computer algorithms to predict crimes
  • It not only zooms in on a 'where', but also on a 'who', based on statistics
Where does the idea come from?
  • Delhi Police's CMAPS has been created along the lines of the US' infamous COMPAS
  • COMPAS has been severely criticised over recent times
More in the story
  • Why is predictive policing such a bad idea?
  • Why is this method especially bad for India?

Earlier this year, in February, the Delhi Police announced that it would soon start using space technology for live crime mapping and adopt a 'predictive policing' mechanism.

The technology, called CMAPS (Crime Mapping, Analytics and Predictive System) and developed in partnership with the Indian Space Research Organisation (ISRO), is based on the US security programme COMPAS (Correctional Offender Management Profiling for Alternative Sanctions).

Implementation of the programme has been in limbo for all these months, but police officers who believe the time for such technology is now have asserted that the process needs to be put on the fast track.

CMAPS, like COMPAS, arms cops with relevant and timely data to effectively fight organised crime. The technology has already been used in cities like New York, Los Angeles, Berlin and London, but not without criticism.

What will CMAPS do?

The system helps control crime, law and order situations and manage security through analysis of relevant data and patterns.

According to media reports, crime mapping is currently a periodic process conducted manually every 15 days. Reports are prepared by joint commissioners and forwarded to the special commissioner (Law and Order), who then briefs police chiefs.


CMAPS makes crime mapping a real-time process with space-based technology that helps sleuths collect and assess data. Cops are equipped with personal digital assistant devices that are connected to a central processor storing records of more than two lakh criminals.

With all the information easily available online, made accessible at the crime scene itself, officers will no longer need to go back to the police station to file reports.

The technology will also have -

  • MHA-approved call interceptions that give police an edge over criminals
  • Every distress call will be converted into a digital message, with the caller's location flashed through GPS

With patterns created from the data in the system, police can -

  • Identify gangs in specific areas on a real-time basis
  • Analyse neighbourhoods by understanding the crime events specific to an area, and the circumstances behind them, based on previous records
  • Carry out a proximity analysis, which provides information about criminals, victims, witnesses and others who are or were within a certain distance of the crime scene
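In essence, the proximity analysis described above is a radius query over known locations. The sketch below illustrates the idea only; the record fields, coordinates and 5 km radius are all hypothetical and are not CMAPS's actual data model.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def proximity_analysis(records, scene, radius_km):
    """Return every record whose last known location falls
    within radius_km of the crime scene."""
    lat0, lon0 = scene
    return [rec for rec in records
            if haversine_km(rec["lat"], rec["lon"], lat0, lon0) <= radius_km]

# Hypothetical records around Connaught Place, New Delhi
records = [
    {"name": "A", "lat": 28.6315, "lon": 77.2167},  # at the scene
    {"name": "B", "lat": 28.6129, "lon": 77.2295},  # a couple of km away
    {"name": "C", "lat": 28.5245, "lon": 77.1855},  # well outside 5 km
]
nearby = proximity_analysis(records, (28.6315, 77.2167), radius_km=5)
print([r["name"] for r in nearby])
```

A real system would run this against a spatial index rather than a linear scan, but the filtering logic is the same.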

So how does CMAPS help?

With all data available real-time, there will be no time lost between a crime being committed and a report being filed.

It will also help cops predict a crime, zero in on a dangerous area to take necessary precautions, and keep an eye on 'potential' criminals.

And how it doesn't help

CMAPS is based entirely on the predictions its algorithms generate after crunching all the available data.

While computer algorithms may, by themselves, execute without error, the data being fed into the system that shapes them is not free from bias. And that is exactly where the problem with a technology like CMAPS, and with predictive policing in general, lies.

An article in the Columbia Journalism Review argues that predictive policing models lead to what is commonly termed 'machine bias'.

"Algorithms can be especially susceptible to perpetuating bias for two reasons. First, algorithms can encode human bias, whether intentionally or otherwise. This happens by using historical data or classifiers that reflect bias (such as labeling gay households separately, etc.). This is especially true for machine-learning algorithms that learn from users' input," says the article titled Investigating the Algorithms that Govern our Lives.


Especially in India, given the enforced ghettoisation of Indian Muslims and communalisation of police and paramilitary forces, this system can potentially derail criminal justice.

One only has to visit certain localities like Juhapura in Ahmedabad, Mohammad Ali Road in Mumbai and Mumbra in its suburbs, or Calcutta's Topsia, Tiljala and Metiabruz localities to gauge the extent of the problem.

When 'calculated' bias is both perpetrated and perpetuated, very little remains of justice, civil liberties and democratic rights.

Issues up close

Based entirely upon computer algorithms, which have no concern for human differences or racial and ethnic sensibilities, COMPAS, on which CMAPS is modelled, tries to map out whether more crimes are going to be committed in one area compared to another. For example, COMPAS tries to predict whether more crimes will be committed in New York's Bronx or in Harlem.

The algorithms study previous crime rates, based mainly on quantitative rather than qualitative statistics, and thereby enable what experts call 'crime forecasting'.

Two studies by the RAND Corporation, Evaluation of the Shreveport Predictive Policing Experiment and Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations, provide an insight into how correlation, rather than causation, can affect criminal justice. According to these reports, statistics are more often than not misleading unless a lot more data is available.


Almost all police departments that have used predictive policing admit that it optimises resources, and in certain cases it has even helped improve community relations. But it is, at the end of the day, a prediction that isolates one area or person as being more prone to crime than another.

According to Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations, 'the operational value of predictive policing tools is in their contribution to broader law enforcement strategies that use the tools' risk assessments to inform resource allocation and problem-solving decisions.'

'To be effective, predictive policing must include interventions based on analytical findings. Successful interventions typically have top-level support, sufficient resources, automated systems providing needed information, and assigned personnel with both the freedom to resolve crime problems and accountability for doing so,' the report adds.

But in the case of India, chances are that these interventions, or more in-depth data simply might not exist.

The Indian issue

The 2002 Steven Spielberg movie Minority Report showed a Pre-Crime Captain from a Special Police Unit, armed with the 'smartest' tech to apprehend future murderers, turning into a murder accused because of a statistical analysis gone awry.

When we are dealing with computer predictions, especially in the case of crime, there is a chance that the wrong areas and the wrong people will be brought under scrutiny because of pre-conceived notions and socio-religious biases. In India, besides the obvious machine bias, a minority bias will also affect the computer algorithms.

With the heightened fear of 'terror attacks' and the vague (and legally indefensible) definition of 'radicalisation' (as applied to Muslims), the adoption of predictive policing methods may well have a crushing effect on civil liberties and democratic rights of Indian citizens.

Truth, and nothing but the whole truth, as determined and conclusively established by law, should form the basis of criminal justice, not algorithm-based suspicions. That holds especially in a country reeling under the effects of Hindu majoritarianism, where members of a particular minority community are more likely to be falsely implicated in criminal cases.

Edited by Jhinuk Sen


First published: 27 June 2016, 5:24 IST
 
Saurav Datta @SauravDatta29

Saurav Datta works in the fields of media law and criminal justice reform in Mumbai and Delhi.