“Mr. Marks, by mandate of the District of Columbia Pre-Crime Division, I’m placing you under arrest for the future murder of Sarah Marks.”
So says Chief John Anderton, played by Tom Cruise in the neo-noir science fiction action thriller Minority Report. Set in Washington, DC in the year 2054, essentially four decades into our future, the premise is that the police, with the help of advanced computers and some psychics, can look into people's habits and behaviour and see their future. And if the psychics agree, by a majority of two to one, that a person will commit a crime, that person can be arrested as if they had actually done the deed.
The irony is that some new scientific procedures employed by police forces in Chicago, Los Angeles and Manchester, England, could be approaching this sort of intrusion into citizens' privacy and rights. Once again, this raises the age-old question of how much privacy citizens should relinquish in the name of security and safety. Is there a way to control this technology so that it targets wrong-doers while leaving law-abiding citizens alone?
A simple formula: discover hot spots and extinguish them.
It turns out that criminal activities form patterns in time and space. Consider an assault on a certain street in a previously quiet neighbourhood. On its own, this fact means little, but by tracking seemingly random incidents over a large area and collecting thousands of individual data points, trends emerge that can be interpreted to determine risk factors. Computer models built on these collected and classified data points can then begin to predict trends and identify hot spots of criminal activity, helping law enforcement put more resources and manpower into those areas. This intelligence-led policing has produced significant reductions in crime.
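At its simplest, the hot-spot idea can be sketched as binning incident locations into grid cells and flagging the cells whose counts stand out. The coordinates, cell size and threshold below are all invented for illustration; real systems use far richer statistical models.

```python
import math
from collections import Counter

def find_hot_spots(incidents, cell_size=0.01, threshold=3):
    """Bin (lat, lon) incident points into grid cells and return
    the cells whose incident count meets the threshold."""
    counts = Counter(
        (math.floor(lat / cell_size), math.floor(lon / cell_size))
        for lat, lon in incidents
    )
    return {cell: n for cell, n in counts.items() if n >= threshold}

# Invented toy data: four incidents cluster on one block, two are isolated.
incidents = [
    (38.9051, -77.0363), (38.9052, -77.0361),
    (38.9049, -77.0365), (38.9050, -77.0362),
    (38.8977, -77.0365), (38.9101, -77.0214),
]
print(find_hot_spots(incidents))  # → {(3890, -7704): 4}
```

The single flagged cell is where the four clustered incidents fall; the two isolated incidents never reach the threshold and are ignored.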
Proven results: predictive policing in LA and Manchester.
Six months after implementing predictive policing in the Foothill area of Los Angeles, property crime was down 12% compared with the previous year; strangely, it rose by 0.5% in neighbouring districts. Police in Trafford, a suburb of Manchester, England, cut burglaries by 26.6% by identifying high-risk areas and increasing their presence there, even routing recruits on driver-training exercises through the streets of these hot spots.
More data, finer tuning.
The more data entered into the system, the better the results. Beyond reports of burglaries and other crimes, environmental detail can be assimilated into the formulas that determine risk: traffic patterns, outdoor lighting, the locations of security cameras, property upkeep (or the lack of it), and places where potential victims are exposed, such as a bus stop with hidden observation posts and speedy getaway routes nearby. Weather patterns, ATM locations, and smart CCTV programs that begin to recognise suspicious behaviour in the subjects being recorded all combine to determine, and even schedule, where police presence is most appropriate.
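One way to picture how such signals feed a risk formula is a simple weighted score per location. The feature names and weights below are invented for illustration; real predictive-policing models are statistical and trained on historical data, not hand-weighted.

```python
# Hypothetical environmental risk factors and weights (all invented).
RISK_WEIGHTS = {
    "recent_burglaries": 3.0,   # reported crimes weigh heaviest
    "poor_lighting": 1.5,
    "no_cctv_coverage": 1.0,
    "neglected_property": 0.5,
    "near_atm": 0.5,
}

def risk_score(features):
    """Weighted sum of a location's observed risk factors."""
    return sum(RISK_WEIGHTS[name] * value for name, value in features.items())

# A poorly lit bus stop with two recent burglaries and no camera coverage.
bus_stop = {"recent_burglaries": 2, "poor_lighting": 1, "no_cctv_coverage": 1}
print(risk_score(bus_stop))  # 3.0*2 + 1.5 + 1.0 = 8.5
```

Ranking locations by such a score is what lets a department schedule patrols toward the highest-risk spots first.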
As with any complex system, there is a tendency to put too much faith in results that could have been biased by a programmer, and the output can be skewed by the quality and selection of the data used to draw conclusions. While it is easy to input straightforward robbery and burglary reports, crimes such as drug deals (perhaps hundreds of transactions for every arrest) and gang violence are under-reported, and this creates an imprecise picture.
Washington, DC law professor Andrew Ferguson is more concerned that, as these procedures become mainstream, judges and juries will grow complacent, comfortable with simply accepting the "truthfulness" of these crime-profiling tools. He warns against losing transparency and lacking access to the underlying programs needed to challenge, and perhaps correct, their assumptions.
The question is: where is all this taking us? Authorities in Maryland are trying to predict which families in the social services database are most likely to inflict physical abuse on their children. The US Department of Homeland Security is developing software to scan crowds and even airport security lines for nervous behaviour such as rapid heartbeats, fidgeting, furtive eye movements and abnormal sweating. Police in California are using robots to crawl through local social media posts to find notices of wild parties. ECM Universe has created programs to read sites "rife with extremist viewpoints" in order to tag possible targets.
Predictive and proactive: Chicago as a case study.
In Chicago, the police have initiated a pilot program in a part of the city where computer analysis identified which citizens are most likely either to commit a violent act or to be the victim of one. Studies of the city's West Side by Yale sociology professor Andrew Papachristos found that violent activity is concentrated among only about 400 individuals, much of it in the Austin neighbourhood. The police commander conducted home visits with neighbourhood leaders, encouraging them to reach out to these individuals to explain the risks of prison and death, and to draw these gang members away from their violent lifestyles by helping them find jobs and redirect their energy into beneficial projects. Crime statistics in the area are decreasing accordingly.
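Papachristos's research rests on social network analysis of who gets arrested alongside whom: violence clusters in small, tightly connected networks. The idea can be sketched by building a co-arrest graph and ranking individuals by how many distinct people they were arrested with. The records below are invented; real analyses use far larger datasets and more sophisticated centrality measures.

```python
from collections import defaultdict

def rank_by_connections(co_arrests):
    """Build an undirected co-arrest graph and rank individuals by the
    number of distinct people they were arrested with (their degree)."""
    neighbours = defaultdict(set)
    for a, b in co_arrests:
        neighbours[a].add(b)
        neighbours[b].add(a)
    return sorted(neighbours, key=lambda p: len(neighbours[p]), reverse=True)

# Invented co-arrest records: person C appears alongside three others.
records = [("A", "B"), ("B", "C"), ("C", "D"), ("C", "E")]
print(rank_by_connections(records))  # C ranks first: linked to B, D and E
```

The most-connected individuals at the top of such a ranking are the ones a program like Chicago's would prioritise for outreach.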
What could possibly go wrong?
In Chief Anderton's case, the psychics (called 'precogs') foresaw that he would commit a murder in the near future. It turned out that powerful people were manipulating the system to produce results that were both untrue and served their own nefarious ends. Any system can be 'gamed' to skew its results. Is it possible that the predictive models being developed by today's law enforcement agencies and sociologists could likewise be used against innocent people?
Your Turn: Has Big Brother taken over our lives? Or do you feel safer knowing that the police are indeed finding bad guys before they can hurt you or someone you love? Could you be identified as a possible suspect based on your appearance or mannerisms?