"Minority Report" just got a step closer to becoming reality in Britain.
A new investigative tool was tested this week by London’s Metropolitan Police to assess how likely gang members are to commit crimes in the future.
Developed by technology firm Accenture, the software merges data from various crime reports with criminal intelligence systems and applies predictive analytics to generate risk scores.
Predictive analytics is a statistical technique that uses data from the past to forecast the future; a more familiar example is credit scoring.
Ger Daly, senior managing director for Accenture’s Defense and Public Safety, told ABC News that the focus of the program is “to look at a select groups [and] gangs,” rather than specific individuals.
“We used data from 2009 to 2012 to predict what would happen in 2013. We will compare the results to real figures and tweak the algorithm accordingly,” Daly added.
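The back-testing Daly describes, scoring groups from historical data, predicting a held-out year and comparing against real figures, can be sketched in miniature. This is purely illustrative: the gang labels, incident counts, scoring formula, and weight below are all made up, not Accenture’s actual model or the Met’s data.

```python
# Toy back-test of a risk score: train on 2009-2012, predict 2013,
# then compare against observed figures. All numbers are hypothetical.

# Historical incident counts per group, 2009-2012 (made-up data).
history = {
    "group_a": [12, 15, 14, 18],
    "group_b": [3, 4, 2, 5],
    "group_c": [8, 7, 9, 6],
}

# Observed 2013 counts used to validate the prediction (also made up).
actual_2013 = {"group_a": 17, "group_b": 4, "group_c": 7}

def risk_score(counts, trend_weight=0.5):
    """Score = historical average, nudged by the latest year-on-year trend."""
    avg = sum(counts) / len(counts)
    trend = counts[-1] - counts[-2]
    return avg + trend_weight * trend

predicted = {g: risk_score(c) for g, c in history.items()}

# Mean absolute error against the real 2013 figures -- the kind of
# comparison used to decide how to "tweak the algorithm".
mae = sum(abs(predicted[g] - actual_2013[g]) for g in history) / len(history)
print(predicted, round(mae, 2))
```

If the error is too high, the `trend_weight` parameter (an assumption of this sketch) would be adjusted and the back-test rerun, which is the tuning loop Daly alludes to.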
The software can be applied to other crimes by simply changing the algorithm.
“We could look into burglary or domestic violence for example,” said Daly.
The company has developed similar projects in other countries. In Singapore, for example, Accenture worked on a CCTV tool that detects patterns and events such as overcrowding or flooding.
British police are now evaluating Accenture’s program findings.
“Our objective is to stop gang shootings in London and bring those responsible for crime to justice,” said Sarah Samee, a spokeswoman for the Metropolitan Police Specialist Crime and Operations.
In 2012, gangs were responsible for approximately 22 percent of serious violence, 17 percent of robberies, 50 percent of shooting incidents and 14 percent of rapes in London, according to Metropolitan Police figures.
“We’re always keen to use technology, but it’s too early to say whether this software will help us in our broader strategy,” Samee added.
The police’s Digital Policing team and Trident Gang Crime Command will likely decide in the coming weeks whether to adopt the software.
Several civil liberties advocates, however, are worried about potential privacy infringements.
One of them is Daniel Nesbitt, research director at Big Brother Watch.
“Police should be careful not to target or stigmatize people unfairly. It could make it harder for them to connect with those they are trying to catch,” he said.
While police have made more efforts to engage with the public on their tactics, Big Brother Watch believes they need to be transparent about the technology and information used.
In the U.K., the Police and Criminal Evidence Act of 1984 was created to strike the right balance between police powers and the rights and freedoms of the public. An Accenture spokesman told ABC News he believed it was the government’s role to draw a line and insisted the software “operates within the rules of the law.”
“Ultimately, there’s the police on one side, the politicians on one side, and the people being protected on the other,” said Daly. “A dialogue between them is needed.”