Over 80% of facial recognition suspects flagged by London's Met Police were innocent, report says

London police have been monitoring crowds with the technology since 2016.

July 4, 2019, 9:16 AM

LONDON -- A new independent report claims that 81% of suspects flagged by facial recognition technology used by London's Metropolitan Police were innocent.

The study was commissioned by Scotland Yard and conducted by researchers from the University of Essex.

Live Facial Recognition (LFR) has been used by the police in various trials to monitor crowds since 2016. This is the first independent report into the use of the technology, and it identifies "significant operational shortcomings in the trials which could affect the viability of any future use of LFR technology."

The authors of the report, Peter Fussey and Daragh Murray, were given access to six of the 10 trials that took place between June 2018 and February 2019. Only eight of the 42 matches that the LFR technology made were deemed correct with "absolute confidence," according to the report.

Fussey and Murray also found that the criteria for registering people on an LFR "watchlist" were not clearly defined, which raises significant concerns for privacy law and the protection of human rights.

"A key issue with facial recognition going forward is that human rights compliance be built in from the outset, not treated as an add on," Murray told ABC News. "There is an obligation on police forces to ensure that rights are protected in the pursuit of public order. The two can go hand-in-hand, but this requires specific measures to assess the human rights impacts of new technology and the necessity of using such technology. The end point should not be technological development as an end of itself but ensuring that technological development serves society."

PHOTO: Armed British Metropolitan Police officers carry their guns as they patrol on Whitehall in central London on May 23, 2019. (Isabel Infantes/AFP/Getty Images, FILE)

Fussey said that the findings of the study show the "need for meaningful leadership on these issues at a national level."

"The report demonstrates a need to reform how certain issues regarding the trialing or incorporation of new technology and policing practices are approached and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision making processes," he said in a statement.

The authors concluded that it is "highly likely" that LFR would be ruled unlawful if it were challenged in court, and are now calling for all live trials of LFR to stop until their concerns are addressed.

Although the Metropolitan Police chose not to exercise a right of reply before the report was published, Deputy Assistant Commissioner Duncan Ball has since said that the Metropolitan Police are "extremely disappointed with the negative and unbalanced tone of this report."

“It is important to note that while this technology can be used to assist us, it does not by any means replace the role of a police officer," he said in a statement to ABC News. "Ultimately the decision to stop an individual is not made by the technology but as a result of officers making a decision on whether they believe the individual on the street matches the wanted person. The final decision to engage with an individual flagged by the technology was always made by a human."