More facial recognition technology reported in non-white areas of NYC: Amnesty International

The crowdsourced data mapped more than 25,500 CCTV cameras across New York City.

February 14, 2022, 7:01 PM

More CCTV cameras with face recognition capabilities were observed in New York City boroughs and neighborhoods with higher concentrations of non-white residents, according to new research by human rights group Amnesty International.

"Our analysis shows that the NYPD's use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City," Matt Mahmoudi, an artificial intelligence and human rights researcher at Amnesty International, said in a statement to ABC News.

"The shocking reach of facial recognition technology in the city leaves entire neighborhoods exposed to mass surveillance," he added. "The NYPD must now disclose exactly how this invasive technology is used."

In a conversation about face recognition technology, New York City Police Department Deputy Commissioner John Miller told ABC News that the victims of violent crime in the city are "overwhelmingly" people of color.

"They not only deserve but demand that police respond to reports of crime and apprehend those responsible," Miller said.

Amnesty International's findings are based on crowdsourced data obtained as part of the Decode Surveillance NYC project, which mapped more than 25,500 CCTV cameras across New York City. The data was gathered between April 14, 2021, and June 25, 2021.

The logo of the New York City Police Department (NYPD) is placed on a surveillance camera in New York, Sept. 12, 2019.
Picture Alliance/DPA via Getty Images

The project's goal was to find surveillance cameras in New York City and reveal where people are most likely to be tracked by face recognition technology (FRT). Amnesty International then worked with data scientists to compare this data with statistics on stop, question and frisk policies and demographic data.

Stop-and-frisk policies allow officers to stop, question and pat down anyone they deem suspicious.

The research found that residents of areas densely covered by CCTV cameras were at greater risk of stop-and-frisk encounters with police. Critics have called the tactic discriminatory: in 2019, 59% of those stopped under stop-and-frisk were Black and 29% were Latino, according to the New York Civil Liberties Union, citing NYPD data.

According to data gathered by the United States Census Bureau in July 2021, of those living in New York City, 24.3% were Black and 29.1% were Latino.

In a statement to ABC News, Miller said that stop and frisks "have been down over 90% for over eight years."

"Numerically, the much fewer stops that are still made are based on descriptions of people given by crime victims who are most often members of the community where the stop is made," he said.

Miller added that these kinds of stops contribute to the NYPD's current level of gun arrests -- "the highest levels in 25 years," he said -- which is critical because "homicides are up by half, and shootings have doubled."

However, activists worry that invasive surveillance and face recognition technology threaten individual privacy and disproportionately target and harm Black and brown communities. Mahmoudi called the prevalence of CCTV "a digital stop and frisk."

The NYPD used FRT in at least 22,000 cases between 2016 and 2019, according to Amnesty International, citing data that S.T.O.P., an anti-surveillance nonprofit, obtained from the NYPD through the city's Freedom of Information Law.

"I'm not surprised that the surveillance technology hits, again, the same communities that have already been the primary targets of police enforcement, or specifically NYPD enforcement," Daniel Schwarz, a privacy and technology strategist at the NYCLU, told ABC News.

"It's a highly invasive harmful technology. It presents an unprecedented threat to everyone's privacy and civil liberties," Schwarz said. "We've been calling for a ban on this technology, because we can't see how it can be safely used, given its great impact on civil rights and civil liberties."

The criticism comes as New York City Mayor Eric Adams has said he will expand the NYPD's use of technology, including FRT.

"We will also move forward on using the latest in technology to identify problems, follow up on leads and collect evidence — from facial recognition technology to new tools that can spot those carrying weapons, we will use every available method to keep our people safe," Adams said at a press briefing in January.

Adams' office did not respond to ABC News' request for comment.

A sign stands outside the Lambert Houses, a Phipps Houses Development low-income housing complex, in the Bronx borough of New York, Sept. 1, 2017.
Bloomberg via Getty Images

The NYPD has been using FRT since 2011 to identify suspects whose images "have been captured by cameras at robberies, burglaries, assaults, shootings, and other crimes," according to the NYPD's website. However, the department says that "a facial recognition match does not establish probable cause to arrest or obtain a search warrant, but serves as a lead for additional investigative steps."

Robert Boyce, retired chief of detectives at the NYPD, said the department has stringent guidelines for using face recognition technology. No one is allowed to use the technology without a case number and approval from a supervisor, he said.

"It's a high bar to be able to use it and that's the way it should be," Boyce, who retired in 2018, told ABC News. "We don't use it for anything other than a criminal investigation, and we wrote a very strict policy on this, because it was under scrutiny by a lot of people."

The quality of CCTV footage is often not good enough for police to use it for face recognition, Boyce said, based on his time with the department. More often, he said, police use social media accounts to find images of individuals they are looking into rather than conduct FRT searches.

Images from social media accounts are often of better quality and are therefore more useful in getting accurate results when using face recognition software, according to Boyce. Police use FRT as a pathway to help them find someone, but they still need a photo array or lineup to identify a subject for it to be admissible in court, he said.

"I can't tell you how important it is. Our closing rates have gone up significantly because we do this now," Boyce said of FRT. "I think it's a tremendous aid to us. But like anything else, it can be abused, and you have to stay on top of that.

"If I had to give it a number, I would say they went up something like 10%," Boyce said of the department's closing rates, which refer to the share of cases the department is able to solve.

Boyce argued that FRT should be adopted by more states and used more widely around the country with federal guidance on its usage.

A closed circuit security camera (CCTV) mounted on a street light in Times Square in New York.
Ramin Talaie/Corbis via Getty Images

According to the U.S. Government Accountability Office, 18 of 24 federal agencies surveyed reported using an FRT system in fiscal year 2020 for purposes including cybersecurity, domestic law enforcement and surveillance.

Along with the research, Amnesty International also created a new interactive website that details potential FRT exposure. Users can see how much of any walking route between two locations in New York City might involve face recognition surveillance.

Amnesty International claimed that there were higher levels of exposure to FRT during the Black Lives Matter protests in 2020.

"When we looked at routes that people would have walked to get to and from protests from nearby subway stations, we found nearly total surveillance coverage by publicly-owned CCTV cameras, mostly NYPD Argus cameras," Mahmoudi said.

"The use of mass surveillance technology at protest sites is being used to identify, track and harass people who are simply exercising their human rights," Mahmoudi said, calling it a "deliberate scare tactic."

He added, "Banning facial recognition for mass surveillance is a much-needed first step towards dismantling racist policing."

The NYPD responded, saying it had no control over where protestors walked.

"We did not choose the route that the demonstrators took. Nor could we control the route that the demonstrators took," Miller said in response to Amnesty International's claims.

"There was no scanning of demonstrations for facial recognition," Miller said.

"The facial recognition tools are not attached to those cameras," Miller said. "In the cases where facial recognition tools were used, it would be where there was an assault on a police officer or serious property damage, whether it was a viable image to run against mug shots."

The NYCLU has also called for a ban on government use of face recognition and other biometric surveillance on the public, Schwarz said.

"Any surveillance technology can have a chilling effect on how people engage and how they make use of their free speech rights. It's extremely frightening thinking about how protests can be surveilled," Schwarz said. "I think there should be clear guardrails on its use."

Miller, the NYPD deputy commissioner, said Amnesty International's research does not tell the full story of how FRT is used.

"Amnesty International has carefully cherry-picked selected data points and made claims that are at best out of context and at worst deliberately misleading. In the characterization of how the NYPD uses 'artificial intelligence,' the report has supplied only artificial information," Miller said to ABC News.

Last year, Amnesty International sued the NYPD after it refused to disclose public records regarding its acquisition of face recognition technology and other surveillance tools. The case is ongoing.

Editor's note: This article has been updated to reflect the name of the NYCLU.
