Criminal Justice

Is more surveillance tech coming for the NYPD?

Mayor Eric Adams’ mentions of facial recognition and gun detection technology drew swift pushback from civil liberties advocates.

Eric Adams visits 1 Police Plaza on Jan. 14. Ed Reed/Mayoral Photography Office

In the summer of 2020, the New York City Council passed a bill requiring the police department to disclose details about the surveillance technology it uses, in a bid for more transparency about how the department deploys controversial tools such as facial recognition, drones and so-called “digital stop-and-frisk.” The passage of the Public Oversight of Surveillance Technology (POST) Act was viewed as an incremental but significant step toward creating oversight of how law enforcement uses these tools, which, in the case of facial recognition, have been found to be prone to error, especially when used to identify people of color.

Now, some privacy advocates and progressive lawmakers are warning that New York City is at risk of taking a step in the wrong direction. On Monday, Mayor Eric Adams unveiled a multifaceted approach to tackling gun violence, including the creation of a modified plainclothes unit and calling on the state to revisit bail reform. Included in Adams’ plan is a proposal to explore further uses of technology, such as facial recognition and gun detection tools. Adams’ “Blueprint to End Gun Violence” doesn’t go into much detail on this topic but suggests exploring “the responsible use of new technologies and software to identify dangerous individuals and those carrying weapons.”

In remarks at a press conference on Monday, Adams elaborated a bit. “There's some amazing technology out there,” he said during a question-and-answer period. “Someone can take a picture of your face and in eight seconds, see everything that you have in public view – not private view,” he said, in an apparent reference to limiting the technology’s reach. “If you're on Facebook, Instagram, Twitter … they can see and identify who you are without violating the rights of people, only looking at public data.” He went on to mention gun detection technology too, which generally uses artificial intelligence to visually identify when a person is carrying a gun. The NYPD already uses ShotSpotter, a separate kind of tool that detects the sound of gunshots but has had problems distinguishing gunshots from fireworks.

Adams said on Monday that he aimed to prioritize “responsible” uses of these kinds of technology. “It’s supposed to be used for investigatory purposes, and that's what we're going to use it for,” he said. “We're looking at all of this technology out there to make sure that we can be responsible within our laws, we're not going to do anything that's going to go in contrast to our laws. But we're going to use this technology to make people safe.”

Some argue that there isn’t a way to use facial recognition technology safely because of the biases it carries. “This is a technology that's biased against Black and brown communities. It's a technology that's error prone,” Albert Fox Cahn, executive director of the nonprofit Surveillance Technology Oversight Project, told City & State of facial recognition systems. “I'm deeply disappointed that despite all his years on the force, Eric Adams seems to be falling for the same sales gimmicks that Silicon Valley has used to win over other city leaders.” Progressive Democratic City Council Member Tiffany Cabán also criticized Adams’ embrace of the technology, saying in a statement that she strongly opposed “expanding the use of facial recognition technology.”

Multiple studies have documented such biases; a report by the National Institute of Standards and Technology evaluating 189 facial recognition systems found that Black and Asian people were up to 100 times more likely to be misidentified than white people. Several municipalities across the country have banned law enforcement’s use of facial recognition technology, and local elected officials, including Public Advocate Jumaane Williams, have called for New York City to do the same. The NYPD already maintains a facial recognition database and has experimented with the facial recognition app Clearview AI. The department says that no arrests have ever been made based solely on facial recognition searches.

In an op-ed on Tuesday in the Daily News, Cahn argued that gun detection software is unproven too. 

If the NYPD does expand its use of these tools or deploys new gun detection technology, the department would be required under the POST Act to disclose impact and use policies for them. Cahn said that he hopes Council Member Gale Brewer – the new chair of the oversight and investigations committee – will use her position to ensure full compliance with the POST Act. “I think that an oversight hearing from the council on the POST could be incredibly powerful to gain more information about not just what systems are on the horizon, but how systems are already used,” he said.