New York should regulate law enforcement use of facial recognition technology

Warning sign indicating a security camera used by the New York Police Department. (ymgerman/Shutterstock)


The risk for unfairly targeting minorities or stifling free speech is high.
January 7, 2019

In 2012, the NYPD implemented a vague, benign-sounding program called the Domain Awareness System. In partnership with Microsoft, the NYPD set up thousands of public-facing surveillance cameras, many owned by private businesses, in Lower Manhattan. The police department was able to do something it had never done before: access these cameras and link the feeds with software to cross-check criminal and terrorist databases, scan license plates and measure radiation levels, 24/7. “We’re not your mom-and-pop’s Police Department anymore,” Michael Bloomberg, the mayor at the time, bragged. “We are in the next century. We are leading the pack.”

As part of the deal, when Microsoft turned around to sell the surveillance tech to other cities, New York would get a cut. “I hope Microsoft sells a lot of copies of this system,” Bloomberg said.

More than six years later, surveillance technology has flourished. The next frontier, which has rapidly evolved since the birth of the Domain Awareness System, is facial recognition technology – a reality that brings great promise and even greater peril. Law enforcement has more power than ever to both solve crimes and wrongfully ensnare the most disadvantaged.

Like other law enforcement agencies, the NYPD collects unfathomable amounts of information every day. Photos and images are constantly logged into searchable databases. If a person is suspected of a crime, computerized algorithms can match an image of the suspect’s face against driver’s license and ID photos, or any other images logged in the database. Crimes can be solved faster than ever before.

Very little is actually known about the NYPD’s surveillance and facial recognition systems. The police department has long resisted efforts to make public exactly how it uses the information it gathers.

The pitfalls are dizzying. Since people of color are statistically more likely to be targeted by police and end up in contact with the criminal justice system, a mug shot of a black suspect held in a database – a suspect, perhaps, who was never found guilty of anything – can be scanned by facial recognition software and used to ensnare another black suspect whom the software believes to be the same person.

What if they turn out to be different people? Facial recognition technology has a tendency to misidentify darker-skinned faces, according to Gender Shades, an MIT project that studied racial and gender biases in algorithms.

Just as ominously, no state, including New York, has passed a law comprehensively regulating police face recognition. No warrant is required to search face databases, and no reasonable suspicion is required before conducting a search.

The risk for stifling free speech is high. Thanks to the prevalence of cameras and this new technology, police departments can use face recognition to track individuals engaging in political, religious or other protected speech.

The Center on Privacy and Technology at Georgetown Law School extensively studied facial recognition technology several years ago, coming up with a number of recommendations that should be heeded in New York. For starters, the state Legislature should adopt Georgetown’s model bill for regulating police face recognition.

The legislation would, among other things, require operators of arrest photo databases to regularly purge images of people who are not later convicted. Restraining this sort of technology would be a worthy goal for an emboldened, progressive Democratic state Senate.

American law enforcement has always followed a basic standard: The police cannot search anyone they please. Before law enforcement officials infringe on an individual’s liberty, they generally must have an individualized suspicion that the suspect is engaged in criminal conduct.

At a minimum, New York lawmakers should require that face recognition searches be conditioned on an officer’s reasonable suspicion that a person is engaged in criminal conduct. Face recognition allows law enforcement to identify a suspect without talking to him or her. A reasonable suspicion standard should apply to searches that run on mug shot databases.

We are on the cusp of a dangerous new world. As technology rapidly advances, it’s up to our government to protect us from its worst excesses.

Ross Barkan is a writer, journalist, and former state Senate candidate.