The fight to make New York City's complex algorithmic math public

Photo: William Alatriste for the New York City Council

In New York City, government bureaucrats use algorithms to help make decisions on where students are assigned to school, whether a suspect is allowed out of jail and which buildings should be targeted by inspectors.

But amid the growing reliance on technology and complicated formulas to deploy resources and make critical decisions, criminal justice advocates and watchdog groups are seeking more transparency and input into how these tools are used.

Legislation recently introduced in the New York City Council, which may be the first of its kind in the country, would peek under the hood of tools being deployed across the city and help legislators and citizens shed light on any biases in the computations. The bill’s sponsor, City Councilman James Vacca, said his legislation, which would require agencies to publish the source code of the algorithms they use for targeting and penalizing individuals, has “touched a nerve” and started a conversation about the use of such tools.

“I want people to know how data is analyzed and how data is utilized,” he told New York Nonprofit Media. “The governance of data is going to be increasingly important to our society going forth. Yet, very few places in our country are discussing algorithms, and they’re not discussing the collection of data and then how it’s used.”

In the 1970s, the New York City-RAND Institute developed a formula for the New York City Fire Department that officials reportedly used to justify closing fire stations in the Bronx, just as a wave of arson swept across the borough. In the 1980s, Vacca, then a district manager in the borough, also had requests for more police officers stymied because of formulas, experiences he said helped inspire the bill. “To this day, I don’t know what is the formula the police use to determine how many officers are in every station house,” he said.

Vacca’s bill would apply to every city agency, and it would mandate that the agencies post the algorithms on their websites. Criminal justice advocates have been some of its loudest backers, both because of the direct effect the legislation would have on their clients and because the biases of some algorithms have already been well-publicized.

These formulas, or “risk assessment instruments,” weigh a number of factors, such as previous arrests, to determine whether someone arrested for a particular crime is likely to return for court dates or should get bail under certain conditions.
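To make the mechanics concrete, the sketch below shows the general shape of a point-based instrument in Python. The factors, weights and cutoffs are hypothetical illustrations, not those of any tool actually used in New York courts.

```python
# A minimal sketch of a point-based risk assessment instrument.
# All factors, weights and cutoffs here are hypothetical examples;
# they are not taken from any instrument used in New York City courts.

WEIGHTS = {
    "prior_arrests": 2,             # points per prior arrest (capped at 5)
    "prior_failures_to_appear": 3,  # points per prior missed court date
    "open_cases": 2,                # points per other pending case
}

def risk_score(record: dict) -> int:
    """Sum weighted factors into a single numeric score."""
    return (
        WEIGHTS["prior_arrests"] * min(record.get("prior_arrests", 0), 5)
        + WEIGHTS["prior_failures_to_appear"] * record.get("prior_failures_to_appear", 0)
        + WEIGHTS["open_cases"] * record.get("open_cases", 0)
    )

def risk_band(score: int) -> str:
    """Map the raw score to the category a decision-maker would see."""
    if score <= 2:
        return "low"
    if score <= 6:
        return "moderate"
    return "high"

example = {"prior_arrests": 1, "prior_failures_to_appear": 0, "open_cases": 1}
print(risk_band(risk_score(example)))  # -> "moderate" (score of 4)
```

Even in this toy form, the critics’ point below is visible: every input to the score is itself shaped by prior policing and societal factors.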

Proponents say the tools can keep the public safe. A working paper released in February by the National Bureau of Economic Research examined simulations of the tools using arrest data from New York City between 2008 and 2013 and found crime could be reduced by nearly a quarter with no change in jailing rates, and the number of people detained in jails could be reduced by 42 percent with no increase in the crime rate.

But critics warn that these technologies have built-in flawed assumptions that can lead to continued bias against affected communities.

In an example highlighted by ProPublica last year, an algorithm wrongly considered a black woman who briefly took a bike from a neighbor’s yard to be more likely to commit a crime in the future than a white man arrested for shoplifting who had a lengthy criminal record. The judge set a higher bond for the black woman, but did not recall whether her “risk score” affected his decision.

Rashida Richardson, the legislative counsel at the New York Civil Liberties Union, said she’s concerned about how the algorithms are used. Even if the formulas are written in a way to reduce any bias, the end users – the civil servants, for example – might not fully understand how to wield the results, or might rely too heavily on them. “It’s possible with those systems that the person making the decision will just rely on what the system shoots out, rather than using any human judgement,” Richardson said.

It’s a delicate balance. Richardson acknowledged that there could be benefits to using the algorithms, but there’s a need for legislation that raises the level of data transparency across agencies without violating individual rights and while retaining each tool’s effectiveness.

“Is there a one-size-fits-all regulation model that can be developed for all of these different agency uses?” Richardson asked. “Or is it going to have to be agency-specific, where NYPD has one form of regulatory oversight and then DOE will have another, because of not only the population it’s serving, but how it’s being used.”

Richardson also said New York largely lags behind other progressive-leaning states when it comes to releasing data. “A lot of the times, I feel like we’re chasing behind California or Massachusetts, or other fairly progressive states,” she said.

Attorneys with Brooklyn Defender Services, a public defender organization representing nearly 40,000 people each year, said that possible indicators of flight risk – such as homelessness, employment status, school enrollment, previous convictions or imprisonment – can be discriminatory because of the societal factors that lead to those indicators.

Currently, judges are only able to consider whether someone is at risk of fleeing when setting bail conditions. But the city is supportive of allowing judges to also consider public safety.

Yung-Mi Lee, a supervising attorney specializing in criminal defense at BDS, said it was important to alert people to this technology and how it is used before it expands to the point where people are singled out even before they commit a crime. “That’s also the inherent danger of risk assessment instruments: That it will allow for the detention of people that have not even committed that future crime yet,” she said.

Scott Hechinger, the senior staff attorney and director of policy at Brooklyn Defender Services, said, “Any time there’s a criminal justice reform conversation, public defenders and clients – the people affected by the practice – should be at the table. And unfortunately we’re not called upon enough, our voices are not listened to enough.”

Getting updated risk assessment tools into the hands of judges is the first strategy listed in New York City Mayor Bill de Blasio’s plan to close the Rikers Island jail complex and reduce the city’s jail population.

Elizabeth Glazer, director of the Mayor’s Office of Criminal Justice, said researchers are helping redesign the risk assessment tools for judges, with the goal of showing advocates and citizens the processes that go into the next generation of tools. “The big change for us is really just to make the process as open as possible, to ultimately post the data that underlies the tool publicly so that people can see for themselves how the tools operate,” she said. (Some of that data is already available to the public.)

Glazer’s office is seeking a partnership with ideas42, a nonprofit behavioral economics firm, to help redesign the “failure to appear” risk assessment tool to add clarity to how judges see and weigh the results in order to ensure they aren’t solely relying upon the tools. “Judges are humans. They’re not machines. And you truly don’t want simply this algorithm to rule a judgement,” Glazer said. “A judge can see all kinds of things and an algorithm can’t.”

Formulas and technologies have long shaped how police are deployed in the city, most notably with the introduction of CompStat in the 1990s, which helped police predict where incidents were likely to occur and is credited with helping to drive down the astronomical crime levels of that era. Today, predictive policing technology is based on historical data and dozens of data points, making the assessments that much more granular.

Another City Council bill, backed by Council members Dan Garodnick and Vanessa Gibson, seeks to make public more information about the use, capabilities, guidelines and training surrounding the NYPD’s surveillance technology.

Academics and civil liberties groups have also been seeking information on how predictive policing tools are used, arguing that they create feedback loops that send more police to areas that already have a high concentration of police, based on the increased number of infractions and crimes that police observe. Critics also argue there is little evidence that such efforts reduce crime; a study on an algorithm-based crime prevention program in Chicago found it didn’t save any lives.
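That feedback-loop critique can be illustrated with a toy simulation: if crimes are recorded only where officers are present to observe them, and patrols are then reallocated toward precincts with more recorded crime, an early imbalance in the records compounds even when the underlying crime rates are identical. The sketch below is a simplified illustration of the critics’ argument, not a model of any deployed system; all numbers are invented.

```python
import random

random.seed(0)

TRUE_CRIME_RATE = 0.3           # identical underlying rate in both precincts
patrols = {"A": 10, "B": 10}    # patrols start out evenly split
recorded = {"A": 1, "B": 2}     # precinct B starts with one extra record

for day in range(100):
    # Crimes are only recorded where officers are present to observe them.
    for precinct, n_patrols in patrols.items():
        recorded[precinct] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(n_patrols)
        )
    # Reallocate a fixed pool of 20 patrols in proportion to recorded crime.
    total = recorded["A"] + recorded["B"]
    patrols["A"] = round(20 * recorded["A"] / total)
    patrols["B"] = 20 - patrols["A"]

print(patrols)
# Despite identical true crime rates, the precinct with more early records
# tends to draw an ever-larger share of patrols over time.
```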

The Brennan Center for Justice has been fighting to learn more about these technologies. A legal claim against the NYPD is pending after the department withheld some information about a $2.5 million software contract with Palantir, which provides, among other services, predictive policing tools.

The AI Now Institute, a collection of researchers at New York University, issued a report earlier this year that encouraged governments to eschew “black box” tools in favor of openness, test appropriately for any bias and encourage staff with diverse backgrounds and from various specialties to help develop and test the algorithms.

For its part, the city does share a lot of information. The city’s open data portal makes a tremendous amount of data generated by city agencies available to the public. The Mayor’s Office of Data Analytics even put online some of the analytics tools it used to identify which cooling towers to inspect after a 2015 outbreak of Legionnaires’ disease.

During an October New York City Council hearing on the measure, city officials said publishing the source code that companies use to generate these algorithms could allow people to hack the systems, and would have a chilling effect on technology vendors looking to do business with the city. Some developers have released open-source code for predictive policing tools, but many vendors worry that sharing their code could erode their competitive advantage.

Vacca has suggested that the city appoint an expert who can gauge the openness and fairness of city agencies’ use of formulas. It isn’t clear yet whether Vacca’s broad 150-word legislative proposal will ultimately strike the balance between revealing the factors behind these formulas and retaining the confidence of the city’s tech staffers and external partners.

Vacca said he is taking a look at the officials’ comments and will work to incorporate those concerns into the legislation, particularly because it could influence how similar rules are formed in other jurisdictions. “I realize that because this bill is tackling things that have not been tackled throughout the nation, I realize that we have to be very deliberate,” he said.

As Noel Hidalgo of BetaNYC, a technology civic organization, said during the hearing on Vacca’s bill, “If we refuse to hold algorithms and their authors accountable, we outsource our government to the unknown.”
