Barrett and Greene
Friday, February 26, 2021
Algorithms can be little more than another tool to help make decisions.

A few months ago, we wrote a piece for Route Fifty about the use of algorithms in state and local governments. As we read through a number of articles about the issues surrounding the use of these tools, we discovered repeated instances in which the word “algorithm” was preceded by the words “black box.”

This was often the case when referring to the algorithms used in social media, as opposed to those used in government, but the idea that algorithms are secretive numeric tools that hide reality from the public is potent and needs to be addressed by the cities, counties and states that use them.

To be sure, using algorithms to make governmental decisions about human beings can be a chancy proposition, as biases can easily creep into the equations that govern such important matters as the allocation of COVID vaccines.

Algorithms can be little more than another tool to help make decisions. But the big problem is when they are perceived to be – or genuinely are – hidden from the public eye, like rabbits startlingly pulled out of a magician’s top hat. And, for better or worse, these sometimes-arcane mechanisms are here to stay. The key, as far as we can see, is for them to be used in a transparent fashion: to turn the black box into a fishbowl.

That’s just what New York City is beginning to do. It is the first city in America to publish a directory of the algorithmic tools used by its agencies. This is an important first step for the city, and the foundation of a planned, robust algorithm management framework.

The road to this directory began with an executive order from late 2019, which said, in part:

“Vast amounts of data of all types are increasingly created, collected, integrated, used, and shared in new ways made possible through the use of algorithms and other emerging technologies, and thus traditional governance frameworks must evolve and adapt to ensure that principles of fairness, transparency, human-centered design, and privacy protection remain central to government practices, recognizing both the benefits to be gained, as well as the potential risks of inadvertent harm to individuals and communities that may result from the use of such tools and systems absent new understandings and guidance.”

The executive order didn’t directly call for a directory of algorithms, but that seems like a clear first step toward any kind of transparent governance framework for them.

The directory is intended to be a resource that members of the public can go to, just as they do to the city’s popular Open Data platform, which lets people explore information about city functions, ranging from the names of the parks and rec centers that have closed due to COVID to a list of all candidates who passed the civil service examination, ranked by score.

The algorithms directory will be revised annually and gives users the ability to see “the name of the agency reporting the tool, the tool name and use date, and importantly, . . . narrative descriptions about the tool’s purpose and how it functions to aid the agency in making decisions.”
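In data terms, each directory entry is just a small structured record. Here is a minimal sketch in Python of what such an entry holds, with field names inferred from the description quoted above; the real directory’s schema may differ.

```python
# Minimal sketch of one directory entry. Field names are inferred from
# the description quoted above; the real directory's schema may differ.
from dataclasses import dataclass


@dataclass
class AlgorithmDirectoryEntry:
    agency: str            # name of the agency reporting the tool
    tool_name: str
    use_date: str          # when the agency began using the tool
    purpose: str           # narrative: what the tool is for
    how_it_functions: str  # narrative: how it aids the agency's decisions
```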

In a short four-month period, the city’s department of operations, under the leadership of a newly created position, the Algorithms Management and Policy Officer (AMPO), worked with city agencies to identify every instance in which they used algorithms. Nine agencies were identified, including the Department of Health and Mental Hygiene, the Fire Department, the Police Department, the Department of Social Services and the Administration for Children’s Services.

One of the algorithms identified, for example, is used in the city’s Division of Child Protection in the Administration for Children’s Services (ACS). The division has the capacity to review only about 3,000 of the roughly 56,000 investigations conducted each year, and the algorithm is used to “support the selection of cases for review.”
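The directory describes what the tool is for, not how it is built. Still, the general shape of a case-selection tool is easy to illustrate. The Python sketch below is purely hypothetical: the case fields, the weights and the simple weighted-sum scoring are our own illustrative assumptions, not a description of the actual ACS tool. It shows how such a tool might rank investigations and flag the top cases, up to the division’s review capacity.

```python
# Purely hypothetical sketch of a case-selection tool. The case fields,
# weights and scoring formula below are invented for illustration only;
# they do NOT describe the actual ACS algorithm.
import heapq
from dataclasses import dataclass


@dataclass
class Investigation:
    case_id: str
    prior_reports: int       # assumed field: earlier reports on the family
    open_allegations: int    # assumed field: allegations still open
    days_since_contact: int  # assumed field: days since last caseworker visit


def risk_score(inv: Investigation) -> float:
    """Weighted sum of case factors; the weights are illustrative only."""
    return (2.0 * inv.prior_reports
            + 1.5 * inv.open_allegations
            + 0.1 * inv.days_since_contact)


def select_for_review(cases: list[Investigation], capacity: int = 3000) -> list[Investigation]:
    """Return the highest-scoring cases, up to the division's review capacity."""
    return heapq.nlargest(capacity, cases, key=risk_score)


# Example: with capacity 1, only the higher-scoring case is flagged.
cases = [
    Investigation("A-101", prior_reports=3, open_allegations=2, days_since_contact=40),
    Investigation("A-102", prior_reports=0, open_allegations=1, days_since_contact=5),
]
print([c.case_id for c in select_for_review(cases, capacity=1)])  # ['A-101']
```

The point of a transparent directory, of course, is that the public no longer has to guess at sketches like this one; it can ask which factors and weights are actually in use.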

Will this directory drive potential biases out of the algorithms? No. But by clearly spelling out their use to the citizenry, public officials and advocacy groups, it opens them up to questioning, pushback and, where needed, pressure for complete disclosure of the assumptions built into the formulas.