The Right Way to Regulate Algorithms
by Stephen Goldsmith and Chris Bousquet
They’re intended to make decision-making more objective. But data-based tools will have the opposite effect if they aren’t subject to public scrutiny.
Which public school will your child attend? How severe a sentence will you receive in the criminal justice system? Will you earn tenure as a teacher? In many cities, a new force is playing a critical role in answering these questions: algorithms.
Cities rely on algorithms to help make decisions that affect people’s lives in meaningful ways, from assessing teacher performance to predicting the likelihood of criminal re-offense. And yet, the general public knows almost nothing about how they work.
Take a recent example in New York City: The police department began using algorithms to help decide where to deploy officers across the city. In 2015, the New York Police Department conducted a 45-day test of software company Azavea's HunchLab platform, which considers a variety of factors, from historic crime to proximity to bars and bus stops, to determine where crime is most likely to happen. In the following years, the NYPD pursued similar tests with a number of other companies, and while the department did not deploy any one of these tools, it drew on insights from these trials to design its own predictive policing platform.