An algorithm is a set of rules that helps us succeed in doing something. Yet while we might think algorithms are objective and scientific, they are actually opinions embedded in code. Cathy O’Neil is a Harvard and MIT-educated American mathematician who worked in finance and as a data scientist for almost a decade. In 2016 her book Weapons of Math Destruction was long-listed for the National Book Award for non-fiction: an extraordinary feat, she admits, for a book about algorithms.
BAD DATA
At its heart, however, O’Neil’s bestselling book is about social justice. She wrote the book in response to an increasing concern that societies are turning over some of their most important decisions to algorithms based on obsolete, unethical assumptions. This process goes unchecked, she says, as algorithms seem complicated and intimidate people.
DEFINE SUCCESS
Algorithms are easier to understand than we think, says O’Neil; we actually use them all the time. Say we are making a meal. If we were creating an algorithm for that, the ‘data’ would be a combination of the ingredients, the time we have to make the meal, and our objective; that is, what makes the meal successful for us. This is key, she says: for a parent, a meal is successful if it is healthy and nutritious, but for kids it is ‘successful’ if it is tasty, colourful or appears in an advert during the break of their favourite TV show! The first rule of algorithms, says O’Neil, is that they depend on opinion.
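To make O’Neil’s point concrete, here is a minimal sketch of that ‘dinner algorithm’ in Python. The meals, scores and time limit are invented for illustration; the only thing the sketch aims to show is that the same data produces different ‘best’ answers depending on whose definition of success is plugged in.

```python
# A toy "dinner algorithm": the data (ingredients, time available) stays the same,
# but the outcome depends entirely on how success is defined.
# All names and scores below are invented for illustration.

MEALS = [
    {"name": "vegetable stir-fry", "minutes": 25, "nutrition": 9, "kid_appeal": 4},
    {"name": "frozen pizza",       "minutes": 15, "nutrition": 3, "kid_appeal": 9},
    {"name": "lentil soup",        "minutes": 40, "nutrition": 8, "kid_appeal": 3},
]

def choose_meal(meals, time_available, success):
    """Pick the meal that best matches a given definition of success."""
    feasible = [m for m in meals if m["minutes"] <= time_available]
    return max(feasible, key=success) if feasible else None

# The parent's opinion of success: the healthiest meal there is time for.
parent_success = lambda m: m["nutrition"]
# The child's opinion of success: the tastiest, most appealing meal.
child_success = lambda m: m["kid_appeal"]

print(choose_meal(MEALS, 30, parent_success)["name"])  # vegetable stir-fry
print(choose_meal(MEALS, 30, child_success)["name"])   # frozen pizza
```

Nothing in the code is ‘objective’: the choice of success function is exactly the embedded opinion O’Neil describes.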
THE BIG ISSUES
These days algorithms are being used to solve society’s most fundamental questions: from how to find a partner to how to match the right people with the right jobs; from how to improve education to how to predict, prepare for or even prevent the spread of an illness. Although algorithms are often created with good intentions, sometimes they are not. It was O’Neil’s experience working in the financial sector, where algorithms were being used as an exploitative way to make easy money, that prompted her to leave her job, disillusioned, and dedicate herself to social justice instead.
ALGORITHMS AS SOCIAL CONTROL
In her bestselling book Weapons of Math Destruction, the mathematician Cathy O’Neil shows how society has become reliant on algorithms to solve its most prominent issues. But while we assume they provide us with a more objective means of finding a job or getting a mortgage, she says, we are putting our trust in formulas that most of us cannot understand; sometimes not even an expert like O’Neil can understand them. In a talk about the book, O’Neil began by explaining how mathematical notation intimidates non-mathematicians.
Cathy O’Neil (American accent): Mathematicians use notation as shorthand for much more complicated things that they’d have to write out with words. But when non-mathematicians see notation, they get scared. They’re intimidated. They feel like there’s some kind of authority there, there’s some kind of objectivity, some scientific truth that they’re not allowed to question because they’re not experts. So that ‘authority of the inscrutable’ is translated as well to algorithms.
WHEN ALGORITHMS ARE WEAPONS
Although some algorithms may be well-intentioned, many are not helping people at all, says O’Neil. These she calls ‘weapons of math destruction’. She lists the criteria:
Cathy O’Neil: They’re secret, they’re opaque, people who are targeted by these algorithms don’t understand how they work. Usually the people who are targeted do not agree with the definition of success. These are algorithms that are used in all sorts of places and all sorts of industries. But they’re used as a form of social control.
EDUCATION
One example is the value-added model for teachers, an algorithm aimed at saving money and improving education in the US. As of 2010, school districts across the country had adopted the system, including Chicago Public Schools and the New York City Department of Education. O’Neil lists some of the effects.
Cathy O’Neil: It’s a widespread current education reform algorithm. It’s supposed to hold teachers accountable for good teaching. People in DC got fired based on bad value-added model scores. The Chicago teachers’ strike was largely an argument over how much value-added teachers’ scores could be used in assessing teachers…
TOO ABSTRACT
In October 2019, the Chicago teachers’ strike paralysed the nation’s third-largest school district for nearly a fortnight. Elsewhere, the algorithm’s ranking system had a major impact on teachers and countless schoolchildren. Naturally, says O’Neil, those affected by the algorithm wanted to know what was in it.
Cathy O’Neil: My friend who runs a high school in New York wanted to understand this, it’s [she’s] a math and science high school [teacher] so she thought she’d be able to understand it. She asked her department of education to send her information about it [and] they said, “Oh, you wouldn’t want to know about it, it’s math.” She persisted, finally got a white paper and showed it to me. It was unreadable. Too abstract to be useful.
FIND THE SOURCE!
O’Neil went further, looking for where the algorithm came from. But she found herself blocked at every turn.
Cathy O’Neil: So, I filed a Freedom of Information Act request to get the source code… which was denied. Nobody in the Department of Education in New York City understands that model. No teacher gets to understand their score, nor can they improve their score because it’s [they’re] not told how. We’re talking about accountability for teachers but there’s no accountability for the model.
IN THE COURTS
The use of algorithms goes much further. Judges in nearly half of US states employ them to predict whether an offender might commit more crimes and to assess their flight risk. This data is used to guide judges on sentencing, on bail, and on whether to grant or deny parole. So what sort of data goes into these algorithms? O’Neil was shocked by what she found.
Cathy O’Neil: Whether you finished high school, whether you have a job, or whether you will have a job upon leaving prison, and even whether your father was in jail… this is stuff that wouldn’t be considered acceptable if a lawyer brought it to the judge. It’s unconstitutional. But again, because it’s cloaked behind mathematical obscurity, it lacks accountability.
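To see how such inputs could turn into a single number put before a judge, here is a purely hypothetical sketch of a recidivism-style risk score. The weights and the cut-off are invented for illustration and are not taken from any real tool O’Neil examined; only the input categories come from her description above.

```python
# Hypothetical sketch of a recidivism-style risk score.
# The weights and the threshold are invented for illustration only;
# the input categories are the ones O'Neil lists in her talk.

def risk_score(defendant):
    """Combine socio-economic proxies into a single 'risk' number."""
    score = 0
    if not defendant["finished_high_school"]:
        score += 2
    if not defendant["has_job"]:
        score += 2
    if not defendant["job_on_release"]:
        score += 1
    if defendant["father_was_in_jail"]:
        score += 3
    return score

defendant = {
    "finished_high_school": False,
    "has_job": False,
    "job_on_release": False,
    "father_was_in_jail": True,
}

# The judge sees only the label, not the opinions baked into the weights.
label = "high risk" if risk_score(defendant) >= 5 else "low risk"
print(label)  # high risk
```

Notice that none of these inputs describes the defendant’s own conduct in court; that is precisely why, O’Neil argues, information a lawyer could never present openly becomes acceptable once it is hidden inside a score.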
COUNTERPRODUCTIVE
Embedding pre-existing bias in secret code increases inequality, argues O’Neil.
Cathy O’Neil: Models are embedded opinions. They’re embedded historical practices. They threaten democracy because part of living in a democracy is understanding the rules. So unless we specifically make sure that models do not unfairly punish poor people or black people, we will end up with models that do.