
Review: Weapons of Math Destruction

In an important new book, Cathy O'Neil warns us that algorithms can and do perpetuate inequality




“The technology already exists. It’s only the will we’re lacking.” These sentences from Cathy O’Neil’s new book Weapons of Math Destruction have been haunting me since I read it. They come from the last chapter of a book in which she has illustrated again and again how, in the words of her subtitle, “big data increases inequality and threatens democracy.” With Facebook’s new trending topics algorithm and data-driven policing in the news, the book is certainly timely.

Weapons of math destruction, which O’Neil refers to throughout the book as WMDs, are mathematical models or algorithms that claim to quantify important traits (teacher quality, recidivism risk, creditworthiness) but that have harmful outcomes and often reinforce inequality, keeping the poor poor and the rich rich. They have three things in common: opacity, scale, and damage. They are often proprietary or otherwise shielded from prying eyes, so they function as black boxes. They affect large numbers of people, which increases the chances that they will get it wrong for some of them. And they damage people’s lives, perhaps by encoding racism or other biases into an algorithm, by enabling predatory companies to advertise selectively to vulnerable people, or even by helping to cause a global financial crisis.

O’Neil is an ideal person to write this book. She is an academic mathematician turned Wall Street quant turned data scientist who has been involved in Occupy Wall Street and recently started an algorithmic auditing company. She is one of the strongest voices speaking out for limiting the ways we allow algorithms to influence our lives and against the notion that an algorithm, because it is implemented by an unemotional machine, cannot perpetuate bias or injustice.




Many people think of Wall Street and hedge funds when they think of big data and algorithms making decisions. As books such as The Big Short and All the Devils Are Here grimly chronicle, subprime mortgages are a perfect example of a WMD. Most of the people buying, selling, and even rating them had no idea how risky they were, and the economy is still reeling from their effects.

O’Neil talks about financial WMDs and her experiences on Wall Street, but the examples in her book come from many other facets of life as well: college rankings, employment application screeners, policing and sentencing algorithms, workplace wellness programs, and the many inappropriate ways credit scores reward the rich and punish the poor. As an example of the latter, she shares the galling statistic that “in Florida, adults with clean driving records and poor credit scores paid an average of $1,552 more than the same drivers with excellent credit and a drunk driving conviction.” (Emphasis hers.)

She shares stories of people who have been deemed unworthy in some way by an algorithm. There’s the highly regarded teacher who is fired because of a low score on a teacher assessment tool, the college student who couldn’t get a minimum-wage job at a grocery store because of his answers on a personality test, and the people whose credit card spending limits were lowered because they shopped at certain stores. To add insult to injury, the algorithms that judge them are completely opaque and unassailable. People often have no recourse when an algorithm makes a mistake.

Many WMDs create feedback loops that perpetuate injustice. Recidivism models and predictive policing algorithms—programs that send officers to patrol certain locations based on crime data—are rife with the potential for harmful feedback loops. For example, a recidivism model may ask about a person’s first encounter with law enforcement. Because of racist policing practices such as stop and frisk, black people are likely to have that first encounter earlier than white people do. If the model takes this measure into account, it will probably deem a black person more likely to reoffend than a white person with an otherwise identical record. But these models are harmful even beyond their potential to encode racism. O’Neil writes,

A person who scores as ‘high risk’ is likely to be unemployed and to come from a neighborhood where many of his friends and family have had run-ins with the law. Thanks in part to the resulting high score on the evaluation, he gets a longer sentence, locking him away for more years in a prison where he’s surrounded by fellow criminals—which raises the likelihood that he’ll return to prison. He is finally released into the same poor neighborhood, this time with a criminal record, which makes it that much harder to find a job. If he commits another crime, the recidivism model can claim another success. But in fact the model itself contributes to a toxic cycle and helps to sustain it.
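The mechanics of such a loop are easy to see in miniature. Here is a toy sketch of my own, not an example from the book: two neighborhoods with identical underlying crime rates, a deliberately caricatured allocation rule that sends every patrol to the area with the most recorded crime, and made-up numbers throughout.

```python
# Toy model of the predictive-policing feedback loop described above.
# Everything here is hypothetical: the two neighborhoods have IDENTICAL
# underlying crime rates, and "A" merely starts with more recorded incidents.

recorded = {"A": 120.0, "B": 100.0}  # historical recorded incidents
TRUE_RATE = 0.05   # true crimes observed per patrol-hour (same everywhere)
PATROLS = 100      # patrol-hours to allocate each year

for year in range(1, 6):
    # The "algorithm": send all patrols to the highest-crime neighborhood.
    hot_spot = max(recorded, key=recorded.get)
    # Patrols only record crime where they are sent, so only the hot
    # spot's numbers grow -- which keeps it the hot spot next year.
    recorded[hot_spot] += PATROLS * TRUE_RATE
    print(f"Year {year}: patrols sent to {hot_spot}, records now {recorded}")
```

Even though the two neighborhoods are statistically identical, the data can never reveal that, because the model collects evidence only where it already expects to find crime. The initial disparity becomes self-sustaining, just as in the sentencing cycle O’Neil describes.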

O’Neil’s book is important in part because, as she points out, an insidious aspect of WMDs is the fact that they are invisible to those of us with more power and privilege in this society. As a white person living in a relatively affluent neighborhood, I am not targeted with ads for predatory payday lenders while I browse the web or harassed by police officers who are patrolling “sketchy” neighborhoods because an algorithm sends them there. People like me need to know that these things are happening to others and learn more about how to fight them. 

While Weapons of Math Destruction is full of hard truths and grim statistics, it is also accessible and even entertaining. O’Neil’s writing is direct and easy to read—I devoured it in an afternoon. And the book is not all grim. In the last chapter, she shares some ideas for how we can disarm WMDs and use big data for good. She proposes a Hippocratic Oath for data scientists and writes about how to regulate math models. Let’s return to the sentences I opened with: “The technology already exists. It’s only the will we’re lacking.” That is bleak: we aren’t doing what we can. But it should give us some hope as well. The technology exists! If we develop the will, we can use big data to advance equality and justice.