81 "Our livelihoods increasingly depend on our ability to make our case to machines."
― Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
82 "We can use the scale and efficiency that make WMDs so pernicious in order to help people. It all depends on the objective we choose."
83 "Welcome to the dark side of Big Data."
84 "The practice of using credit scores in hirings and promotions creates a dangerous poverty cycle. After all, if you can’t get a job because of your credit record, that record will likely get worse, making it even harder to land work. It’s not unlike the problem young people face when they look for their first job—and are disqualified for lack of experience. Or the plight of the longtime unemployed, who find that few will hire them because they’ve been without a job for too long. It’s a spiraling and defeating feedback loop for the unlucky people caught up in it."
85 "The algorithms would make sure that those deemed losers would remain that way. A lucky minority would gain ever more control over the data economy, raking in outrageous fortunes and convincing themselves all the while that they deserved it."
86 "People with savings, of course, can keep their credit intact during tough times. Those living from paycheck to paycheck are far more vulnerable. Consequently, a sterling credit rating is not just a proxy for responsibility and smart decisions. It is also a proxy for wealth. And wealth is highly correlated with race."
87 "we’ve seen time and again that mathematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education. It’s up to society whether to use that intelligence to reject and punish them—or to reach out to them with the resources they need. We can use the scale and efficiency that make WMDs so pernicious in order to help people. It all depends on the objective we choose."
88 "It’s important to note, as we endeavor to understand relative harms, that they are entirely dependent on context. For example, if a high-risk score for a given defendant qualified him for a reentry program that would help him find a job upon release from prison, we’d be much less worried about false positives. Or in the case of the child abuse algorithm, if we are sure that a high-risk score leads to a thorough and fair-minded investigation of the situation at home, we’d be less worried about children unnecessarily removed from their parents. In the end, how an algorithm will be used should affect how it is constructed and optimized."
89 "mathematical models were opaque, their workings"
90 "As we’ve seen, they (the companies) routinely reject applicants on the basis of credit scores and personality tests. Health scores represent a natural—and frightening—next step."
91 "And in Florida, adults with clean driving records and poor credit scores paid an average of $1,552 more than the same drivers with excellent credit and a drunk driving conviction."
92 "It’s a silent war that hits the poor hardest but also hammers the middle class. Its victims, for the most part, lack economic power, access to lawyers, or well-funded political organizations to fight their battles. The result is widespread damage that all too often passes for inevitability."
93 "In a federal lawsuit, Baltimore officials charged Wells Fargo with targeting black neighborhoods for so-called ghetto loans. The bank’s “emerging markets” unit, according to a former bank loan officer, Beth Jacobson, focused on black churches."
94 "In short, WMDs are targeting us all. And they’ll continue to multiply, sowing injustice, until we take steps to stop them."
95 "If we’re going to be equal before the law, or be treated equally as voters, we cannot stand for systems that drop us into different castes and treat us differently."
96 "We are judged by what we do, not by who we are."
97 "finance. What’s the expected amount of data? What’s the expected signal in the data? How much data do we have? What are the opportunities to use this model? What is the payoff for those opportunities? What’s the scale of this model if it works? What’s the probability that the idea is valid?"
― Cathy O'Neil, On Being a Data Skeptic
98 "Once companies amass troves of data on employees’ health, what will stop them from developing health scores and wielding them to sift through job candidates? Much of the proxy data collected, whether step counts or sleeping patterns, is not protected by law, so it would theoretically be perfectly legal. And it would make sense. As we’ve seen, they routinely reject applicants on the basis of credit scores and personality tests. Health scores represent a natural—and frightening—next step."
99 "Apollo Group, the parent company for the University of Phoenix, spent more than a billion dollars on marketing in 2010, almost all of it focused on recruiting. That came out to $2,225 per student on marketing and only $892 per student on instruction. Compare that to Portland Community College in Oregon, which spends $5,953 per student on instruction and about 1.2 percent of its budget, or $185 per student, on marketing."
100 "I should note that empathy is hardly a cure-all. It tends to embed biases, because by nature we find it easier to empathize with people like us."
― Cathy O'Neil, The Shame Machine: Who Profits in the New Age of Humiliation