
" The intention to harm or exclude may guide some technical design decisions. Yet even when they do, these motivations often stand in tension with aims framed more benevolently. Even police robots who can use lethal force while protecting officers from harm are clothed in the rhetoric of public safety.35 This is why we must separate “intentionality” from its strictly negative connotation in the context of racist practices, and examine how aiming to “do good” can very well coexist with forms of malice and neglect.36 In fact a do-gooding ethos often serves as a moral cover for harmful decisions. Still, the view that ill intent is always a feature of racism is common: “No one at Google giggled while intentionally programming its software to mislabel black people.”37 Here McWhorter is referring to photo-tagging software that classified dark-skinned users as “gorillas.” Having discovered no bogeyman behind the screen, he dismisses the idea of “racist technology” because that implies “designers and the people who hire them are therefore ‘racists.’” But this expectation of individual intent to harm as evidence of racism is one that scholars of race have long rejected.38 "

Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
