Shamika Klassen

My Reflections on "Race After Technology": The Introduction

Updated: Feb 7, 2022

Name: Race After Technology: Abolitionist Tools for the New Jim Code

Citation: Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code. Polity, 2019.

Author: Ruha Benjamin

Institution: Princeton University

Keywords (Subject and Genre from San Francisco Public Library website): Digital divide — United States — 21st century, Information technology — Social aspects — United States — 21st century, African Americans — Social conditions — 21st century, Whites — United States — Social conditions — 21st century, SOCIAL SCIENCE / Demography, United States — Race relations — 21st century.


Ruha Benjamin, inspired by Michelle Alexander’s scholarship on the New Jim Crow, turns her attention to new and current technology and how it reinforces and perpetuates inequities while purporting to be objective or progressive. For Benjamin, the New Jim Code is “...the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era” (Benjamin, p. 5-6). In response to techno-solutionism (the idea that technology can solve any and all problems, including social, cultural, or societal ones), techno-determinism (the reductionist theory that a society’s technology determines its social structure and cultural values), and techno-chauvinism (the belief that technology is better than humans), Benjamin crafts a well-researched and engaging argument for how technology and race interact and shape one another.


Everyday Coding

The introduction begins with names. Drawing on an activity Benjamin facilitates with her students, she uses the convention of names and how they are racially coded to illustrate one of many everyday contexts in which emerging technology negatively impacts racially coded people. White students’ belief that they “have a normal name” plays directly into what Benjamin describes as “the presumed blandness of White American culture” as “a crucial part of our national narrative,” and ultimately exposes how “[i]nvisibility, with regard to Whiteness, offers immunity” (Benjamin, p. 4). Benjamin points out that names possess social codes imbued with race, gender, and even class. While some Black people are criticized for seeming to “make up names,” Benjamin rightly observes that at some point all names were made up; only certain people, thanks to their social position, can experiment with names without as much scrutiny. For myself, celebrities came to mind. Elon Musk’s child, named X Æ A-Xii, had recently been born as I was first reading this book. Of course, being a Black woman with the name “Shamika,” I also thought about the narrative I told myself and others about my name being made up, since it was unlikely to be in any baby name books.

In her own words, Benjamin describes Race After Technology as “a field guide into the world of biased bots, altruistic algorithms, and their many coded cousins” (Benjamin, p. 7). As previously mentioned, Benjamin draws from Michelle Alexander’s work in The New Jim Crow. Benjamin informs readers that Jim Crow debuted in 1832 as a minstrel show character and became “an academic shorthand for legalized racial segregation, oppression, and injustice in the U.S. South between the 1890s and the 1950s” (Benjamin, p. 9). The shift from the old to the New Jim Crow, facilitated by neoliberalism and a move from “explicit racialization to a colorblind ideology,” offers codes that obfuscate underlying white supremacist motivations. “Criminal” becomes a code not only for “Black” but for “poor, immigrant, second-class, disposable, unwanted, detritus” (Benjamin, p. 9). Systems and technologies tasked with sorting or making decisions about, for, or on behalf of people “from employment, education, healthcare, and housing” ultimately construct a “digital caste system” with technology presented as “morally superior because they purport to rise above human bias, even though they could not exist without data produced through histories of exclusion and discrimination” (Benjamin, p. 10). Going beyond the colorblindness of the New Jim Crow, Benjamin recognizes that the New Jim Code encompasses situations in which technology not only “sees” race but sees racial difference. People assume that if a technology produces a vision of racial difference, it is more objective than if a human were to do so, despite the fact that humans create the technology in the first place.

Move Slower…

One of the things I most enjoyed about this book was its many poignant quotes from the various scholars, artists, activists, researchers, and more who are invoked throughout. This section of the introduction begins with one such quote from Yeshimabeit Milner, co-founder of Data for Black Lives: “[t]he decision to make every Black life count as three-fifths of a person was embedded in the electoral college, an algorithm that continues to be the basis of our current democracy” (Benjamin, p. 11). If an algorithm is a set of instructions designed to solve a problem, who gets to decide what the problem is in the first place? What if you and your people are deemed the problem? This reminded me of W.E.B. Du Bois’ famous line from 1903: “the problem of the twentieth century is the problem of the color line.” For Benjamin, “[t]he animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process” (Benjamin, p. 11-12). In addition to the neoliberalism that underlies the conversation, Benjamin also points to libertarianism, “which extols individual autonomy and corporate freedom from government regulation,” as a mechanism that undergirds private tech companies’ resistance to regulation and insistence on self-regulation (Benjamin, p. 12). Yet private tech companies whose products and services reach millions if not billions of people around the world make choices that should in fact be public policy decisions. It is here that Benjamin raises Facebook’s original motto, “Move Fast and Break Things,” only to ask, “What about the people and places broken in the process?” (Benjamin, p. 13).

Tailoring: Targeting

A lot of people take an instrumentalist approach to technology and see it as an instrument or tool. Benjamin treats race as a tool, “one designed to stratify and sanctify social injustice as part of the architecture of everyday life,” one that has been used by White Americans in the U.S. “to denigrate, endanger, and exploit non-White people - openly, explicitly, and without shying away from the deadly demarcations that racial imagination brings to life” (Benjamin, p. 17-18). Benjamin notes that the shift toward multiculturalism has accompanied a shift away from “one-size-fits-all mass marketing toward ethnically tailored niches that capitalize on calls for diversity” without making significant or lasting changes to that end (Benjamin, p. 18). Benjamin presents marketing fails like pinkwashing (BIC’s “For Her” pens and IBM’s “Hack a Hair Dryer” initiative) as examples of “tailoring morph[ing] into targeting and stereotypical containment” (Benjamin, p. 21). Targeting technology can make a turn toward “criminalizing misrepresentation,” as with the facial recognition software Benjamin touches on briefly but assures readers she will return to in subsequent chapters (Benjamin, p. 22).

Why Now?

At the heart of Benjamin’s argument is the fact that “the glaring gap between egalitarian principles and inequitable practices is filled with subtler forms of discrimination that give the illusion of progress and neutrality, even as coded inequity makes it easier and faster to produce racist outcomes” (Benjamin, p. 22). When Benjamin remarks that White supremacists have taken well to technology, that claim resonates with me in regard to conservative and evangelical-leaning faith communities. Part of the reason I decided to study technology, ethics, and social justice issues while in seminary was that I noticed social justice-oriented, progressive, and affirming faith communities were less likely to be tech-savvy than their counterparts. Many tech companies welcome White supremacists’ embrace of technology because it drives clicks, views, and, most importantly, ad sales. For tech companies, money from White supremacists is more profitable than upholding social justice.

In addition to rampant and profitable White supremacy on social media platforms, Benjamin cites the paradox raised by Michelle Alexander as another reason to discuss the New Jim Code: “the legalized discrimination afforded by the U.S. penal system at a time when de jure segregation is no longer acceptable” (Benjamin, p. 24). Almost ironically, thanks to the work of Alexander and others to raise awareness of these issues, people are looking to technology to address the issue more humanely or objectively. In reality, technology such as ankle bracelets or crime prediction software merely “encode[s] inequity in a different form” (Benjamin, p. 25).

The presence of more prominent Black people, like the facade of progress offered by placating, superficial diversity in pop culture, is not an indication that racial progress is being made. In 2018, Microsoft premiered a campaign with rapper Common to advertise its A.I.; however, actually integrating Black voices into A.I. like Siri is looked down upon. Finally, Benjamin notes that automated decision-making systems are often touted as cost-cutting measures and ways to minimize bias. As Benjamin puts it, “[p]rofit maximization, in short, is rebranded as bias minimization” (Benjamin, p. 30). As these systems quite literally take over, they begin to call into question what it means to be human and to have autonomy or free will. While Benjamin discusses the “redefinition of human identity, autonomy, core constitutional rights, and democratic principles more broadly,” she brings up a quote by philosopher Sylvia Wynter that aligns with a concept I coined a few years ago in seminary (Benjamin, p. 31). Wynter proposes three “genres” of humans - “full humans, not-quite humans, and nonhumans” - and Benjamin notes that through these genres, “racial, gendered, and colonial hierarchies are encoded,” which resonates with my concept: the uncanny valley of humanity (Benjamin, p. 31). Here is an excerpt from a paper I published in the RESET Journal in 2021 about the concept:

In robotics and Computer Generated Imagery (CGI), the uncanny valley is the moment at which a manufactured object looks and moves like a human - but not quite - resulting in an unpleasant response (Mori, 1970). The gap between perfection in presenting as human and that moment makes the viewer uncomfortable...The uncanny valley of humanity is a concept within which a person is dehumanized and their deviation from what is considered the norm causes a negative reaction. The norm for the contemporary westernized context is the white, heterosexual, cisgender, able-bodied men who historically and to this day have been established as the example of a healthy average human. This demographic of people is not found lacking and able to be considered 100% whole. Other identities, which are seen as deviant to the accepted norm, fall in line behind them at decreasing percentiles. When someone who deviates from this set of identities acts or speaks in a way that is out of line with their expected stereotypical behavior, it moves that person closer to the traits of a whole human, a space that is currently owned by the hegemonic norm. Assimilating moves put the deviant into the uncanny valley of humanity, thus making others uncomfortable and disturbed. (Klassen, p. 11)

Back to Benjamin, who uses Wynter’s analysis to show that anti-Black technologies do not limit their harms to Black people. Instead, Benjamin insists that “Black people already live in the future” and paints the plight of Black people at the hands of various technologies as the canary in the mineshaft of society (Benjamin, p. 32). Benjamin goes on to critique futurists who envision a post-humanist reality, as that vision assumes everyone has had the opportunity to be human and, like all “posts” (postracial, postcolonial, etc.), it stems from the experiences of “the Man.” The architecture of power can be brought into relief by decoding the racial aspects of technology as well as by highlighting the construction of various genres of humanity. This architecture reveals both the top-down aspect of powerful tech companies pushing their products onto the public and our individual roles in opting into these systems. Targeted ads may seem preferable, but, as Benjamin points out, “there is a slippery slope between effective marketing and efficient racism,” and “coded inequity makes discrimination easier, faster, and even harder to challenge” (Benjamin, p. 33). Even as more people and institutions speak out against racial discrimination, the practice is becoming ever more embedded in our everyday sociotechnical practices.

The Anti-Black Box

The scholarship of Race After Technology is rooted in science and technology studies (STS), critical race studies, and a framework Benjamin calls race critical code studies (Benjamin, p. 34). The term “Black Box” is used in STS to describe “how the social production of science and technology is hidden from view,” such as the “secret algorithms” of Silicon Valley and Wall Street, which are vehemently protected by law while the privacy of citizens is ignored (Benjamin, p. 34). Benjamin’s term, the anti-Black box, “links the race-neutral technologies that encode inequity to the race-neutral laws and policies that serve as powerful tools for White supremacy” (Benjamin, p. 35). Because geography can be used as a reliable proxy for race, lawmakers and technologies can instill racist ideologies without explicitly using race. How can we change the way emerging technology is designed to be more race-conscious?
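The proxy mechanism can be made concrete with a minimal, entirely hypothetical sketch. The zip codes, penalty weights, and scoring formula below are invented for illustration and come from no real lending system or from the book itself. Race never appears as an input, yet because the model penalizes certain zip codes, and residential segregation maps race onto geography, the “race-neutral” score reproduces a racial disparity.

```python
# Hypothetical, simplified illustration of proxy discrimination.
# All numbers and zip codes are invented; this mirrors no real model.
HYPOTHETICAL_ZIP_PENALTY = {
    "60619": 40,  # stand-in for a historically redlined neighborhood
    "60614": 0,   # stand-in for an affluent neighborhood
}

def loan_score(income: int, credit_years: int, zip_code: str) -> int:
    """Return a toy approval score. Race is never an input, but the
    zip-code penalty smuggles geography (and, in a segregated housing
    market, race) into the outcome."""
    base = income // 1000 + 5 * credit_years
    return base - HYPOTHETICAL_ZIP_PENALTY.get(zip_code, 20)

# Two applicants identical in every stated respect except address:
a = loan_score(income=50_000, credit_years=10, zip_code="60614")
b = loan_score(income=50_000, credit_years=10, zip_code="60619")
print(a, b)  # 100 60 - the "race-neutral" model scores them 40 apart
```

Nothing in the code mentions race, which is exactly the point of the anti-Black box: the discriminatory judgment lives in a design choice (which features to use, how to weight them) that the system’s outputs present as neutral arithmetic.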

Race as Technology

In order to see race as a technology, Benjamin describes race as “designed to separate, stratify, and sanctify the many forms of injustice experienced by members of racialized groups but one that people routinely reimagine and redeploy to their own ends” (Benjamin, p. 36). Just as many see technology as a tool, Benjamin makes the case for race itself as a tool as well - one that has been forged over hundreds of years and has affected countless lives. While race is a technology and a tool, Benjamin describes racism, like doublespeak in 1984, as a way to reconcile contradictions. At the nation’s founding, the contradiction between proclaiming that “...all men are created equal” and upholding chattel slavery is a prime example of how racism reconciles contradictions.

As a testament to how historical imagination and language can shape ideas and even societies, Benjamin unpacks the case of the Luddites in nineteenth-century England. Considered by some to have been haters of technology and progress, the Luddites, Benjamin tells us, were instead a group of workers “protesting the social costs of technological ‘progress’ that the working class was being forced to accept” (Benjamin, p. 37). The Luddites were not against technology but against the way it was being rolled out, which disregarded its negative impacts on both workers and society. This sentiment is more important now than ever, as Amazon workers and “electronic sweatshop” workers for Apple, Dell, and HP are given demanding quotas without time for even bathroom breaks.

Beyond Techno-Determinism

Techno-determinism sees technological development as inevitable and unstoppable. Benjamin does not agree, and she pushes us to consider that the influence does not run only one way: society is impacted by technology, and technology is impacted by society. Going past mere techno-determinism, Benjamin states, “...any given social order is impacted by technological development, as determinists would argue, but that social norms, ideologies, and practices are a constitutive part of technical design” (Benjamin, p. 41). Turning to early scholarship on the internet and race, Benjamin remarks that the “digital divide” discourse, characterized by race, class, and gender lines, also saw technological access as a solution to inequality. This perspective erased the many ways that marginalized people were engaging with technology and ignored the structural barriers those groups faced in accessing the Internet and obtaining the tech needed to do so. In the early days of the Internet, the utopia of a race-free space where “nobody knows you’re a dog” relied on text-only web spaces. With the proliferation of visual media online, this utopia is far from reachable. Regardless, “in both conceptions, technology is imagined as impacting racial divisions - magnifying or obliterating them - but racial ideologies do not seem to shape the design of technology” (Benjamin, p. 43).

Invoking Safiya Noble’s Algorithms of Oppression and Simone Browne’s Dark Matters: On the Surveillance of Blackness, Benjamin positions her work as an extension of their scholarship: “[A]nti-black racism, whether in search results or in surveillance systems, is not only a symptom or outcome, but a precondition for the fabrication of such technologies” (Benjamin, p. 44). One manifestation of Benjamin’s approach is race critical code studies, which she defines “not just by what we study but also by how we analyze, questioning our own assumptions about what is deemed high theory versus pop culture, academic versus activist, evidence versus anecdote” (Benjamin, p. 45).

Beyond Biased Bots

Instead of stopping at the notion of bias in robots and A.I., Benjamin wants to expand our scope and analysis. The New Jim Code has four dimensions - “engineered inequity, default discrimination, coded exposure, and technological benevolence” - each of which is covered in a subsequent chapter (Benjamin, p. 47). Chapters 1 through 4 respectively cover engineered inequity, which “works to amplify social hierarchies that are based on race, class, and gender”; default discrimination, which “grows out of design processes that ignore social cleavages”; coded exposure, which refers to “various forms of visibility and of how, for racialized groups, the problem of being watched (but not seen) relates to newfangled forms of surveillance”; and technological benevolence, which “animates tech products and services that offer fixes for social bias” (Benjamin, p. 47). Chapter 5 examines the various people who are challenging the New Jim Code as well as ways we, the readers, can demand more justice-oriented technology.

The introduction is an excellent taste of what is to come in the rest of Benjamin’s work. She outlines and motivates the problems within and surrounding technology and the study of its impact on society. As Benjamin prepares to unpack the dimensions of the New Jim Code in the upcoming chapters, the introduction leaves the reader ready to answer her final words, “Are you ready?”, with a resounding “Yes!”

Mori, M. (1970). “The Uncanny Valley.” Energy, vol. 7, no. 4, pp. 33–35 (in Japanese).

Klassen, Shamika. (2021). “Rendering Us All.” Reset - The Journal, Goldsmiths Racialised Postgraduate Network, Goldsmiths University Press, vol. 1, no. 1, pp. 11–13.


