10.2: Coded Inequalities - Classroom Activity
How To Use This Material [Instructor Note]
- The following is a general outline for a whole-class discussion in which students are introduced to the concept of ‘the algorithm’ through an activity that has them think of their name as a kind of code
- The first section offers a general overview of the subject, before turning to introduce the particulars of the activity; I typically follow this with a 5-10 minute free write, before opening up the class to share
- This activity is meant to follow an engagement with Butler, Sargent, and Smith’s “The Age of Algorithms” chapter, but could certainly work as a stand-alone as well
- You are welcome to make a copy of this material to edit and remix as you wish; please be sure to follow the terms of the CC license when doing so
- Additional credit goes to Ruha Benjamin, whose mention of using a similar activity in her own courses inspired this one
Introduction
As the title suggests, “The Age of Algorithms” looks to better contextualize, explain, and, ultimately, problematize a digital technology we encounter every day but don’t always stop to critically consider: algorithms. As noted early in the chapter, “everyone who has accessed the internet has experienced the personalizing actions of algorithms, whether they realize it or not” (Head et al., as cited in Butler et al.). It is algorithms, for instance, that fuel the autocorrect feature on our phones, laptops, and tablets. It is algorithms that power the search engine results we conjure every day. It is algorithms that allow map apps to correctly guide us around town. Algorithms are essential to digital life, but they nevertheless remain technological processes we rarely consider and often take for granted.
“There is no doubt,” Butler, Sargent, and Smith continue, “that algorithms can be useful and help to improve our lives.” However, they also note that we should look at these everyday technologies with a more critical eye. What happens, they wonder, “when algorithms are used to predict when college students are ‘cheating’ on a test, or to predict who should be hired for a job or who should get a loan, or to decide the type of information we see in our social media news feeds, or to calculate credit scores, or even to predict criminal behavior and determine prison sentences? Are Google search results really an unbiased presentation of the best available information on a research question? How do algorithms impact our perception of a research topic, or of our own realities?”
For many contemporary tech scholars, activists, and journalists, making better sense of the power and influence of the algorithm in modern life is a matter of fundamental importance, a pressing issue of our time. As our scholars for the week argue, “the pervasiveness of algorithms—and their incredible potential to influence our society, politics, institutions, and behavior—has been a source of growing concern.”
Today’s discussion centers on helping us all better understand, spot, and call out the algorithm in our day-to-day lives. In particular, we’ll be exploring the idea of ‘algorithmic bias’: the ways in which algorithms can carry over biases, subjectivities, and stereotypes from our ‘real’ world and manifest them in our digital one. We will do this by thinking through a code—and larger algorithmic system—we engage with each and every day: our names. But first: a bit of context.
Algorithmic Bias
While computing power is often discussed as an inherently apolitical, neutral, and objective force enacted through the cold, calculative properties of code, the reality is that these technologies are far more entrenched in the social than we often understand. We cannot forget that computer code, while based on mathematical processes, is created by humans, and therefore influenced by the social, political, and cultural just as much as we are; as Safiya Noble argues in Algorithms of Oppression, “part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings” (1). Digital technologies, functionalities, and processes are innately tied to the humans who create them, and by that very fact, cannot be freed of all the nasty, lowly, biased, subjective qualities that mark the human experience—one’s hands are not magically washed clean of all their biases and subjective worldviews when their fingers touch upon a keyboard.
And, unfortunately, the fingers upon the keys tend to hail from certain demographics as opposed to others: as Catherine D’Ignazio and Lauren Klein report, “according to the most recent data from the US Bureau of Labor Statistics, released in 2018, only 26 percent of those in ‘computer and mathematical occupations’ are women. And across all of those women, only 12 percent are Black or Latinx women, even though Black and Latinx women make up 22.5 percent of the US population.” Moreover, a study done by the research group AI Now found that “women comprise only 15 percent of AI research staff at Facebook and 10 percent at Google.” More alarming still, D’Ignazio and Klein discovered that these numbers are actually getting worse: a 2015 report from the American Association of University Women found that “women computer science graduates in the United States peaked in the mid-1980s at 37 percent,” with the number dropping to around 26 percent at the time of the study (Data Feminism, 80).
These failures to create a diverse and robust computing community have real, damaging effects on the products they create. Take Google Search, for instance: when Safiya Noble searched “black girls” in Google in 2012, the first result was for pornography site “sugaryblackpussy.com.” Rather than providing “relevant and culturally situated knowledge on how women of color have traditionally been discriminated against, denied rights, or been violated in society and the media,” or how they “have organized and resisted this on many levels,” the “fair” and/or “unbiased” code of Google’s ‘Universal Search’ instead regurgitated the very same racist, exploitative, and sexualized representational frameworks of black girls that circulate in the so-called ‘real’ world. In another instance, first brought to light by teenager Kabir Ali on Twitter in 2016, Google Search offered up radically different results when searching “three black teenagers” and “three white teenagers”: for the former, “the results that Google offered were of African American teenagers’ mugshots, insinuating that the image of Black teens is that of criminality,” while for the latter, the results were “wholesome and all-American” (80). As Noble writes, these instances made clear that “search engine results don’t only mask the unequal access to social, political, and economic life in the United States as broken down by race, gender, and sexuality—they also maintain it” (40): all across the web “algorithms are serving up deleterious information about people, creating and normalizing structural and systemic isolation, or practicing digital redlining, all of which reinforce oppressive social and economic relations” (10).
It might be nice to think of digital tech as a roadway to a more just and egalitarian future, but the reality is far from it; code does not free one from one’s own encoded biases, perspectives, oversights, and hatreds. And thinking that it does is part of the problem. As Ruha Benjamin argues in Race After Technology, we are bearing witness to what she calls “the New Jim Code,” in which we continually employ “new technologies that reflect and reproduce existing inequities,” while they are nevertheless “promoted and perceived as more objective or progressive than the discriminatory systems of a previous era” (1). The illusion of objective, perfected code, in other words, is one of this new discriminatory regime’s greatest assets. Internet companies and tech developers “encode judgments into technical systems,” but at the same time “claim racist results are entirely exterior to the coding process,” which allows “racism [to] become doubled—magnified and buried under layers of digital denial” (6).
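For instructors whose students have some coding background, Noble’s point that automated decisions are built from human judgments can be made concrete with a small, entirely hypothetical Python sketch. The data, field names, and thresholds below are invented for illustration and describe no real lending system; the aim is simply to show that a rule trained on historical records echoes whatever judgments the people behind those records already made.

```python
# Hypothetical illustration: every ingredient of this "automated" decision rule
# (the historical records, the features, the threshold) was chosen by a person.

# Invented historical loan decisions: past reviewers approved applicants from
# zip code "A" far more often than otherwise identical applicants from zip code "B".
history = [
    {"zip": "A", "income": 40, "approved": True},
    {"zip": "A", "income": 35, "approved": True},
    {"zip": "B", "income": 40, "approved": False},
    {"zip": "B", "income": 35, "approved": False},
]

def approval_rate(zip_code):
    """Rate at which past (human) reviewers approved applicants from this zip code."""
    past = [record for record in history if record["zip"] == zip_code]
    return sum(record["approved"] for record in past) / len(past)

def automated_decision(applicant):
    """A seemingly neutral rule that quietly inherits the reviewers' past judgments."""
    return applicant["income"] >= 30 and approval_rate(applicant["zip"]) >= 0.5

# Two applicants identical in every respect except zip code, a common proxy for race and class:
print(automated_decision({"zip": "A", "income": 40}))  # True
print(automated_decision({"zip": "B", "income": 40}))  # False
```

The rule never mentions race at all, yet it reproduces the earlier reviewers’ pattern through the zip-code proxy; this is precisely the dynamic Benjamin describes, in which discriminatory judgments are encoded into a system and then defended as the output of neutral math.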
What's In A Name?
Ruha Benjamin’s brilliant text on race and contemporary technology, aptly titled Race After Technology, opens with what might seem, to many, a strange preamble: “naming a child is serious business,” she writes, “and if you are not white in the United States, there is much more to it than personal preference” (1).
As she argues, names may very well be “one of the everyday tools we use to express individuality and connections, [but] they are also markers interacting with numerous technologies, like airport screening systems and police risk assessments, as forms of data” (1). Names are not objective, neutral, or treated equally—“names are racially coded” (1), and can open doors for some while shutting them for others: “like a welcome sign inviting people in or a scary mask repelling and pushing them away, this thing that is most ours is also out of our hands” (2).
Benjamin mentions beginning her own course on race and racism in a similar place, with an activity she designed called ‘What’s In A Name.’ The goal is to get students thinking about the larger social, cultural, and political values embodied in a name. As Benjamin writes, “what’s in a name? Your family story, your religion, your nationality, your gender identity, your race and ethnicity? What assumptions do you think people make about you on the basis of your name? What about your nicknames—are they chosen or imposed?” (1).
By starting with something so familiar and immediate to students, Benjamin notes that the class “begin[s] to understand that all those things dubbed ‘just ordinary’ are also cultural, as they embody values, beliefs, and narratives” (2). Names, in other words, “are not neutral but racialized, gendered, and classed in predictable ways” (2).
One’s name, then, is not just a ‘title’ but a piece of information, a marker of both who someone is and how the world sees them. When this piece of information intersects with the world, different results are generated depending on what that name is. The assumed unimportance of a name masks just how influential that information really is. Similarly, data technologies are thought to be objective and mathematical, inherently non-human and thus free from all the biases that mark the human social experience. And yet: different results are generated depending on who you are, depending on what ‘name’ you plug in. Even more alarmingly, unless you’re sitting next to a friend who has a different ‘name,’ you’d have no idea of this difference, nor of the fact that what was generated for you was the result of a host of biases within the technology itself. This is what Benjamin is trying to get us to see: that technology works on us differently because of a variety of social, cultural, and political factors, and yet these differences are so often lost, in part because we assume digital technologies to be objective agents. These unequal results thus proliferate and grow in the world, making inequities worse, not better. “Just as in naming a child, there are many everyday contexts—such as applying for jobs, or shopping—that employ emerging technologies, often to the detriment of those who are racially marked” (1).
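For instructors who want to make this ‘name as data’ idea tangible in code, here is a second minimal, hypothetical sketch; the scores and the lookup table are invented for the exercise and describe no real screening tool, and the names echo those used in the résumé callback study Benjamin summarizes in the discussion aids below. Two applicants with identical qualifications receive different results, and because each applicant sees only their own score, the gap stays invisible unless the outputs are compared side by side.

```python
# Hypothetical classroom sketch: a screening function that treats the name itself as data.
# The "familiarity" table stands in for patterns a system might absorb from biased
# historical hiring decisions; all numbers here are invented for the exercise.

familiarity = {"Emily": 1.0, "Greg": 1.0, "Lakisha": 0.5, "Jamal": 0.5}

def screen(name, years_experience):
    """Score an applicant; the name silently shifts the result via the lookup table."""
    return years_experience * familiarity.get(name, 0.75)

# Identical experience, different names, different outcomes:
print(screen("Emily", 8))    # 8.0
print(screen("Lakisha", 8))  # 4.0

# Each applicant only ever sees their own score, so the difference stays hidden
# unless the two outputs are placed side by side.
```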
5 Minute Free-Write
- Question: “What’s up with your name?”
- Is it a family name, or something that came out of the blue? Does it nod to some larger tradition, idea, or concept? Any ties to religion or pop culture? Any ties to your race, ethnicity, and/or culture? What do other people make of your name? Do you think anyone has ever made an assumption about you because of your name? Have you ever used a nickname? When someone hears your name, what do you think they might imagine in their mind’s eye?
Class Discussion
- In my experience, students respond positively to the prompt and writing activity, and many want to share, so I start with a bit of an open free-for-all: students are invited to speak to their takes on the prompt, without much guidance or structure, before I steer the conversation toward a more explicit connection between names and algorithms/code.
- Below are a few perspectives/quotes/frameworks/etc. you might find helpful, both in the early stages and as you try to shift into a larger conversation about code/algorithms
Discussion Aids
- “Usually, many of my White students assume that the naming exercise is not about them. ‘I just have a normal name,’ ‘I was named after my granddad,’ ‘I don’t have an interesting story, prof.’ But the presumed blandness of White American culture is a crucial part of our national narrative. Scholars describe the power of the plainness as the invisible ‘center’ against which everything else is compared and as the ‘norm’ against which everyone else is measured. Upon further reflection, what appears to be an absence in terms of being ‘cultureless’ works more like a superpower. Invisibility, with regard to Whiteness, offers immunity… a ‘normal’ name is just one of many tools that reinforce racial invisibility” (Benjamin 2)
- "Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings. While we often think of terms such as 'big data' or 'algorithms' as being benign, neutral, or objective, they are anything but" (Safiya Umoja Noble, Algorithms of Oppression, 1)
- “With emerging technologies, we might assume that racial bias will be more scientifically rooted out. Yet, rather than challenging or overcoming the cycles of inequity, technical fixes too often reinforce and even deepen the status quo” (Benjamin 3)
- “All other things being equal, job seekers with White-sounding first names received 50 percent more callbacks from employers than job seekers with Black-sounding names. They calculated the racial gap was equivalent to eight years of relevant work experience, which White applicants did not actually have; and the gap persisted across occupations, industry, employer size—even when employers included the ‘equal opportunity’ clause in their ads” (Benjamin 3)
- LinkedIn, 2016: the site’s search recommended male variations of women’s names; a search for ‘Andrea,’ for example, would bring up a prompt asking if users meant ‘Andrew,’ suggesting that one gender was preferred over the other
- Target, 2012: gathered data points to infer when customers were pregnant, even if they had not announced it, and then used those predictions for targeted marketing; this led to a high school student being outed to her family by the resulting pregnancy-related ads and coupons
- Spotify, 2019: women’s music recommended less consistently than men’s music
- Nikon, 2010: cameras with built-in ‘smart’ tools kept warning Asian users that someone in the photo was blinking and prompted them to retake the shot
- Google, 2015: the company’s image-recognition algorithm labeled photos of Black users as gorillas
- Facebook, 2017: an algorithm designed to remove online hate speech was found to privilege white men over Black children when assessing objectionable content, according to internal Facebook documents; while algorithms are used to track and block hate speech, some were found to be 1.5 times more likely to flag information posted by Black users and 2.2 times more likely to flag information as hate speech if written in African American English
- Uber, 2018: trans drivers could not be verified by the app’s facial verification tool, which they needed to pass in order to drive for the company
- Stanford study, 2017: researchers tested a machine learning system said to be able to detect an individual’s sexual orientation from facial images; the model distinguished between gay and straight men correctly 81% of the time, and between gay and straight women 74% of the time
- “Such findings demonstrate what I call ‘the New Jim Code’: the employment of new technologies that reflect and reproduce existing inequalities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era. Like other kinds of codes that we think of as neutral, normal names have power by virtue of their perceived neutrality” (Benjamin 3)
- "Digital decisions reinforce oppressive social relationships and enact new modes of racial profiling" (Safiya Umoja Noble, Algorithms of Oppression, 1)
- "Algorithmic oppression is not just a glitch in the system, but, rather, is fundamental to the operating system of the web" (Safiya Umoja Noble, Algorithms of Oppression, 10)
Works Cited
Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019.
D'Ignazio, Catherine, and Lauren Klein. Data Feminism. MIT Press, 2020.
Duhigg, Charles. “How Companies Learn Your Secrets.” The New York Times, 16 Feb. 2012, www.nytimes.com/2012/02/19/magazine/shopping-habits.html.
“Facebook’s Hate Speech Algorithms Are Biased, Internal Documents Reveal.” The Guardian, 11 Nov. 2017, www.theguardian.com/technology/2017/nov/11/facebook-hate-speech-algorithms-biased-internal-documents.
“Google’s Image-Recognition System Tagged Black People as ‘Gorillas’.” The New York Times, 28 Jun. 2015, www.nytimes.com/2015/06/29/technology/google-image-recognition-gorillas.html.
Wang, Yilun, and Michal Kosinski. “Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images.” Journal of Personality and Social Psychology, vol. 114, no. 2, 2018, pp. 246-257.
“LinkedIn’s Gender Bias: Why Does It Suggest Men’s Names for Women?” TechCrunch, 27 Jun. 2016, techcrunch.com/2016/06/27/linkedins-gender-bias/.
Morrison, Kaitlyn. “Uber’s Facial Recognition System Fails to Verify Transgender Drivers.” The Verge, 6 Aug. 2018, www.theverge.com/2018/8/6/17662300/uber-facial-recognition-transgender-drivers-verification.
“Nikon Apologizes for ‘Blinking’ Warning to Asian Users.” BBC News, 17 Dec. 2010, www.bbc.com/news/technology-12050354.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, 2018.
“Spotify’s Gender Bias: Women’s Music Recommendations Are Less Reliable.” The Guardian, 15 May 2019, www.theguardian.com/music/2019/may/15/spotify-gender-bias-music-recommendations.
© J. F. Lindsay, CC BY-NC-SA