By Anya Shukla
So here’s the thing. I know that “Algorithms of Oppression” was a New York Times best-seller, was selected as NYU Press’ Book of the Decade, and that its author is a 2021 MacArthur Fellow. But I did not like it. (I’M SO SORRY.) I did not have a fun time reading it. I did not enjoy girding my mental loins and plodding into this oh-so-academic text.
I really hate going against popular opinion but here goes…
Review: Safiya Umoja Noble’s “Algorithms of Oppression” illustrates the bias in seemingly “neutral” search and categorization systems used in computer science and information/library sciences. She primarily uses Google as an example, illustrating the ways its search system and page rankings perpetuate racism.
Google has improved many of its algorithms since the publication of this book, yet many of Noble’s ideas still hold true today… if they can be understood by the everyday reader. Noble’s academic writing style often obscures the information she wishes to convey. My Rating: 2.5/5.
What I Loved: Learning stuff! Reading this book was almost like being in school again. (Never thought I would say this... but I miss school so bad. I am starved for knowledge. August cannot come fast enough.)
Idea #1: Many believe Google is a neutral service that shows us search results based on an objective algorithm, prioritizing pages with the greatest number of hyperlinks. However, Google is a company that wants to maximize its profits. How? By prioritizing its own products (Google Books, Maps, etc.) or advertisements from paying customers, placing them at the top of the search results. Companies with economic capital can “game” the system by using keywords or other paid search-engine-optimization tools to push themselves up in the search results.
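To make Noble’s point concrete, here is a toy sketch of the kind of ranking she describes. This is my own illustration, not Google’s actual algorithm: the page names, link counts, and “paid boost” values are all invented. It just shows how a system that rewards inbound links plus paid promotion can be gamed by whoever has the money to buy visibility.

```python
# Toy ranking sketch (NOT Google's real algorithm; all data is made up).
# Pages are scored by inbound links plus any paid promotion they bought.

def rank_pages(pages):
    """Sort pages highest-first by inbound-link count plus paid boost."""
    return sorted(
        pages,
        key=lambda p: p["inbound_links"] + p["paid_boost"],
        reverse=True,
    )

pages = [
    {"url": "fbi.gov/crime-stats",  "inbound_links": 40, "paid_boost": 0},
    {"url": "cloaked-site.example", "inbound_links": 25, "paid_boost": 30},  # bought reach
    {"url": "local-news.example",   "inbound_links": 20, "paid_boost": 0},
]

for page in rank_pages(pages):
    print(page["url"])
```

With these invented numbers, the site that paid for promotion outranks the more-linked authoritative source, which is exactly the dynamic Noble worries about.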
For example, searching “black on white crime” in 2015 provided Noble with a slew of white supremacist and nationalist sites that shared false information about black-on-white crime statistics. Google did not direct searchers to the FBI site, which provides accurate information about inter- and intra-racial crime.
These search results have a tangible impact. When Dylann Roof searched for “black on white crime,” he clicked on the first site he saw, a white nationalist organization “cloaking” its identity as a mainstream source. He received cherry-picked statistics. These erroneous facts led him down a path that culminated in a hate crime and the deaths of nine Black worshipers.
Idea #2: Algorithms produce biased search results. When Noble searched “black girl” in Google, she received a page of porn results. When she searched for “doctor” or “professor style,” she saw rows of images of white men.
If advertising companies and the sex industry demean women of color yet have the economic means to optimize their page rankings, that bias will show up in search results. If societal convention and thousands of digitized articles state that women shouldn’t go into STEM, Google’s hyperlink-based algorithm will reflect that in its images. Worse, we internalize what we see in the media—and search results are a type of media that many believe to be accurate due to their “objective” nature.
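The mechanism here is simple enough to sketch in a few lines. Again, this is my own invented illustration (the labels and link counts are made up, and this is not Noble’s code or Google’s): if the most heavily linked “doctor” images on the web skew toward one demographic, a ranking that simply follows the links will hand that skew straight back to the searcher.

```python
# Toy sketch (invented data): a link-weighted image ranking with no
# "course correction" simply reproduces the skew of its underlying web.

# (image label, inbound-link count) for pages tagged "doctor"
corpus = [
    ("white man", 50),
    ("white man", 35),
    ("white woman", 10),
    ("black woman", 5),
    ("black man", 4),
]

def top_results(corpus, k=3):
    """Return the k most-linked images - whatever the web over-represents wins."""
    ranked = sorted(corpus, key=lambda pair: pair[1], reverse=True)
    return [label for label, _count in ranked[:k]]

print(top_results(corpus))
```

The first page of results ends up dominated by the over-represented group, even though nothing in the code is explicitly “about” race or gender. That is Noble’s point: neutrality of mechanism is not neutrality of outcome.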
But what is objectivity, really? A plain description of our world is not objective, because that world centers people (often white and male) who have historically held power. So if Google builds an algorithm that reflects current or past societal trends, the resulting product will not be objective. A seemingly “neutral” algorithm that does not course-correct for racism and sexism in our society will perpetuate inequities.
What I Didn’t Love: “Algorithms of Oppression” begins with these words: “This book is about the power of algorithms in the age of neoliberalism and the ways those digital decisions reinforce oppressive social relationships and enact new modes of racial profiling” (pg. 28). I’m sorry, what?
My issue is not with Noble's writing, but with the fact that she expects a casual reader to understand the terms she uses without context. This book is written for a specific type of person—someone who knows the meaning of “neoliberalism” without looking it up in the dictionary and implicitly understands the link between the military-industrial complex and the internet. Heck, I regularly spend time learning about social justice, and I was confused!
“Algorithms of Oppression” reads like a Ph.D. thesis destined solely for circulation in the highest of scholarly circles. (Especially because Noble has a habit of describing the research she intends to cover, talking about said research, and then summarizing the research she has already talked about.) The book contains many relevant, valuable points, but if its author cannot speak to us plebeians, then I feel it’s difficult for her message to spread to a broader public. Individuals who do not have knowledge of racial justice terms and ideas, especially, will find “Algorithms of Oppression” difficult to get through—and aren’t they the ones who should be learning about this topic? If issues with search engines affect all of us, shouldn’t Noble write to all of us, so that everyone can be informed of this problem?
I recognize that “Algorithms of Oppression” is an academic text, and perhaps Noble is simply conforming to academic conventions. But there must be a way to make research-based books like this one accessible to common readers.
Food For Thought: Noble believes that programs like Black Girls Code put the onus on Black girls to lead change rather than pushing for systemic and company-wide racial justice practices. I found this idea similar to Gloria Ladson-Billings’ thoughts on the achievement gap in education. Ladson-Billings argues that focusing on the achievement gap sidelines important conversations about the history of segregated education and the ways in which education continues to be inequitable today. How do we as a society begin to address the systemic and historical effects of racism and sexism? (On a related note, how can youth begin making systemic change when they are often left out of conversations around institutional planning/growth?)
Also, I wonder how “open sourcing,” a practice gaining in popularity, could reduce (or is already reducing) racism and sexism in search. Many tech companies “open source” their code on platforms like GitHub, allowing anyone to see and run their algorithms. (Because only some tech companies have the resources to actually manage and run the code, especially when it comes to data-intensive AI algorithms, there isn’t much fear of another company swooping in to steal their ideas.) What would happen if Google or Facebook open sourced the AI algorithms that allow them to combat fake news, so that big tech companies could collaborate and learn from one another?
A Quote I Would Like On Goodreads: “We have to ask what is lost, who is harmed, and what should be forgotten with the embrace of artificial intelligence in decision making” (pg. 52).
Up next: “First They Killed My Father” by Loung Ung.