Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble

 ⭐⭐⭐☆☆

Prof. Noble's book focuses largely on how search algorithms are designed to perpetuate the hegemony of the white, male, heterosexual majority, due in large part to the creators of those algorithms falling within that group. She further contends that search engine companies have misrepresented their product by insinuating that the results represent the 'most correct' or 'most useful' information on the web, when they do not. The arguments are legitimate, but I feel the book repeats them without much evidence and at times takes a moralistic stance that felt oppressive. I would say the final part of the book, truly the last two chapters, is what makes it worth reading. The first two chapters are repetitious and would benefit from substantial shortening: in them, Noble explains that algorithms aren't neutral and carry implicit biases by the nature of their development, but she takes about 100 pages to do so, offering a few interesting examples from the earlier days of the internet along the way. The final fifth of the book is what becomes interesting, as it remains relevant today.

The Main Contention

This book mainly asserts that large search engines, specifically Google, do the world a disservice by not implementing policies of fair cultural representation. Further, users of such technology should be encouraged to question the results presented to them and to ask what additional motivations a company whose goal is commercial in nature might have.

Noble asserts:

"When it comes to online commercial search engines, it is no longer enough to simply share news and education on the web; we must ask ourselves how the things we want to share are found and how the things we find have appeared."

"The entire experiment of the Internet is now with us, yet we do not have enough intense scrutiny at the level of public policy on its psychological and social impact on the public."

"In essence, we need greater transparency and public pressure to slow down the automation of our worst impulses. We have automated human decision making and then disavowed our responsibility for it. Without public funding and adequate information policy that protects the rights to fair representation online, an escalation in the erosion of quality information to inform the public will continue."

Noble's argument is that Google represents its search results as the 'best' possible information while not disclosing that it prioritizes results that link back to its own platforms (e.g., YouTube, Blogger, Google Scholar) as well as results from companies who pay to rank higher. As a consequence, the user is not always getting the best representation of information, and at times the information is presented through the lens of the majority culture, leaving out a fair and accurate representation of minority groups.

Mediating Online Experiences: Cultural Regulation

A question I had when reading this book: should "search engines such as Google ... be regulated over the values they assign to racial, gendered, and sexual identities, as evidenced by the type of results that are retrieved"? Is such regulation something these companies are responsible for? If so, how could society at large enforce it, and who would decide what the regulations should be? Companies, lawmakers, the disadvantaged?

The large companies have argued that the disparity is the fault of the disadvantaged. "Since search is such a significant part of mediating the online experience ... scholars have argued that increased culturally relevant engagements with technology will contribute to greater inclusion and to greater ... agency for historically underrepresented ... groups. This is the thrust of the neoliberal project..." In other words, the onus of change is placed upon the underrepresented. This, according to Noble, is a ridiculous assertion: disadvantaged groups have little way to make their voices heard. Companies need to take the time to find appropriate solutions.

Is Google a Democratizing Force?

Noble would contend that it is not, since the algorithms Google has created reinforce "hegemonic narratives and [exploit] its users." The output of these algorithms has been characterized as "dialectic," having less to do with the technology and services and more to do with the company's organization of labor and capitalist relations of production.

Noble suggests that Google and other companies should hire students from ethnic studies backgrounds, including Black studies, American Indian studies, gender studies, etc., who have a deep knowledge of history and cultural theory, to help guide them through the difficult conversations she attempts to highlight. She also suggests that the large technology firms espouse the neoliberal ideals of a technocracy. If that is their goal and claim, she believes that adding more voices with appropriate training in ethnic studies would go a long way toward ensuring better representation in the decisions made by the current elite.

The Law and Online Companies

As online communication and search have become increasingly prevalent and important in our everyday lives, the law has been slow to keep up with regulation. Truly, only Section 230 of the 1996 Communications Decency Act has dealt with content representation on the web. This law gives immunity to online companies, which cannot be found liable for content posted by third parties; the act was originally designed to protect children from online pornography. However, if an online company participates in 'filtering,' it may be held liable (see the case against Prodigy). It is common knowledge that Google does filter its search results and so could possibly be held liable. More recent cases have found that companies cannot be held liable for not self-censoring or removing content. The 1996 act created a distinction between "computer service providers" (nonmediated content) and "information providers" (mediated content), a distinction that has been upheld to give further immunity to companies.

Another area of the law that should be debated with regard to online search companies is how content is prioritized and what information, if any, the company should give the consumer about how the results were determined. "Focus on content prioritization processes should enter the debates over net neutrality and the openness of the web when mediated by search engines, especially Google."

These particular issues have become more apparent in the years since the publication of the book with the spotlight falling upon Facebook and Google after the 2016 presidential election in the United States. 

Unpaid Labor

The least discussed issue in the book is that Google mines users' data and then profits from it. While some users may be aware of such policies, most are not. Using Google's "free" tools generates billion-dollar profits for the company: "The profits come from unpaid labor from users and the delivery of audiences to advertisers." "Google's commodities are not its services such as Gmail or YouTube; its commodities are all of the content creators on the web whom Google indexes (the prosumer commodity) and the users of their services who are exposed to advertising (audience commodity)." Simply put, we, the users, are the product that Google is selling to its customers, the advertisers.
