4 Is the Internet Racist? Algorithms of Oppression

queirugj

To Begin…

  1. What is the role of algorithms in search engines, and how do they determine search results?
  2. Can search engines be biased?

Let’s Break It Down

Safiya Umoja Noble sheds light on the ways in which search engines, particularly Google, can reinforce racial and gender biases. The central thesis of her book revolves around the concept of "algorithmic bias," which refers to how search engine algorithms can produce biased search results that reflect and amplify societal prejudices and stereotypes.

What’s the tea?

In an eye-opening examination of the ways in which algorithms and search engine results harbor prejudices against women of color, Noble asks: what will you find if you search for "black girls" on Google? Search terms that are sexually explicit, like "Big Booty," are probably going to rank highly. However, if you substitute "white girls," the outcomes are very different. Unmoderated conversations about "why black women are so angry" or "why black women are so sassy" and suggested pornographic websites paint a frightening picture of black womanhood in contemporary society. In her book Algorithms of Oppression, Safiya Umoja Noble challenges the notion that search engines such as Google provide an even playing field for all ideas, identities, and activities. Discrimination in data is a true social issue. Noble contends that the combination of private interests in promoting particular websites and the monopoly status of a relatively small number of Internet search engines has produced a biased set of search algorithms that favor whiteness and discriminate against people of color, particularly women of color.

Noble exposes a culture of racism and sexism in the creation of discoverability in the online space through an analysis of textual and media searches and in-depth research on paid online advertising. With search engines and related businesses becoming more and more important—serving as a major source of email, a major learning tool for elementary and secondary schools, and more—it is critical to comprehend and reverse these unsettling trends and discriminatory practices.

 

Figure I.1. First search result on keywords “black girls,” September 2011.

 

Figure I.2. Google Images results for the keyword “gorillas,” April 7, 2016.

 

 

Figure I.3. Google Maps search on “N*gga House” leads to the White House, April 7, 2016.

 

 

Figure I.4. Tweet by Deray McKesson about Google Maps search and the White House, 2015.

 

 

Figure I.5. Standard Google’s “related” searches associates “Michelle Obama” with the term “ape.”

How does algorithmic bias manifest?

Noble argues that algorithmic bias manifests in several ways, including:

  • Reinforcing Stereotypes: Search engine algorithms may prioritize and display content that reinforces harmful stereotypes about racial and ethnic groups, reinforcing existing biases.
  • Underrepresentation: Certain groups, particularly those from marginalized communities, may be underrepresented or misrepresented in search results, leading to a skewed view of their contributions and experiences.
  • Misinformation and Disinformation: Biased algorithms can lead to the promotion of misinformation and disinformation, especially when it comes to topics related to race, ethnicity, and identity.
  • Commercialization of Bias: The pursuit of profit can drive search engines to prioritize content that generates ad revenue, even if it perpetuates biases or is harmful.
  • Discriminatory Impact: Algorithmic bias can have a discriminatory impact on individuals and communities, affecting their access to information, opportunities, and fair treatment.

Commercialization of search engines

  • Paid Search Results: Search engines like Google generate a significant portion of their revenue through advertising. These paid results are typically displayed prominently at the top of the search results page or labeled as ads.
  • Ad placement: The positioning of paid search results is based on a bidding system where advertisers compete to have their ads displayed for particular search terms. This means that ad placement can be influenced by the amount of money advertisers are willing to spend, not necessarily the relevance or quality of the content.
  • Bias Toward Advertisers: The profit motive incentivizes search engines to prioritize content from advertisers because they generate revenue from them.
  • Content Promotion: Advertisers have the means to promote their products, services, or viewpoints through paid search results. In cases where advertisers’ interests align with certain biases or stereotypes, their promoted content may reinforce those biases.
  • Filter Bubbles: Search engines may personalize search results based on user data to maximize ad revenue.
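To make the bidding dynamic above concrete, here is a minimal, hypothetical sketch (in Python) of an ad auction in which placement is driven by bid amount rather than relevance. All advertiser names, bids, and relevance scores are invented for illustration; this is not Google's actual ad-ranking system, which also weighs quality factors.

```python
# Hypothetical sketch of the ad-placement dynamic described above:
# ads are ranked purely by bid amount, not by content relevance.
# All names and numbers are illustrative, not any real system's data.

def rank_ads(ads):
    """Order ads by how much each advertiser bids, highest first."""
    return sorted(ads, key=lambda ad: ad["bid"], reverse=True)

ads = [
    {"advertiser": "A", "bid": 0.50, "relevance": 0.9},
    {"advertiser": "B", "bid": 2.00, "relevance": 0.4},
    {"advertiser": "C", "bid": 1.25, "relevance": 0.7},
]

for slot, ad in enumerate(rank_ads(ads), start=1):
    print(f"slot {slot}: advertiser {ad['advertiser']} (bid ${ad['bid']:.2f})")
# The highest bidder ("B") wins the top slot even though its content
# is the least relevant of the three -- money, not quality, decides.
```

The point of the sketch is the incentive structure: when placement is a function of spending, content that pays well can outrank content that serves users better.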

For Example

Racial Stereotypes: If a company is running an ad campaign that plays into racial stereotypes, search engines may display those advertisements prominently when users search for related keywords. This can reinforce and perpetuate harmful stereotypes.

The impact on marginalized communities

  • Biased search results can disproportionately affect marginalized communities, including racial and gender minorities.
  • These communities may encounter biases or harmful content when searching online.
  • Biased content in search results can contribute to discrimination and prejudice.
  • Marginalized individuals may face discrimination based on the information found in search results.

Look at it this way: 

Imagine a popular social media platform called “SocialNet,” which uses complex algorithms to curate content for its users. However, these algorithms unintentionally perpetuate biases, as described by Noble:

  1. Reinforcing Stereotypes:
    • Users on SocialNet, primarily interested in fashion, are exposed to content that consistently features models conforming to Eurocentric beauty standards. The algorithm prioritizes posts that align with these standards, reinforcing harmful stereotypes about beauty and perpetuating biases against individuals who don’t fit these norms.
  2. Underrepresentation:
    • The algorithm fails to adequately represent content from artists and creators belonging to marginalized communities. As a result, users may receive limited exposure to diverse perspectives and artistic expressions, leading to a skewed view of the creative landscape and contributing to the underrepresentation of certain groups.
  3. Misinformation and Disinformation:
    • SocialNet’s algorithms inadvertently promote misinformation related to cultural events and historical facts. For instance, during Black History Month, the platform might display inaccurate information about significant historical figures or events, contributing to the spread of misinformation about the experiences of racial and ethnic groups.
  4. Commercialization of Bias:
    • Advertisers on SocialNet, driven by profit, frequently target users based on demographic data. The algorithm, prioritizing ad revenue, might display content that reinforces stereotypes or biases to maximize engagement and ad clicks. This commercial focus can result in the propagation of biased content that caters to prevailing societal prejudices.
  5. Discriminatory Impact:
    • The biased algorithm may impact job opportunities for individuals from certain communities. For instance, if the algorithm favors resumes with specific keywords or educational backgrounds, individuals from marginalized communities may be systematically excluded from job recommendations, perpetuating discriminatory practices in employment.
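The resume-screening scenario in point 5 can be sketched in a few lines of Python. This is a deliberately crude, fictional filter — the keywords, candidates, and platform are all invented — but it shows how a rigid keyword requirement can exclude candidates whose qualifications are real but described differently.

```python
# Fictional illustration of the discriminatory screening in point 5.
# A filter that demands exact keywords systematically rejects
# candidates who took a different educational path.

REQUIRED_KEYWORDS = {"python", "bachelor"}  # invented requirements

def passes_filter(resume_text):
    """Accept a resume only if every required keyword appears verbatim."""
    words = set(resume_text.lower().split())
    return REQUIRED_KEYWORDS.issubset(words)

resumes = {
    "candidate_1": "Python developer, Bachelor of Science",
    "candidate_2": "Self-taught Python engineer, community college",
    "candidate_3": "Bootcamp graduate, strong Python portfolio",
}

for name, text in resumes.items():
    print(name, "passes" if passes_filter(text) else "rejected")
# Only candidate_1 passes; candidates 2 and 3 are rejected despite
# real Python experience, because "bachelor" never appears.
```

Nothing in the filter mentions race or class, yet it reproduces structural exclusion: the proxy variable ("bachelor") correlates with access to traditional credentials, which is Noble's broader point about facially neutral algorithms.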

This fictional example illustrates how algorithmic bias on a social media platform can manifest in various ways, aligning with Noble’s arguments about the potential consequences of biased algorithms.

Some changes did happen…

  • Google’s Response to Controversial Content: Noble mentions instances where Google has taken action to remove or demote specific content that violated its policies or generated public outcry. Google’s actions in response to public pressure can influence search results and the visibility of certain content.
  • Algorithmic Updates: While not a legal change, Noble discussed how Google frequently updates its search algorithms to improve the quality and relevance of search results. These updates can impact the ranking and visibility of content, potentially affecting the prominence of biased or discriminatory material.
  • Content Removal Requests: Noble highlights the importance of individuals and communities actively reporting biases or harmful content to search engine companies. These reports can lead to the removal or demotion of specific pages or websites from search results.

The need for transparency

Noble argues that algorithmic transparency in search algorithms is essential to understanding and addressing the issue of bias, discrimination, and the perpetuation of stereotypes in search results.

Taking action as a community

What can we do?

  • Advocate for Algorithmic Transparency
  • Demand greater transparency from search engine companies about their algorithms and content moderation practices
  • Encourage critical thinking when evaluating online information and search results.

Q&A Time, Let’s Chat!

  1. Noble suggests the need for algorithmic transparency. How might transparency in search engine algorithms help address issues of bias and discrimination?
  2. What are some practical steps that individuals can take to become more critical consumers of online information and to mitigate the impact of algorithmic bias in their own online experiences?
  3. In what ways can individuals, communities, and educators foster a culture of digital literacy that empowers people to critically assess information and recognize biased content?
  4. In the context of algorithmic bias, what can we learn from other industries or historical examples where technology has had significant societal implications?

 

 


License


Critical Digital Literacies Copyright © 2023 by queirugj is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
