5 Digital Redlining

Photo by Growtika on Unsplash

Origin of redlining

In 1934, the Federal Housing Administration (FHA) began denying loans and credit to families living in urban and Black communities. The FHA initiated redlining by quite literally drawing red lines around neighborhoods to mark which groups of people could qualify for loans and credit. Although the Fair Housing Act of 1968 prohibited discrimination in the sale of housing, we now face the same concern in the new digital era.

What is digital redlining?

The concept of digital redlining intertwines with the redlining of 1934. In this new technological era, companies target, and exclude, oppressed groups, people of color, and low-income individuals in order to make a profit.

For example, an internet company might provide excellent service to wealthier neighborhoods, where residents can most likely afford it, while lower-income neighborhoods are forced to accept slower, lower-quality internet.

Digital redlining permits old attitudes and philosophies to enter our current world through technology. People with biased standpoints create the algorithms that power our new technologies, and the result discriminates against those who have no control over the constant churn of digital advancement.

How are we being discriminated against by technology?

Photo by Twitter user @ElaineBabey

In February of 2020, a Black woman trying to get her passport picture taken kept having her photo rejected by the automated passport machine, which insisted her mouth was open and processed the photo as invalid. Finding the situation bizarre, she shared her experience on Twitter. Many of the comments on her post came from people who had faced similar incidents of racial bias: several Black women said their photos had been rejected because the color of their afro blended into the background, and an Asian man said his photo was rejected because his eyes were “deemed closed,” even though they were open.

Digital Discrimination

The Nextdoor application, launched in 2011, saw discriminatory behavior emerge as more users joined it. Essentially, the app is a networking site designed specifically for neighborhoods: residents can sell old furniture, promote businesses, look for missing dogs, and run a neighborhood watch through the app. However, the watchers were overwhelmingly going after Black people, posting warnings about Black individuals spotted riding bikes late at night, to the point where the developers had to address the constant reporting problem. Now, when users try to report suspicious activity, the app asks a series of questions such as “Is what I really saw suspicious, especially if I take race or ethnicity out of the equation?”, as sketched below.
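The shape of that intervention is easy to picture. Here is a minimal sketch in Python (hypothetical code, not Nextdoor’s actual implementation) of a reporting flow that adds friction: the report goes through only if the user affirms a set of reflection prompts first.

```python
# Hypothetical sketch of a bias-interrupting report flow.
# The prompts force the reporter to reconsider before the report is accepted.

PROMPTS = [
    "Is what I saw really suspicious if I take race or ethnicity out of the equation?",
    "Am I describing a specific behavior rather than a person's appearance?",
]

def submit_report(description: str, answers: list[bool]) -> str:
    """Accept the report only if every reflection prompt is answered 'yes'."""
    if len(answers) != len(PROMPTS) or not all(answers):
        return "Report blocked: please describe a specific behavior, not a person."
    return f"Report submitted: {description}"

# The reporter cannot affirm the second prompt, so the report is blocked.
print(submit_report("person riding a bike at night", [True, False]))
```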

Nextdoor ultimately shows how the historic pattern of racial segregation repeats itself in the new, modern digital era.

The New Jim Code is emerging

Essentially, Jim Crow began with a white actor named Thomas Dartmouth Rice, who would perform in blackface makeup as a character named Jim Crow. The name eventually became a racial slur and was attached to the laws that oppressed Black people. These laws were constructed to assert that white people were superior to Black people, built on the belief that Black people were fit to be servants and nothing else.

System of Jim Crow: 

  1. Black men could not approach White women without risking an accusation of rape.
  2. Black people could not vote.
  3. Black people were given separate services, such as Black-only restrooms.
  4. There were separate drinking fountains for White and Black individuals.
  5. Black individuals were denied access to a good education.

Ruha Benjamin’s “Race After Technology”

Ruha Benjamin, the author of “Race After Technology,” examines the New Jim Code emerging in the digital 21st century. She identifies four dimensions that investigate the segregation happening within technology.

  1. Engineered inequity: Works to amplify social hierarchies that are based on race, class, and gender.
  2. Default discrimination: Discrimination that emerges from historical patterns and predictable algorithms.
  3. Coded exposure: Technology enables various forms of visibility, and for racialized groups the problem of being watched (but not seen) relates to newfangled forms of surveillance.
  4. Technological benevolence: Tech products and services that offer fixes for social bias but still end up reproducing or deepening discriminatory processes.

Engineered Inequity

Engineered inequity is portrayed in Netflix’s Black Mirror episode “Nosedive.” The premise of the episode is that all of humanity lives under a social credit system: the more credit/points you have, the more respected you are, and points are awarded or deducted with every interaction. The protagonist, Lacie, wants to buy a bigger place to live, but her credit/point advisor suggests she look at smaller places since she does not have enough points. Lacie insists on the bigger condo, so her advisor tells her she must impress the upper class in order to earn more points.

Similarly, in the modern day and age, employers use a comparable system to screen potential candidates, checking individuals’ credit to judge whether they will be responsible employees. Companies assume that a bad credit score equates to being untrustworthy and irresponsible, a clear case of engineered inequity. The technological era benefits the upper class, while minority groups receive the negative effects. This new era is mapping out our futures for us, making decisions based on what technology suggests.

Default Discrimination 

Default discrimination is essentially the behavior of individuals who have been shaped by society’s norms; in this case, it is software programs and technology defaulting to exclude and discriminate against certain groups. As individuals we often fail to notice, either because it is out of our control or because it is the norm we were raised in. Technology is programmed with bias because developers seem unaware of the damage they are causing.
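To make the mechanism concrete, here is a minimal sketch in Python (with entirely made-up data and group names) of how a system can discriminate “by default”: nobody writes a rule about race or place, but a model that learns from historically biased records reproduces that bias automatically.

```python
# Hypothetical historical records: (group, was_reported_as_suspicious).
# The labels reflect biased over-reporting of one group, not actual behavior.
history = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True),
]

def report_rate(records, group):
    """Fraction of past records for `group` that were labeled suspicious."""
    labels = [flag for g, flag in records if g == group]
    return sum(labels) / len(labels)

def default_classifier(group):
    """Flag anyone from a group whose historical report rate exceeds 50%.

    No explicit rule mentions race or neighborhood; the model simply
    defaults to the patterns baked into its biased training data.
    """
    return report_rate(history, group) > 0.5

for group in ("group_a", "group_b"):
    print(group, "flagged:", default_classifier(group))
# group_a flagged: True   <- inherits the over-reporting bias
# group_b flagged: False
```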

In 2016, a Twitter user named Kabir Alli posted a tweet showing a Google image search. Searching for “three white teenagers” returned pictures of a generic, smiling white trio, whereas searching “three black teenagers” returned images of mugshots. Twitter users were furious with this outcome and deemed Google’s algorithms racist.

Similarly, a group of graduate students asked to search for pictures of “inappropriate hairstyles for work” received images of Black women with braids and dreadlocks.

As digital redlining becomes more frequent in the modern era, individuals are sorted into groups and discriminated against. People have little control over what is happening, because technology is advancing quickly and carelessly.

Coded Exposure 

Coded exposure is when technological advancements either ignore certain groups of people or become hypervigilant toward them; mostly, Black people. For example, when Black parents complained that their children’s faces kept coming out blurry in school pictures, the companies did not address it. They simply ignored the complaints until the same companies started having trouble photographing their own products: dark pieces of furniture, such as tables and chairs, were not being picked up clearly by the camera, and only then did they take action to fix the problem.

Ignoring Black people is the first part of coded exposure; the second is hypervigilance. This is when police officers target Black neighborhoods because, according to their statistics, Black neighborhoods have more crime. They go out of their way to surveil neighborhoods with minorities.

Structural racism forcibly surveils oppressed groups, trapping them between being unseen and ignored on one side and forcibly watched on the other. Saartjie “Sara” Baartman endured this state of being forcibly seen. She was taken from South Africa to Europe in 1810 and exhibited to the public; she was displayed, studied, and even dissected in death, her skeleton, brain, and genitals put on public view. Baartman’s horrific experience shows the connection between visual and scientific technologies, and many people know only fragments of her story from the studies they have read. The South African government demanded her remains back from the French, who first claimed to have lost them. After much more pressure, the French handed them over, though the South Africans were skeptical that the remains actually belonged to Baartman; out of respect, they decided not to test them. Her body was returned to South Africa in 2002 so she could have a proper burial.

The social theorist Michel Foucault argued that power is enforced through surveillance, and surveillance is experienced very differently by different people. We have police officers wearing body cameras to capture “what really happened,” white women reporting Black individuals for every little thing, and police departments surveilling Black neighborhoods looking for crime.

Black individuals constantly deal with the negative outcomes of new technological advancements. Whether it is being watched at every hour of the day in an attempt to criminalize them, or having the technological discrimination built by programmers ignored entirely, Black individuals bear the harmful outcomes of these new forms of surveillance.

Technological Benevolence 

Technological benevolence describes newly developed technologies that claim to mitigate social cleavages. Their intention is to acknowledge social biases, but they end up being discriminatory anyway.

Companies have started programs that use artificial intelligence to interview potential employees, with the stated goal of “reducing unconscious bias and promoting diversity” in the work environment. These programs collect data on verbal and nonverbal behavior, such as facial expressions and voice, and compare candidates against the scores of previous applicants. One such system was designed to make hiring 79% faster while sidestepping human bias.
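The exact models these vendors use are proprietary, but the core risk is easy to sketch. Below is a minimal Python illustration (with made-up features and numbers, not any vendor’s real system): if candidates are scored by similarity to past hires, whoever was historically favored becomes the template for who scores well.

```python
# Hypothetical feature vectors for past hires:
# (speech_pace, smile_rate, eye_contact), each scaled 0..1.
past_hires = [
    (0.8, 0.9, 0.7),
    (0.7, 0.8, 0.8),
]

def average(vectors):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def score(candidate, template):
    """Higher score = closer to the historical 'ideal' employee."""
    distance = sum((c - t) ** 2 for c, t in zip(candidate, template)) ** 0.5
    return 1 / (1 + distance)

template = average(past_hires)

# A candidate whose expressions differ from the historical norm -- for any
# reason, including cultural differences or a camera that reads darker skin
# poorly -- scores lower through no fault of their own.
print(score((0.75, 0.85, 0.75), template))  # matches past hires -> high score
print(score((0.40, 0.30, 0.50), template))  # different style -> low score
```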

A Princeton team is working to keep such programs from picking up biases from their creators. Nonetheless, candidates complain about AI hiring systems, saying they have no idea how they are being evaluated and that they feel watched with every movement they make, which makes the whole experience feel less human.

Companies such as Amazon have attempted to use AI to speed up the hiring process, but they worry that the computers, whether or not they are coded to be racially unbiased, will still pick up discriminatory patterns.

Like digital redlining, the New Jim Code permits old attitudes and philosophies to enter our current world through technology, because people with biased standpoints create the algorithms behind our advanced designs. In response, new approaches are being developed from a more empathetic standpoint. This is called “design thinking”: to think and empathize before creating.

What can we do to stop this discrimination?

Digital redlining inherits bias from developers who are unaware that their beliefs are being picked up and built into a new app or advancement. Action tends to be taken only when the public exposes such developments: individuals go on Twitter to post something that has deeply upset them, and the world follows the exposure. Students who take Critical Digital Literacies courses also become informed about the harms of these technological advancements. Being more aware and less ignorant can be the first step toward ending digital discrimination against all the groups who never have a voice.

 


License


Critical Digital Literacies Copyright © 2023 is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
