Coded Stereotype

May 16, 2021

Back in 2018, Reuters published a report on Amazon’s AI-based recruiting tool, which turned out to be biased against women. Similar stereotyping was seen in a predictive policing algorithm: according to the program, Black individuals were more likely to commit crimes.

Algorithms make predictions based on the data they are given, and that data is usually curated from a narrow perspective: whoever assembles it is very likely to pass on their own interpretation of how things should ideally work.

These algorithms are an amplified reflection of the existing world, a world full of stereotypes and biases. Amazon’s AI tried to find the best candidates based on the profiles of people who had already performed well in the organization, and because the majority of that workforce was male, the model learned maleness itself as a signal of success (see the sketch below).
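
To make that mechanism concrete, here is a minimal sketch of how this happens, assuming entirely synthetic data and scikit-learn. This is an illustration of the general idea, not Amazon’s actual system:

```python
# Toy illustration: a model trained on historically biased hiring
# data learns the bias itself as a predictive "signal".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
skill = rng.normal(size=n)             # genuine qualification
gender = rng.integers(0, 2, size=n)    # hypothetical encoding: 0 = female, 1 = male

# Historical labels: past hiring favoured men regardless of skill.
hired = ((skill + 1.5 * gender + rng.normal(scale=0.5, size=n)) > 1).astype(int)

features = np.column_stack([skill, gender])
model = LogisticRegression().fit(features, hired)

# Two candidates with identical skill, differing only in gender:
probs = model.predict_proba([[1.0, 1], [1.0, 0]])[:, 1]
print(f"male candidate:   {probs[0]:.2f}")
print(f"female candidate: {probs[1]:.2f}")
# The male candidate gets a noticeably higher "hireability" score.
```

Nothing in this code mentions bias explicitly; the model simply discovers that gender correlates with the historical hiring label and uses it, which is exactly how a biased past becomes a biased prediction.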

“There’s a long history of data being weaponized against Black communities.” - MIT Technology Review

When governments and corporations that influence people’s lives use these ML programs, the impact can be severe. We let machines decide who gets a job, who should be in jail, or who gets a loan, based on the stereotyped data they are trained on. Unfortunately, these decisions are purely mathematical, with no moral values behind them.

This widens the biases against particular groups of individuals and normalizes them. As a society, we restrict new possibilities and opportunities, hindering our own growth.

You see, we need policy reform along with technological progress to make these advancements more acceptable. AI, being such a revolutionary technology, requires guidelines over its applications.

I decided to write this after watching the documentary “Coded Bias” by Shalini Kantayya.

More info on algorithmic bias:

  • Algorithmic bias in facial recognition (Ted talk by Joy Buolamwini): https://youtu.be/UG_X_7g63rY
  • Weapons of Math Destruction (book & Talks at Google) by Cathy O’Neil
  • AMAs around algorithmic bias on Reddit.
  • News reports covering algorithmic bias.