bruffin8

EXTRA CREDIT - MISREPRESENTATION AND RECOGNITION

Updated: Sep 24, 2020


Joy Buolamwini, an MIT Media Lab researcher, went into the lab to build her own idea: a mirror that would show you words of positivity when you stood in front of it. What she didn’t realize was that she was about to confront the future of identity, tracked by computer data. When she tested the mirror she had brought to life, the technology only recognized her face when she wore a white mask; when she took the mask off, she was rejected. The data in our computer systems has been programmed on our past, a past shaped by racism, by the denial of African Americans’ rights, and by struggles over freedom of speech, religion, and identity. So Joy researched facial recognition and worked to develop a way to stop it before it takes over our society.

We have algorithms programmed into our computers, phones, and cameras that use past data to help the computers label a person as “high,” “medium,” or “low” risk. These AI systems are built and watched over by people, and they are used to decide who gets a job, who is fired, and who gets to own a certain house, based on facial recognition and data from computer systems. What we don’t know, and what is scary, is that the data in those computer systems can be wrong or misidentified, which means a person who is completely innocent can be flagged as a wanted person.
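To make the misidentification problem concrete, here is a minimal, hypothetical sketch of how a face-matching threshold can flag an innocent person. Nothing here comes from a real police or commercial system; the cosine-similarity matcher, the embedding numbers, and the names are all assumptions chosen only for illustration.

```python
# Hypothetical sketch: how a loose match threshold turns an innocent person
# into a "hit". Real systems use learned face embeddings; these numbers are
# made up for illustration only.

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def match(probe, watchlist, threshold):
    """Return every watchlist name whose similarity to the probe clears the threshold."""
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy embeddings (in practice these would come from a neural network).
watchlist = {"wanted_person": [1.0, 0.0, 0.0]}
innocent_student = [0.7, 0.7, 0.0]   # a different person who looks vaguely similar

print(match(innocent_student, watchlist, threshold=0.9))  # [] -- no match
print(match(innocent_student, watchlist, threshold=0.6))  # ['wanted_person'] -- false positive
```

The point of the sketch is only that the threshold, not the person, decides who gets stopped.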

Back in our history, police used fingerprints and ID cards to make sure that the citizens walking down the street or going into a restaurant were innocent people. Now that technology has become our way of communicating, and facial recognition has become a live experiment on all of us, Joy has spoken up to Congress in the United States about the racial injustices and inequalities that come from being misidentified by a computer. There have been many incidents in which young students walking home from school have been stopped by police because facial recognition flagged them as “high” risk.

What Joy did to stop it was organize a group and speak up to others that this is not okay. A computer can’t define who you are. We have rights, we have freedoms, we have justice, and we have equality, and a computer cannot score us and target innocent people. This violates our 14th Amendment rights. No algorithm defines what is just, and if humans keep training computers to target innocent people and give false information, then why should we keep technology that misidentifies faces? The algorithm does not define who we are as people. This is what Joy Buolamwini expressed to Congress, and after her testimony, many states banned facial recognition. Part of the power of being human is being vulnerable. Giving a computer the power to identify people through facial recognition, using cameras on the streets and in stores, can lead to misidentification and violate our rights as humans.


Joy helped get us to where we are today: on June 10th Amazon paused police use of its facial recognition technology, and on June 25th U.S. lawmakers introduced legislation to ban it because of its unfairness and inequality. I believe that facial recognition violates our 14th Amendment rights because of the misidentification and misrepresentation that happen when our computers tell us who we are and what we can do. This will lead to social control, and we need to act on the harmful effects of facial recognition now.
