July 30, 2018

Super Science Trends

B.I: Biased Intelligence
CW: Discussion of racism and homophobia.

On August 16th, 2017, Chukwuemeka Afigbo, a Nigerian tech worker, posted a video to Twitter of an automatic soap dispenser in a public bathroom.

The video begins with a white man’s hand under an automatic soap dispenser. Dutifully, the machine dispenses a jet of soap into his open palm. Afigbo then asks his friend Noel, a black man, to put his hand underneath the dispenser. He does, and the dispenser does nothing. Noel waves his hand up and down to try and get the dispenser to “notice” him, a typical ritual to appease the sensors, but the dispenser still does nothing.
Both men conclude that Noel’s hand is “too black” for the machine, which is calibrated to dispense only when its sensor detects enough reflected light, a bar that lighter skin clears and darker skin often doesn’t. A simple design flaw turned a modern convenience into an alienating experience. Just to hammer the point home, Noel then grabs a (white) paper towel and holds it under the dispenser. Sure enough, the machine dispenses soap onto it.
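To make the flaw concrete, here is a minimal sketch in Python of the decision logic such a dispenser plausibly runs. The threshold value, reading scale, and function name are all invented for illustration; the real firmware isn’t public.

```python
# Hypothetical sketch of a reflective-sensor soap dispenser's logic.
# An emitter shines (typically infrared) light downward; a photodiode
# measures how much bounces back, and one hard-coded threshold decides.

REFLECTANCE_THRESHOLD = 0.6  # assumed value, calibrated on light skin

def should_dispense(photodiode_reading: float) -> bool:
    """Dispense only if enough emitted light was reflected back."""
    return photodiode_reading >= REFLECTANCE_THRESHOLD

# Darker skin absorbs more of the emitted light, so its reading can sit
# below a threshold that was only ever tuned on lighter skin.
print(should_dispense(0.8))  # light skin or white paper towel -> True
print(should_dispense(0.4))  # darker skin -> False
```

Nothing in that code is malicious; the bias lives entirely in a single number that nobody thought to test against a darker hand.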
Afigbo shared the video to highlight “the importance of diversity in tech and its impact on society”, and it quickly went viral, sparking conversations about bias in tech development. We assume that machines are incapable of prejudice, but the people who build them always make value judgements about how they should work and whom they’re for. Where this gets more complicated is our current use of computer algorithms, which generalise from existing data to make broad assumptions about people. You don’t have to work in technology to see how that becomes a recipe for perpetuating biases.
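Here’s a toy illustration of that recipe, with entirely invented data and names (this is not any real system): a “model” that simply replays historical approval rates per group automates whatever bias produced those rates.

```python
from collections import Counter

# Invented history in which group_b was approved far less often.
history = (
    [("group_a", "approved")] * 80 + [("group_a", "denied")] * 20
    + [("group_b", "approved")] * 30 + [("group_b", "denied")] * 70
)
counts = Counter(history)

def predicted_approval_rate(group: str) -> float:
    """Naive model: tomorrow's decisions mirror yesterday's rates."""
    approved = counts[(group, "approved")]
    denied = counts[(group, "denied")]
    return approved / (approved + denied)

print(predicted_approval_rate("group_a"))  # 0.8
print(predicted_approval_rate("group_b"))  # 0.3 -- the old bias, automated
```

Real machine-learning systems are subtler than a lookup table, but when group membership correlates with the training labels, they can learn exactly the same shortcut.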

In a tone-deaf study from Stanford University last year, researchers ran facial recognition software over 14,000 profile pictures from an American dating website, pairing each photo with the sexual orientation listed on its profile.
The resulting software made composite images of “average” straight and queer faces, and the researchers claimed it could distinguish queer from heterosexual faces with 81% accuracy for men and 71% for women; outlets quickly dubbed it a “gaydar AI”.
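Those headline numbers deserve a squint, and a back-of-envelope Bayes’ rule check shows how little they would mean outside the lab. The base rate below is assumed, and treating 81% as both the hit rate and the correct-rejection rate is a simplification of what the study measured.

```python
# Assumed: ~7% of men are gay, and the classifier catches 81% of gay
# faces while wrongly flagging 19% of straight ones (simplifications).
base_rate = 0.07
sensitivity = 0.81
false_positive_rate = 0.19

p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_gay_given_flag = (sensitivity * base_rate) / p_flagged
print(f"{p_gay_given_flag:.0%}")  # ~24%: roughly 3 of every 4 flags wrong
```

That hardly blunts the worry below, since a persecutor may not care about false positives, but it does show how far the press coverage outran the machine.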
The Human Rights Campaign and GLAAD both heavily criticised the study for effectively creating a persecution search engine. To quote Ian Malcolm, they were so preoccupied with whether or not they could, they didn’t stop to think if they should. Even then, the data is at best only useful for benign sociological questions, like learning about dating cultures. The Stanford researchers dismissed the criticism as “lacking scientific training”, with one of them doubling down to say the study could go on to demonstrate whether facial recognition software can predict political orientation or potential criminality. Which… uuurgh, can we please not bring phrenology back into vogue within my lifetime? The only thing my furrowed brow will tell you about my character is that I have no tolerance for pseudoscience. I can’t imagine any queer researcher would be terribly concerned with determining the objective “gay face” either.
As our societies become more automated and dependent on algorithms, these issues will only become more commonplace unless they’re addressed at the design stage. Tech companies and researchers should either anticipate a diverse user base or, better yet, hire a diverse staff who can inform development with their own life experiences. It’s important to be critical of who makes this technology and for what purpose; you don’t want to wonder what will happen when you’re left to your own devices.
