Wednesday, August 17, 2016

Web Secret 428: the Bechdel Test for computer technology

The Bechdel test is a well-known measure of gender bias in movies; it originated in 1985 in Alison Bechdel's comic strip "Dykes to Watch Out For."

To pass the test, a movie must meet three criteria (sketched in code below):
  1. Two female characters (preferably named),
  2. Who talk to each other,
  3. About something other than a man.
Almost half of 2015’s top movies failed the Bechdel test.
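
For the literal-minded, here is a minimal sketch of the test as a checklist, written in Python. The Movie and Conversation structures are my own invention, purely for illustration:

# A toy sketch of the Bechdel test as a three-part checklist.
# Movie and Conversation are hypothetical structures invented for this example.
from dataclasses import dataclass, field

@dataclass
class Conversation:
    participants: list[str]   # names of the characters talking
    about_a_man: bool         # is the topic a man?

@dataclass
class Movie:
    title: str
    female_characters: list[str]    # named female characters
    conversations: list[Conversation] = field(default_factory=list)

def passes_bechdel(movie: Movie) -> bool:
    """At least two women who talk to each other about something other than a man."""
    women = set(movie.female_characters)
    if len(women) < 2:
        return False
    return any(
        len(women.intersection(c.participants)) >= 2 and not c.about_a_man
        for c in movie.conversations
    )

print(passes_bechdel(Movie("Gravity", ["Ryan Stone"], [])))  # False: only one named woman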

Think that's disturbing?

Now think about computer technology, anything having to do with it: social media, the algorithms underlying all the electronic devices we use on a daily basis, AI.

Wait!

Before you do that, let me tell you about the first social media conference I attended in 2008. Facebook was 4 years old. There were hundreds of attendees at the Javits Center - Manhattan's cavernous convention hall.

I counted exactly 5 women attendees.

Five.

It's gotten a bit better since then.

But overwhelmingly - computer tech is a male-dominated field.

And as I waded through the crowd of 25-year-old men, I thought to myself, "Wow - I have a feeling this is going to be a problem. What is the world missing out on because the female perspective is absent?"

That question has been answered every day since then:

1. No women, no color. Overwhelmingly, computers, smartphones and gizmos of every kind are grey, grey, grey. As a basis of comparison, Urban Decay's Vice Lipstick collection comes in 100 different shades.

2. Okay, that was a little sarcastic. How about the fact that most gaming consoles feature first-person shooters hunting monsters and overly voluptuous women in scanty outfits?

3. Still sarcastic. Let's get serious. How about the June 2016 New York Times editorial "Artificial Intelligence’s White Guy Problem"? I quote:

"...the very real problems with artificial intelligence today, which may already be exacerbating inequality in the workplace, at home and in our legal and judicial systems [is that] sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape [who we are.]"

The article continues:

"We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future.

Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters... Otherwise, we risk constructing machine intelligence that mirrors a narrow and privileged vision of society, with its old, familiar biases and stereotypes."
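
To make the mechanism concrete, here is a deliberately tiny, entirely fabricated sketch (Python) of what "bias built into the algorithm" looks like: a "model" that does nothing more than learn hiring rates from biased historical records will faithfully reproduce the discrimination in those records. Every record below is made up for illustration.

# A toy illustration of how a model trained on biased historical data
# reproduces that bias. All records are fabricated for this sketch.
from collections import defaultdict

# Hypothetical historical hiring records: (gender, qualified, was_hired)
history = [
    ("male", True, True), ("male", True, True), ("male", False, True),
    ("female", True, False), ("female", True, True), ("female", True, False),
]

# "Train" the simplest possible model: the observed hire rate per gender.
hired = defaultdict(int)
seen = defaultdict(int)
for gender, qualified, was_hired in history:
    seen[gender] += 1
    hired[gender] += was_hired

def predicted_hire_rate(gender: str) -> float:
    """The 'learned' probability of being hired, given only past outcomes."""
    return hired[gender] / seen[gender]

# Two equally qualified candidates get very different scores, because the model
# has faithfully learned the discrimination baked into its training data.
print(predicted_hire_rate("male"))    # 1.0
print(predicted_hire_rate("female"))  # 0.33...

A real machine-learning system is far more sophisticated than this, but the failure mode is the same: if the training data reflects old biases, the predictions will too.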


How do we get more women, more minorities, older people and other underrepresented groups involved in computer tech? It's not going to be easy.

But it has to happen.

And soon.
