Bias in the Binary

Feb 4, 2020

People aren’t the only ones who make judgments based on skin color and gender. Computers do it, too.

Artificial intelligence (AI) follows the old saying: “Garbage in, garbage out.” Bias can creep into any AI system when the data used to train it is skewed.

When an AI system is trained on biased data, it spits out racial, gender and even ideological biases in its results. What does this look like in real life?
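Here’s a toy sketch of that idea, with entirely made-up numbers: a one-feature “face detector” that learns a brightness threshold from its training examples. When one group of faces is underrepresented in training, the learned threshold ends up failing that group, even though nothing in the code mentions skin color at all.

```python
def train_threshold(face_samples, nonface_samples):
    """Learn a decision threshold: the midpoint of the two class means."""
    face_mean = sum(face_samples) / len(face_samples)
    nonface_mean = sum(nonface_samples) / len(nonface_samples)
    return (face_mean + nonface_mean) / 2

def detects_face(brightness, threshold):
    """Call anything brighter than the threshold a face."""
    return brightness > threshold

# Hypothetical feature values for this sketch only.
LIGHT, DARK, BACKGROUND = 0.80, 0.45, 0.20

# Skewed training set: 90% light-skinned faces, 10% dark-skinned.
skewed = [LIGHT] * 90 + [DARK] * 10
t_skewed = train_threshold(skewed, [BACKGROUND] * 100)

# Balanced training set: 50/50.
balanced = [LIGHT] * 50 + [DARK] * 50
t_balanced = train_threshold(balanced, [BACKGROUND] * 100)

print(detects_face(DARK, t_skewed))    # False — the dark-skinned face is missed
print(detects_face(DARK, t_balanced))  # True — balanced data fixes it
```

The skewed model never “decided” to discriminate; it simply fit the data it was given, which is exactly how garbage in becomes garbage out.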


The story of Joy Buolamwini is a powerful example. As a student at MIT, she noticed that facial analysis software couldn’t detect her dark skin until she literally put on a white mask. Only then did the software pick up on her face. Joy now claims a spot on Forbes’ 2019 30 Under 30 List for creating and leading the Algorithmic Justice League to identify and correct biases in AI. Her TED Talk is an amazing explanation of the challenge we face with AI and how to start correcting it. I’m following her lead!

It only makes sense that the people behind the computer screen accurately reflect the rest of the population looking at it. If women make up half of Earth’s population, why do they account for only 15 percent of AI research staff at Facebook, and 10 percent at Google?

It’s stats like those that persuaded me to become involved with the T.D. Jakes Foundation – to help bridge the gap between the abundant human potential around us and the millions of STEM job opportunities available! I get giddy just thinking about ways to increase diversity and gender equity in these fields. But it’s going to take all kinds of us—literally!





