How Biased AI is Holding Us Back, and Two Things We Can Do About it

Like Hollywood and Washington, D.C., Silicon Valley and the tech world must step up to combat entrenched biases.

Guest Commentary

March 5, 2018

4 Min Read

From the largest and most successful tech corporations to the smallest start-ups just finding their footing, most will agree that increasing diversity is in the best interests of customers, employees, and the general public. However, we in the tech world often fail to recognize the impact of our own biases. We sometimes think that, because our products and services are built on 0’s and 1’s, everything we put out into the world is fair and logical. Not true! This International Women’s Day (Thursday), let’s take a closer look at the biases that pervade so much of our work, as well as some of the ways we can build a culture of inclusive AI.

The theme of this year’s International Women’s Day is #PressforProgress, a call for gender parity across countries, industries, and all kinds of organizations. As we celebrate the achievements of women and push for equality, we should recognize that harmful biases affect a range of communities. Examples of biased AI algorithms that have held us back include:

  • Microsoft and IBM’s facial recognition technologies misidentifying women, especially Black women, at much higher rates than white men

  • YouTube and Twitter blocking LGBTQ content that would otherwise be deemed “safe-for-work”

  • Facebook’s “trending topics” list incorporating alleged liberal biases

  • Apple’s “animoji” feature portraying emoji created by Asian users as constantly squinting

Yes, each of these companies has pledged to address these controversial algorithms. And there is no evidence that any of these algorithms arose out of a conscious decision to exclude anyone. Still, we as an industry cannot pretend they are isolated incidents. The fact is that products and services based on artificial intelligence necessarily reflect the biases of the people who designed them.
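The disparities behind incidents like these are measurable, which is what makes them fixable. A minimal sketch of the idea, disaggregating a model's error rate by demographic group (the group names and numbers here are illustrative toy data, not any vendor's real figures):

```python
# Hypothetical bias audit: compare a model's error rate across groups.
# The records below are toy data mimicking the kind of disparity
# reported for commercial face recognition -- not real product data.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.
    Returns {group: fraction of records the model got wrong}."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Illustrative outcomes: 1 error in 100 for one group, 35 in 100 for another.
records = (
    [("lighter_male", 1, 1)] * 99 + [("lighter_male", 1, 0)] * 1
    + [("darker_female", 1, 1)] * 65 + [("darker_female", 1, 0)] * 35
)

rates = error_rate_by_group(records)
print(rates)  # {'lighter_male': 0.01, 'darker_female': 0.35}
```

An aggregate accuracy number would hide this gap entirely (overall error here is only 18%), which is why auditing performance per group, not just on average, is a practical first step for any team shipping AI.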

The good news is that this simple truth also shows us the way forward. Engaging a diverse set of stakeholders throughout the AI lifecycle -- ensuring that those who write the code, develop the algorithms, analyze the data, and communicate the results represent the diverse populations touched by them -- can help us keep biased AI out of the products and services we put into the world. Here are two of the most important ways to get started:

1. Creating a culture of inclusive AI begins with increasing access for diverse groups of young people, career-changers, and professionals interested in the field. The good news is that there are many organizations in the fight to #PressforProgress, including Black Girls Code, Girls Inc., PyLadies, and more, and we are starting to see results. One recent, promising outcome is that the number of female students who took an AP computer science exam in 2017 increased by 135% over 2016. This represents excellent progress, but we are still far from gender parity: female students made up only 27% of all students who took the exam last year. In my role as a senior data scientist at Metis, a data science training provider, I try to do my part to close the gap by seeking out diverse students to mentor, challenge, and celebrate. We also provide a bootcamp tuition scholarship for women and members of other underrepresented demographic groups. Cultivating diversity in these ways is also a winning strategy for companies. McKinsey research shows that companies in the top quartile for racial and ethnic diversity are 35% more likely to have financial returns above their national industry medians, and companies in the top quartile for gender diversity are 15% more likely to do the same. It’s not hard to connect the dots: creating a culture of inclusive AI is a win-win for individuals and the organizations they work for.

2. In addition to cultivating younger talent in the field, those of us working in AI need to include diverse senior talent in our leadership structures. I’m proud to work for a tech company where the majority of leadership roles are filled by women, but I know this is not the norm. Again, inclusion of all kinds is not only a worthy goal in terms of morality and justice; it’s also a sound business strategy. According to McKinsey, for every 10% increase in racial and ethnic diversity on the senior executive team, earnings before interest and taxes rise 0.8%. Conscious decisions to fight bias and place deserving applicants in leadership roles are among the first choices that must be made on the path toward a culture of inclusive AI that benefits everyone.

But more needs to be done. The more students from diverse backgrounds we can attract to the field, and the more we support them as they seek to develop their skills, the stronger the tide of talent will be to change our culture from the bottom up. Simultaneously, the more we include women and individuals from historically underrepresented groups in leadership roles, the greater success we will have creating a culture of inclusive AI from the top down. As we work on the next great advancements in artificial intelligence that will transform the ways we work, play, and live, it’s crucial that we not leave anyone behind.


Sophie Searcy is a Senior Data Scientist at Metis, a leading data science training provider.

About the Author(s)

Guest Commentary

The InformationWeek community brings together IT practitioners and industry experts with IT advice, education, and opinions. We strive to highlight technology executives and subject matter experts and use their knowledge and experiences to help our audience of IT professionals in a meaningful way. We publish Guest Commentaries from IT practitioners, industry analysts, technology evangelists, and researchers in the field. We are focusing on four main topics: cloud computing; DevOps; data and analytics; and IT leadership and career development. We aim to offer objective, practical advice to our audience on those topics from people who have deep experience in these topics and know the ropes. Guest Commentaries must be vendor neutral. We don't publish articles that promote the writer's company or product.
