Is Diversity Part of Your Technology Strategy?
As we move into an age of artificial intelligence, tech teams must strive to eliminate bias in their AI applications.
When innovations in intelligent automation elevated robotics from factory machines that work alongside people to smart machines that work with people, organizations began exploring the potential benefits from both an internal and external perspective.
Robotic Process Automation (RPA) and Artificial Intelligence (AI) have since paved the way for other technologies; RPA, however, was a great place to start the human/machine collaboration because it is a low-risk, relatively low-investment technology. RPA features software robots – more commonly referred to as bots – that can be programmed to mimic specific human actions, such as retrieving the data needed for tax compliance or financial statements, at high speed and with great efficiency.
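To make that concrete, here is a minimal, purely illustrative sketch of the kind of repetitive task a bot might take over. The file layout, field names and figures are hypothetical and are not drawn from any real RPA platform:

```python
import csv
from io import StringIO

# Illustrative only: a toy "bot" that mimics one repetitive human task --
# pulling the figures needed for a tax summary out of a transaction export.
# The layout, tax codes and amounts below are invented for this example.
SAMPLE_EXPORT = """date,description,amount,tax_code
2024-01-05,Office supplies,120.00,VAT20
2024-01-09,Client lunch,80.00,VAT0
2024-02-02,Software licence,300.00,VAT20
"""

def summarise_by_tax_code(export_text: str) -> dict:
    """Group transaction amounts by tax code, as a clerk would by hand."""
    totals: dict = {}
    for row in csv.DictReader(StringIO(export_text)):
        code = row["tax_code"]
        totals[code] = totals.get(code, 0.0) + float(row["amount"])
    return totals

if __name__ == "__main__":
    for code, total in summarise_by_tax_code(SAMPLE_EXPORT).items():
        print(f"{code}: {total:.2f}")
```

In a real deployment the bot would pull this data from enterprise systems rather than a hard-coded sample, but the principle is the same: a narrowly scripted, repeatable task executed at speed and scale.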
The repetitive, mundane activities that bots do best also happen to be the tasks least appealing to millennials, so automating them frees talent to focus on more strategic, value-added and purposeful work. Previously untapped talent in less visible ranks and roles – people who had not yet had the time or platform to innovate – can now emerge. A win-win.
How talent drives strategy
Co-botting – humans and robots working together – clearly offers a number of benefits, but it also poses an interesting challenge: how do businesses ensure that the implementation of new technology tools aligns with key talent objectives, such as skills training, leadership development, and diversity and inclusiveness (D&I)?
The first two might seem obvious; D&I, perhaps not so much. In truth, robotics and AI affect every layer of the D&I spectrum, from programming technology tools without amplifying biases to leveraging those tools to optimize the business benefits of D&I through recruiting, learning and teaming.
A recent article in Science Magazine highlighted the potential hazards of AI, noting that when algorithms learn words from human input, they develop biases of their own. Research with Implicit Association Tests (IATs) likewise shows that bots are susceptible to learning their own set of stereotypes from their human programmers. An IAT is a psychological testing tool that flashes images on a screen, letting people react to what they see and thereby reveal subconscious associations. In this case, for instance, the programmers unconsciously linked men with analytical and mechanical skills, and women with child care and homemaking chores.
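To illustrate the mechanics, here is a small, hypothetical sketch of how such learned associations can be measured in word embeddings. The tiny hand-made vectors below are invented for illustration and stand in for embeddings actually learned from large bodies of human-written text:

```python
import numpy as np

# Illustrative only: toy 3-dimensional "word vectors" standing in for
# embeddings learned from human text. Real studies use pre-trained
# embeddings with hundreds of dimensions; these numbers are invented.
vectors = {
    "he":        np.array([0.9, 0.1, 0.2]),
    "she":       np.array([0.1, 0.9, 0.2]),
    "engineer":  np.array([0.8, 0.2, 0.5]),
    "homemaker": np.array([0.2, 0.8, 0.5]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: higher means the model places the words closer together."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_association(word: str) -> float:
    """Positive values lean toward 'he', negative toward 'she' -- an IAT-style score."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

for word in ("engineer", "homemaker"):
    print(f"{word}: {gender_association(word):+.3f}")
```

The point is not the numbers themselves but the pattern: because the model's word positions are learned from human-generated text, human associations – including stereotypes – come along for the ride unless teams actively test for and correct them.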
In his book, The Future of Work, author Jacob Morgan advises strategy before technology. Implementing a talent-focused strategy before intelligent automation helps companies avoid the pitfalls inherent in designing machines to mimic human capabilities. As automation increases, D&I becomes a critical enabler, helping to mitigate biases in development and to minimize inequities and inaccuracies in usage and impact.
Simply put, diverse teams allow organizations to uncover blind spots, see around corners and readily identify unintended biases. Incorporating diverse and inclusive thinking within a team feeds more robust automation, and, conversely, that automation promotes best-in-class D&I objectives.
We have more to learn
Recognizing that we are still early in our journey, organizations should be diligent about acknowledging the “known unknown” — that threats of potential bias always lurk in the background — which can be done by:
Introducing organization-wide, overarching D&I reviews with technology as a factor.
Conducting formal reviews of new automation programs with an eye on factors relevant to the local market context, while staying consistent with the organization’s core values.
Introducing an outside perspective to aid in the creation and review of bot programming, providing an objective arbiter of what constitutes unbiased programming.
As with most things in business, these decisions often come down to strong leadership, unshakeable commitment and constant focus on everything that matters most to sustaining the organization’s purpose and its business objectives. Technology is not neutral, and it’s our responsibility to make it positive. To that end, we offer four principles of a strong D&I technology strategy:
Stay alert to potential bias or to the alienation of our people.
Prevent machines from “inheriting” human bias and passing it along to future generations of technology tools.
Manage the relationship between people and changing technologies, particularly in establishing and maintaining trust, and the long-term effect that technology may have on shaping human behaviors.
Actively listen to professionals to gauge their sense of belonging.
This next stage will be exciting and filled with opportunities. Being proactive and mindful of the intersection between automation and D&I as we progress along this journey will allow organizations to stay connected with their people and continue to reinforce business strategies — with people front and center.
Martin Fiore is EY Americas Tax Talent Leader and Karyn Twaronite is EY Global Diversity & Inclusiveness Officer.