Companies using technology to close the gender gap, but experts warn of pitfalls
Posted July 22, 2019 8:00 am.
Last Updated July 22, 2019 11:04 am.
When the Bank of Nova Scotia looked to brainstorm new ways to close the gender gap, it turned to what it had been deploying in nearly every other facet of its business: technology.
The Toronto-based lender analyzed a “myriad of data sets” for its Canadian employees — everything from internal social media interactions to past performance data — to come up with variables that correlated with success, said Permpreet Sidhu, Scotiabank’s vice-president of performance and inclusion.
The result was a set of key metrics and an “emerging leader index” the bank has used since November to identify employees who should be encouraged to move up the corporate ladder, while helping to strip out the unconscious bias that can sometimes drive promotion decisions.
“What we are trying to do is interrupt some of the bias that comes in, even when we are identifying who to develop 12 months before that job even becomes available… The index will put a broader slate of candidates on the radar for leaders,” said Sidhu.
It’s the latest example of technology used to combat unconscious bias in the workplace and increase diversity among the ranks.
Other strategies in the corporate sector include a tool that can detect biased language in online workplace chats, using software to strip out gender identifiers from LinkedIn pages while recruiting and employing artificial intelligence to neutralize the promotion process.
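The identifier-stripping approach, for instance, amounts to masking names and gendered words before a reviewer sees a profile. A minimal sketch of the idea in Python, with a made-up term list rather than any actual recruiting tool’s rules:

```python
# Toy sketch of gender-identifier redaction (hypothetical, not any vendor's tool).
import re

# Illustrative pronoun list only; a real tool would cover far more signals.
GENDERED_TERMS = r"\b(he|she|him|her|his|hers)\b"

def redact_profile(text: str, name: str) -> str:
    """Blank out a candidate's name and common gendered pronouns."""
    text = text.replace(name, "[CANDIDATE]")
    return re.sub(GENDERED_TERMS, "[REDACTED]", text, flags=re.IGNORECASE)

profile = "Jane Doe led the team; she shipped three releases."
print(redact_profile(profile, "Jane Doe"))
# [CANDIDATE] led the team; [REDACTED] shipped three releases.
```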
“We’re at the frontier right now, where people are running all these experiments,” said Sarah Kaplan, director of the University of Toronto’s Institute for Gender and the Economy.
“And we really don’t know what’s effective yet, but everyone is trying and they have this hope that AI or other automated algorithms might help.”
Catalyst Group, an organization dedicated to promoting equal rights for women in the workplace, in March launched a plug-in for the Slack messaging platform called BiasCorrect.
It is programmed to detect 25 words and phrases with a gender bias, such as “she is so pushy.” Once installed, BiasCorrect automatically flags the potentially problematic phrase to the Slack user and offers an alternative such as “she is so persuasive.”
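The underlying pattern, matching text against a list of flagged phrases and proposing a neutral alternative, is simple to sketch. The phrase list and function below are hypothetical and not drawn from BiasCorrect’s code:

```python
# Toy sketch of a BiasCorrect-style checker (hypothetical, not Catalyst's code).
BIASED_PHRASES = {
    "pushy": "persuasive",
    "bossy": "assertive",
    "abrasive": "direct",
}

def flag_bias(message: str) -> list[str]:
    """Return a suggestion for each flagged term found in the message."""
    suggestions = []
    for biased, neutral in BIASED_PHRASES.items():
        if biased in message.lower():
            suggestions.append(f'"{biased}" may carry gender bias; consider "{neutral}".')
    return suggestions

for note in flag_bias("She is so pushy in meetings"):
    print(note)  # "pushy" may carry gender bias; consider "persuasive".
```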
Serena Fong, Catalyst’s vice-president of strategic engagement, said they have made the underlying code public to allow users to add more phrases or adapt it for other messaging platforms.
Catalyst hopes that this plug-in will help people become more aware of their unconscious biases and the impact of their words, and contribute to a more inclusive workplace overall.
“There is no quick fix to the problem of unconscious bias,” she said.
However, there is concern the technology can be just as biased as the person who programmed it or the underlying information used.
“AI is not some panacea… We just have to be very thoughtful about it. It’s not like you can remove the hand of the human just by applying these bots,” Kaplan said. “The bots might actually be amplifying bias.”
Two professors, from the Massachusetts Institute of Technology and London Business School, conducted a real-world experiment showing that a gender-neutral online ad for a STEM (science, technology, engineering and math) job posting was shown more often to men than to women.
MIT’s Catherine Tucker and London’s Anja Lambrecht ran an ad on Facebook, Instagram, Twitter and other sites through Google’s ad display network. On each site, an algorithm optimized the ad to get the most views, which resulted in more male eyeballs than female, according to an article in Scientific American.
Women generally make more household purchasing decisions than men, so marketing algorithms place a premium on female views of an ad, making them more expensive to buy. Showing the ad to men was simply more cost-effective, Tucker told Scientific American.
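The incentive is easy to see with invented numbers: an optimizer told to maximize impressions per dollar will favour whichever audience is cheaper to reach, with no gender targeting at all. The prices below are illustrative, not figures from the study:

```python
# Illustrative only: invented per-view prices, not data from the experiment.
budget = 100.0                # ad spend in dollars
cost_per_view_women = 0.05    # hypothetical: female impressions priced higher
cost_per_view_men = 0.03      # hypothetical: male impressions priced lower

# An optimizer maximizing views per dollar prefers the cheaper audience.
views_if_women = budget / cost_per_view_women   # 2,000 views
views_if_men = budget / cost_per_view_men       # ~3,333 views
print(views_if_men > views_if_women)            # True: delivery skews male
```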
Kaplan said stripping out names and other identifiers that would signal an applicant’s gender may have unintended consequences.
The underlying data — such as the schools on a candidate’s resume — may be influenced by gender, she added.
“In fact, their gendered experience throughout their entire careers has shaped all their steps along the way, and you really need to account for that in the selection process,” she said.
Scotiabank has been mindful of this risk and has eliminated some data from its index as a result.
The bank has chosen to remove education from its model and instead use on-the-job experiences, Sidhu said.
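Scotiabank has not published its model, but the general move, dropping a field suspected of encoding bias before fitting a predictor on job-experience data, can be sketched roughly as follows; the column names and figures here are hypothetical:

```python
# Hypothetical sketch only: the bank's actual model and fields are not public.
import pandas as pd
from sklearn.linear_model import LogisticRegression

employees = pd.DataFrame({
    "education_tier": [3, 1, 2, 3, 1, 2],   # suspected of encoding inherited bias
    "projects_led":   [2, 5, 3, 1, 4, 6],   # on-the-job experience
    "roles_rotated":  [1, 3, 2, 1, 2, 4],
    "promoted":       [0, 1, 1, 0, 1, 1],   # outcome the model tries to predict
})

# Drop the field suspected of carrying bias, keeping experience-based features.
features = employees.drop(columns=["education_tier", "promoted"])
model = LogisticRegression().fit(features, employees["promoted"])
print(dict(zip(features.columns, model.coef_[0].round(2))))
```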
“You have to be aware and you have to be responsible with the data,” she said. “And eliminate those fields that inherently have bias, so that you’re not continuing to exacerbate the challenges.”
Armina Ligaya, The Canadian Press