Can AI be sexist? You bet it can.

It’s tempting to assume that AI in banking is non-sexist and cannot be otherwise. After all, AIs are genderless, so how could they possibly identify with or prefer one gender over another? 

Except, well, AIs mine databases. If biases happen to be built into data-gathering processes—and not necessarily by design—AIs can hardly avoid emerging with similar biases. 

In her article for Finextra, “Women in finance: How AI is shining a light on diversity,” Senior Reporter Madhvi Mavadiya shines a much-needed spotlight on the basic problem as it pertains to hiring: 

… technology can be as biased as humans if it replicates past hiring decisions and in the past, AI recruitment tools have realised that it discriminated against women because it attempted to find employees like its current workforce, namely, men.

Mavadiya’s focus is on hiring, but it’s not unreasonable to wonder if similar prejudices find their way into other AI-driven financial decision-making, such as product development and targeted marketing. 
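The mechanism Mavadiya describes is easy to demonstrate. Here is a minimal sketch, with entirely made-up data and a deliberately naive scoring rule, of how a "find employees like our current workforce" model replicates past bias: if nine of ten historical hires were men because of biased past decisions, a similarity-based score will favor male candidates even though gender says nothing about qualifications.

```python
# Hypothetical illustration: a naive "hiring model" that scores candidates
# by similarity to past hires inherits whatever bias shaped those hires.

# Synthetic history: 9 of 10 past hires were men -- the result of biased
# past decisions, not of qualifications.
past_hires = [{"gender": "M"}] * 9 + [{"gender": "F"}] * 1

def similarity_score(candidate, history):
    """Fraction of past hires sharing the candidate's gender -- a
    stand-in for 'find people like the current workforce'."""
    matches = sum(1 for h in history if h["gender"] == candidate["gender"])
    return matches / len(history)

# The model prefers men 9-to-1, purely because the training data does.
print(similarity_score({"gender": "M"}, past_hires))  # 0.9
print(similarity_score({"gender": "F"}, past_hires))  # 0.1
```

Real recruitment models are far more sophisticated, but the principle is the same: a model optimized to reproduce historical decisions reproduces their biases, including ones encoded indirectly through proxy features.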

Sexism is certainly good at passing under the radar in other areas. In her article “Can snow plowing be sexist? Yes it can,” Susan R. Madsen, Ed.D., Orin R. Woodbury Professor of Leadership & Ethics in the Woodbury School of Business at Utah Valley University and Founding Director of the Utah Women & Leadership Project, wrote that in Karlskoga, Sweden,

… local officials were trying to implement gender-equality initiatives and someone remarked that at least snow removal was safe from “the gender people.” But as the data were analyzed, they discovered that the transportation patterns of men and women were different, and that only men were considered in snow plowing decisions.

Madsen cites other examples:

… when mostly male doctors outlined the typical symptoms of heart attacks in medical textbooks, it went unnoticed that women present very differently. As a result, women suffering heart attacks today are still treated half as often as men because their symptoms aren’t “typical.” And recent studies show that because crash test dummies are based on male specifications, accidents result in more injuries for women … when data are not collected on all genders, decisions are not benefiting all residents.

What started me mulling this topic was this year’s International Women’s Day. It was on March 8, and celebrations took place around the world. (Check out this piece in The Guardian for not just stunning but inspiring photos.) But then the pandemic hit, demanding immediate attention, which is why I’m writing about International Women’s Day only now.

I might add that I find it not a little daunting to write about women’s issues, since, as you may have gathered from my name and photo, I hardly know what it’s like to be a woman. But I was born to a woman, I’m married to one, and I have two daughters, so at the very least I can rightfully claim that women’s issues matter to me.

In his book A People’s History of the United States, historian Howard Zinn wrote:

It is possible, reading standard histories, to forget half the population of the country. The explorers were men, the landholders and merchants men, the political leaders men, the military figures men. The very invisibility of women, the overlooking of women, is a sign of their submerged status.

Yes, we’ve made progress. But then, the United States still hasn’t passed the Equal Rights Amendment, which the Senate sent to the states for ratification 48 freaking years ago. The 19th Amendment, which guarantees a woman’s right to vote, barely squeaked by only a century ago. My parents’ generation hotly debated whether women should be “permitted” to work outside the home, a debate that continues in many parts of the country. Only recently have law enforcement and the courts considered that maybe, just maybe, a woman should have a right not to be sexually harassed, a notion that many yet greet by digging in their heels. And the United States Congress apparently remains unconvinced that equal pay for equal work is fair and long overdue. 

It doesn’t help that mythology about innate, sex-linked abilities refuses to die. No, women aren’t intrinsically better at communication and intrinsically worse at math and science than men—but telling them that they are can have a deleterious effect. In her book Delusions of Gender, Cordelia Fine describes an experiment in which the mere act of having women note their gender on a form tended to lower their confidence in their own math abilities. 

I was about to say “don’t get me started.” Too late. 

Anyway: We in the financial services industry need to recognize that when data are sexist, AIs that mine them are doomed to be sexist as well. Right now, while AI is still in its infancy, is the time to make sure we don’t let that (continue to) happen. If fairness isn’t enough of a motivator, consider what else Madhvi Mavadiya pointed out:

… companies with more balanced gender ratios outperform their peer segments. On this, Tracey Davies, president of Money20/20, states that “for a financial services industry in the midst of a generational shift in technology, business models and consumer expectations, the benefit of more diverse leadership perspectives—gender and otherwise—can only improve the outcome for customers, regulators and stakeholders.”

It’s a rare opportunity when profit and fairness intersect. Let’s not waste it.
