AI is permeating the business world today the way electricity did at the turn of the last century. We can already see that businesses that put AI at their core can reach more people more quickly and make more money. As Harvard Business School professors Marco Iansiti and Karim Lakhani show in their book, Competing in the Age of AI: Strategy and Leadership When Algorithms and Networks Run the World, Ant Financial serves more than 10 times as many customers as the largest US banks with fewer than one-tenth as many employees.
The fuel for artificial intelligence is data, which is by definition historic. Unless we are careful about which data sets the AI learns from, we might end up creating algorithms for a Nazi utopia.
AI Amplifies Bias
In her book, Invisible Women: Exposing Data Bias in a World Designed for Men, Caroline Criado Perez cites a 2017 study of image datasets. In the training data, images of cooking were over 33% more likely to involve women than men. Yet the algorithms trained on this dataset connected pictures of kitchens with women 68% of the time.
The higher the original bias, the stronger the amplification effect. In fact, the algorithm labeled a balding man standing in front of a stove as female: to the algorithm, standing near a stove had become a stronger indicator of being a woman than baldness was of being a man.
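To see how this amplification can happen, here is a minimal, purely illustrative sketch in Python. The numbers and labels are invented, not taken from the 2017 study: a naive classifier that simply memorizes the most common label for each feature turns a 65/35 skew in its training data into a 100/0 skew in its predictions.

```python
# Toy illustration of bias amplification (invented numbers, not the study's).
from collections import Counter, defaultdict

# Hypothetical training data: 65% of "kitchen" images are labeled "woman".
training_data = [("kitchen", "woman")] * 65 + [("kitchen", "man")] * 35

# A naive classifier: for each feature, memorize the most common label.
def train(data):
    counts = defaultdict(Counter)
    for feature, label in data:
        counts[feature][label] += 1
    return {feature: labels.most_common(1)[0][0] for feature, labels in counts.items()}

model = train(training_data)

# The 65/35 skew in the data becomes 100/0 in the predictions:
# every future kitchen image is labeled "woman".
print(model["kitchen"])  # -> woman
```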
The gender pay gap stubbornly remains at around 20%. Women receive less than 1% of venture capital funding in the UK and just 3% in the US. Is this what we want to recreate at scale?
When we create AI algorithms, we have the chance to create the world we want to live in.
Developers Can’t Fix the Problem Alone
We cannot leave algorithm design to developers alone, not least because too few of them are women. In the US, women hold only a quarter of the jobs in computing and leave the tech sector at twice the rate of men. The numbers are even lower for women of color: Black women hold just 3% of computing jobs and Latina women just 1%.
Even with the best intentions, developers left unchecked will bake bias into their algorithms.
In 2014, Amazon began creating computer programs to make hiring developers easier. One of the algorithms began by analyzing the patterns in resumes submitted to the company over a decade. Since most successful applicants were male, the algorithm decided that being male was a deciding factor. It went on to penalize applications from women and downgrade graduates of all-female colleges.
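How does that kind of penalty emerge? The sketch below is not Amazon's system; it is a hypothetical toy screener with invented resumes, showing how scoring candidates against historical hiring outcomes can turn a word like "women's" into a handicap.

```python
# Hypothetical toy resume screener (NOT Amazon's system; the data is invented).
from collections import defaultdict

# Invented history: (words in the resume, whether the candidate was hired).
history = [
    ({"python", "chess", "club"}, True),
    ({"java", "football"}, True),
    ({"python", "women's", "chess"}, False),
    ({"java", "women's", "college"}, False),
]

# Count how often each word appears in hired vs. rejected resumes.
hired, rejected = defaultdict(int), defaultdict(int)
for words, was_hired in history:
    for word in words:
        (hired if was_hired else rejected)[word] += 1

def score(resume_words):
    # Words common in past rejections drag the score down, so "women's"
    # becomes a penalty purely because of who was hired before.
    return sum(hired[word] - rejected[word] for word in resume_words)

print(score({"python", "chess"}))             # higher score
print(score({"python", "women's", "chess"}))  # lower score, same skills
```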
This algorithm was eventually dropped and the team that made it was disbanded. But how many women had to lose out on opportunities for that to happen?
Now for Some Good News
It is the responsibility of non-technical professionals and regulators to understand the basics of how algorithms and technology products are made. The good thing is that it is not that hard.
An algorithm is simply a set of rules for computers to follow. The algorithm for the area of a triangle is (base x height)/2. As long as you know the basics of how algorithms are made and how to question the data that goes into them, you can have a major impact.
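Written out for a computer, that same rule is only a few lines of code. Here it is in Python, purely as an illustration:

```python
# The triangle-area rule from the text, expressed as a tiny Python function.
def triangle_area(base, height):
    """Return the area of a triangle: (base x height) / 2."""
    return (base * height) / 2

print(triangle_area(10, 4))  # -> 20.0
```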
My favorite subject at college was Ancient Greek drama, and I began my career in PR. Yet, I have built a technology company and participated in creating algorithms. I do not write the code, but I work on the logic and question the data.
The technology industry is one of the few sectors to emerge stronger from this crisis. As our work and social activities have moved online, the tech sector has been the beneficiary. Women have brains. Women have voices. Let’s use our brains to learn about the business of tech and let’s use our voices to shape it.
If you do, not only will you boost your career, you will also help shape a fairer society.