Today, we in the developed world live facilitated lives, and the degree of “help” we get from algorithmically empowered machines is increasing at a remarkable rate. Be it in our car rides to work, in auto-correcting our spelling, or simply in suggesting what we might like to buy next, in each instance we are being intellectually lifted to new levels by mathematical tools running behind the scenes. It is striking how quickly these changes have entered society, how readily we have adopted them and, in many cases, how rapidly we have become dependent upon them. So dependent that we often overlook their omnipresence, preferring instead to ignore just how much our data trails teach others about us, or how much they remind us of our own limitations.
This story in fact starts much further back than you might think. The very first algorithm designed to run on a machine was written by Ada Lovelace in 1843. Breaking with all the conventions of her day, Ada, the daughter of the poet Lord Byron, an English Baron from whom she was estranged, was deeply schooled in mathematics and science by the top tutors of her time. She excelled early on and provided key insights into the design and potential of the first general-purpose computer ever designed – Charles Babbage’s Analytical Engine. But perhaps most remarkable is how Ada imagined that these computational machines could do far more than arithmetic. Before a computer even existed, she saw how one could use computation and algorithms to create art, interpret music and much, much more. Ada called this “poetical science.”
Babbage’s engineering project hit bumps. Ada offered her friend more than her powerful intellectual contributions; she offered to finance him as well, but Babbage declined. Underfunded, the Analytical Engine was never completed and thus never had a chance to run Ada’s code. Computing as we now know it had to wait a full century to be reborn.
But why? To begin to understand Babbage’s self-defeating behavior, it helps to consider heuristics – the shortcuts by which we think, learn and decide. Our brains must process a tremendous volume of inputs, and as a result a substantial fraction of our intellectual processing happens subconsciously, relying heavily on what are called cognitive associations. These are mental patterns that define our expectations; they are built to help us make quick decisions and to protect us from potential dangers, and they form a large part of the habit loops that shape our actions. But the very power and utility of cognitive associations can also set the stage for deep and recalcitrant biases.
In 1835, women were rarely taught mathematics or the sciences. Indeed, it was commonly believed that women were simply incapable of understanding such technical concepts. Yet Ada was given access to this training and, just like many men, she not only understood but excelled. It is fair to say that this truly challenged Babbage’s expectations of Ada’s capabilities. Beyond her tremendous analytical skills, Ada had the means, and made the offer, to underwrite the cost of making his machine possible, without even needing her husband’s involvement. Yet this, too, could not fit into Babbage’s mental model for Ada, a woman. Babbage’s bias blocked his imagination, and the project to build the world’s first working computer failed as a result.
Biases derived from cognitive associations are not limited to perceptions of women. Studies show that when an identical wine is served in bottles labeled from coveted wine regions versus regions not known for wine, tasters score the two wines remarkably differently, with the wine labeled from the prestigious region perceived as much better. Beyond that, those same drinkers report that their food and their dinner guests are more enjoyable than do those who had to consume the “inferior” but identical wine.
Studies of hiring bias report similar findings. Identical resumes that differ only in the applicant’s name receive profoundly different numbers of interview callbacks: resumes bearing stereotypically white male names are far more effective than those bearing names perceived as African American.
Pen names have been used for centuries and for diverse purposes – in some cases to let a single author write in different genres without confusing their readership, but in other cases to let women writers overcome the cognitive bias that writing was a male bastion.
Thankfully, we are making some progress in resetting the cognitive associations from which societies and individuals operate. But these changes are difficult and very, very slow to achieve. Resetting them requires mindfulness and active leadership participation to make diversity the new mental benchmark. Yet as straightforward as that might seem, it is in fact more complicated.
Having a diverse workforce is only a start. To build new cognitive associations, we need the members of those diverse workforces to truly succeed. Studies clearly show that when individuals feel they are underqualified, their actual performance suffers as well. The converse is also true: individuals who are encouraged to believe they are particularly well suited for a task do in fact outperform equally competent but less confident peers – a phenomenon often called the Pygmalion or Rosenthal effect. To ensure that the exceptional, well-trained women and minority graduates entering the workforce from STEM programs enjoy successful careers, leaders must be mindful of bias and its implications for employee encouragement and mentorship programs.
Diversity and inclusion are not simply fairness or ethical objectives; they are truly a competitive imperative. Teams that bring competent and diverse thinking are demonstrably more impactful, and the world has never needed them more. To reach higher levels of performance, we must collectively “out” the subconscious elements that are holding us back and build, together, the winning teams of the future. To “win” at diversity we must “win” at inclusion – by making certain that everyone on the team understands the importance of their role and is fully embraced in maximizing their contribution.
It is impossible to imagine what the world might look like today had computing enjoyed a full 100-year head start, with Ada’s algorithms leading the way.