Article by Rebecca Danks, Product Owner at Yapily

It’s no secret that the technology industry has a diversity problem. This is particularly visible within product and development teams in fintech – where products developed by small groups of people carry inherent biases that have a knock-on effect on how customers experience financial products and services.

This matters all the more because payments sit at the centre of most customer experiences, so we have to ensure we don’t alienate whole groups of customers before we even begin.

Biased data leads to biased decisions

We’ve already seen a number of high-profile cases where products have proved inherently biased. Last year, Apple came under fire when its Goldman Sachs-backed credit card was found to offer men higher credit limits than women – even when they had equal credit ratings. Or take the feature in Nikon’s cameras designed to warn the photographer when the person being photographed is blinking – it was found to misidentify people of certain ethnicities as blinking.

Though biased decision-making isn’t unique to AI, the growing scope of its impact on fundamental aspects of our lives makes it vital to ensure these biases are mitigated. However, while hiring more women and people from ethnic minority backgrounds to develop fintech products will help remove some gender and ethnic biases, it isn’t quite that simple.

Most machine learning algorithms are trained using large, annotated datasets. In natural language processing, for example, standard algorithms are trained on datasets made up of billions of words. Often these datasets are compiled by scraping the internet for relevant data, which is then annotated by humans. It’s these methods that can unintentionally introduce gender, ethnic and cultural biases into datasets.

Often this results in certain groups being overrepresented or underrepresented compared to others. For example, historically, men had the highest salaries and therefore built the strongest credit lines. So it’s not surprising that any lending decision made by algorithms trained on data sets of historical credit lines would be inherently biased in favour of men. There’s simply more data out there pertaining to men having strong credit histories.
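To make that concrete, here is a minimal, purely illustrative sketch in Python – the data, thresholds and feature names are all invented for the example – of how a model trained on skewed historical lending records can end up treating two otherwise identical applicants differently:

```python
# Purely illustrative: synthetic "historical" lending data in which men are
# overrepresented and were approved more readily than women on the same income.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# 80% of historical applicants are men (group=1), 20% women (group=0).
group = rng.choice([1, 0], size=n, p=[0.8, 0.2])
income = rng.normal(30, 8, size=n)  # income in £k

# Historical approvals: income mattered, but women were approved only half
# as often as men at the same income level.
approved = (income > 28) & ((group == 1) | (rng.random(n) < 0.5))

# Naively training on this history, with group as a feature, bakes the bias in.
model = LogisticRegression().fit(np.column_stack([income, group]), approved)

# Two applicants with identical incomes, differing only by group:
probs = model.predict_proba([[30, 1], [30, 0]])[:, 1]
print(f"P(approve | man) = {probs[0]:.2f}, P(approve | woman) = {probs[1]:.2f}")
```

The point is not the specific numbers, but that the skew in who appears in the training data – and how they were historically treated – flows straight through to the model’s decisions.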

To take another example, Google’s translation tool was found to change phrases referring to women into “he said” or “he wrote” when translating news articles from Spanish to English. The same inherent gender bias appeared when the tool translated into languages where words are gendered – with words such as “nurse” automatically rendered as feminine while “doctor” was translated as masculine. The flaw, which Google has since remedied, inadvertently replicated gender biases present in the existing translations on the internet that the tool learned from.

Customer-led product development is key

While a lack of diversity, especially with regard to women, can’t be ignored within technology product development, the causes of machine learning biases are more nuanced. When creating new products, the ultimate goal has to be that they are accessible, can be used by everybody and offer something people actually need. That requires a diverse workforce, but it also requires putting the customer at the heart of all product development.

That’s because even in the most diverse workforce there will be some level of unconscious bias. So it’s important to check for unconscious bias at all stages of product development and to ensure that everyone in the team has had unconscious bias training. Alongside this, finding out what your end users actually want is vital when trying to remove biases from development, as it helps avoid building products from within an echo chamber.

Once products have been sent to market, it’s equally important that the teams who maintain the service also keep the users front of mind. They need to ensure that any machine learning algorithms are carefully audited so feedback loops don’t occur and biases are stopped from creeping in.
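As a sketch of what such an audit might look like in practice – the column names are hypothetical, and the 0.8 threshold is simply the commonly cited “four-fifths” rule of thumb rather than a prescribed standard – a team could periodically compare approval rates across groups in the decision log:

```python
# A minimal post-launch audit sketch: compare approval rates across groups in
# a log of recent decisions and flag large gaps for human review.
# Column names ("gender", "approved") are hypothetical.
import pandas as pd

def approval_rates(decisions: pd.DataFrame, group_col: str = "gender") -> pd.Series:
    """Share of approved applications per group."""
    return decisions.groupby(group_col)["approved"].mean()

def disparate_impact_ratio(rates: pd.Series) -> float:
    """Lowest group approval rate divided by the highest; 1.0 means parity."""
    return rates.min() / rates.max()

# Illustrative data only.
log = pd.DataFrame({
    "gender":   ["f", "m", "f", "m", "m", "f", "m", "m"],
    "approved": [0,   1,   1,   1,   1,   0,   1,   1],
})

rates = approval_rates(log)
if disparate_impact_ratio(rates) < 0.8:  # the "four-fifths" rule of thumb
    print("Potential bias: review before retraining on this data\n", rates)
```

Running checks like this on live decisions, before those decisions are fed back into training data, is one way to keep the feedback loop from amplifying an existing skew.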

Ultimately, there’s no silver bullet for overcoming biases in fintech product development. But they can be mitigated by diverse teams who clean the historical datasets used to train models and apply common sense to the final decision. By catching these inherent biases during development, we can stop the negative knock-on effects customers would otherwise feel when they use financial products and services.
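One small example of what “cleaning” a historical dataset can mean in practice – a rough sketch rather than a recipe, with a hypothetical column name – is reweighting records so that no group dominates training simply because it dominates the historical data:

```python
# Rough sketch: weight each historical record inversely to its group's
# frequency, so under-represented groups are not drowned out during training.
import pandas as pd

def balanced_sample_weights(df: pd.DataFrame, group_col: str = "gender") -> pd.Series:
    """Standard balanced weighting: len(df) / (n_groups * group_count)."""
    counts = df[group_col].value_counts()
    return df[group_col].map(lambda g: len(df) / (len(counts) * counts[g]))

# Most scikit-learn estimators accept these via fit(X, y, sample_weight=...).
```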

About the author

Rebecca Danks is the Product Owner at Yapily, an enterprise connectivity company. Starting her career in Innovation at Nationwide, Rebecca has been building and developing fintech products for more than four years. During her time at Nationwide, she shaped and built product concepts to improve the financial wellbeing of millennials. Since joining Yapily, her focus has been delivering the product roadmap of new bank integrations, enabling companies and customers worldwide to benefit from open banking.
