
Article by Cindy White, CMO, Mitek Systems

Human-computer interfaces are almost never built with transgender and non-binary people in mind.

Facial recognition software is a booming industry, projected to be worth USD 8.5 billion by 2025. It is found in everything from banking apps to airport security. Yet instead of making life easier, these systems reinforce existing biases and discriminate against these groups.

To create a fairer and more equal playing field in how artificial intelligence (AI) technologies impact our lives, we must listen, learn and act. As we wave goodbye to Pride Month, it’s high time to put our collective education on equality into practice in a world where digital access for all is a daily requirement.

Equal is equal. Why should some identities be recognised better than others?

A lack of inclusivity in algorithm design has produced digital exclusion, driven by unintentional bias that biometric systems learn from their data.

Typically, facial recognition software determines the user’s gender simply by scanning their face and assigning a male or female label, based on data the machine has previously analysed and learned from. Superficial features, such as the amount of makeup on the face or the shape of the jawline and cheekbones, place the user’s gender into a binary category. As a result, these systems cannot properly identify non-binary and trans people or reliably match them to their official identity. This is exacerbated by the fact that one third of those who have transitioned do not have updated IDs or records. As it stands, most facial recognition tech can’t accommodate these differences.
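The structural problem described above can be sketched in a few lines. This is a purely illustrative toy, not any vendor’s actual system: real AGR models are deep networks, and the feature names and weights here are invented. The point is that however sophisticated the model, a hard-coded two-value output space cannot represent a non-binary identity.

```python
# Illustrative sketch only: the weights and feature scores below are
# made up. The structural flaw is the return type -- exactly two labels.

def classify_gender(jawline_score: float, makeup_score: float) -> str:
    """Toy stand-in for an automatic gender recognition (AGR) model.

    Whatever the input, the output admits only two values, so the
    system has no way to represent a non-binary identity.
    """
    score = 0.6 * jawline_score + 0.4 * makeup_score  # arbitrary "learned" weights
    return "female" if score > 0.5 else "male"

# Every possible input collapses into one of two categories:
print(classify_gender(0.9, 0.8))  # prints "female"
print(classify_gender(0.1, 0.2))  # prints "male"
```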

This is not an issue isolated to one brand of technology: several researchers have shown how ineffective facial recognition technology generally is at recognising non-binary and transgender people, including Os Keyes’ paper on automatic gender recognition (AGR) systems.

The inability to identify people within these groups has real-world consequences. It can lead to people being mistreated, denied financial products and services, or facing issues with the government or police due to misidentification. People who aren’t represented lose the ability to be acknowledged and to fight for their freedoms and rights.

Inclusion starts with research and development

The key to solving this begins with noting that biometrics are not in themselves biased; they are not making decisions based on human values. Bias and inequality in biometric technologies are caused by a lack of diverse demographic data, bugs, and inconsistencies in the algorithms. For example, if the training data primarily includes information related to just one demographic, the learning models will disproportionately focus on the characteristics of that demographic.
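The effect of skewed training data can be demonstrated with a deliberately simple example. The group labels, the 95/5 split, and the “majority-class” model below are all invented for illustration; the lesson is that a model can look accurate overall while failing completely for an under-represented group.

```python
from collections import Counter

# Toy training set: 95% of examples come from group "A",
# only 5% from the under-represented group "B".
train_labels = ["A"] * 950 + ["B"] * 50

# A lazy model that always predicts the majority class scores 95%
# overall accuracy -- while being wrong for every group-B person.
majority = Counter(train_labels).most_common(1)[0][0]

overall_acc = sum(lbl == majority for lbl in train_labels) / len(train_labels)
group_b_acc = sum(lbl == majority for lbl in train_labels if lbl == "B") / 50

print(majority)      # prints "A"
print(overall_acc)   # prints 0.95
print(group_b_acc)   # prints 0.0
```

A headline accuracy figure therefore says little about fairness; performance has to be reported per demographic group before a system can be called inclusive.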

To build equitable access for all communities using biometrics, we need to think through which biometric is appropriate for proving identity, and which layers of biometrics are most appropriate for proving that the person is real. Organisations must think beyond the current framework.

Though technology cannot fully solve biases against minority groups, there are opportunities to create new technologies that may help address some of this discrimination. The key is to ensure that trans and non-binary people are involved in the design process. Researchers from the Universities of Michigan and Illinois conducted design sessions with 21 members of the community. They detailed four types of trans-inclusive technologies: those for changing bodies, changing appearances and/or gender expressions, safety, and finding resources. The result: centering trans people in the design process made the designs more inclusive.

Let’s listen

Success starts with listening before acting. As society strives for equality, safety, and fairness for everyone, it’s crucial that those who will benefit most from the technology changes are at the heart of the design process.

About the author

An international marketing executive with extensive experience across B2B and B2C, Cindy White has a proven record of innovation and leadership and a passion for building brands and product marketing. As Mitek Chief Marketing Officer, Cindy leads the company’s global marketing, brand, communications, product marketing, customer acquisition and partner programs.

Before joining Mitek, Cindy was Vice President of Marketing at FICO, where she developed a deep interest and expertise in fraud prevention. Previously, she was Director of Worldwide SMB Marketing for Microsoft, leading a global team chartered with the roll out of Office 365 and supporting the success of more than 85 million customers worldwide.

Passionate about leadership, with more than 25 years of experience in technology, Cindy has developed a reputation for building high-performing teams in complex and fast-paced environments. A native of South Africa, Cindy holds a degree in communications from the Chartered Institute of Marketing Management in Johannesburg.


WeAreTechWomen covers the latest female centric news stories from around the world, focusing on women in technology, careers and current affairs.