Opinion: Will gender equality be the victim as artificial intelligence drives development?

The lack of a gender justice lens could prove to be a major hurdle to inclusive development, especially in Artificial Intelligence applications.

Gender balance in Artificial Intelligence (AI) applications is crucial to the goal of bringing about positive social change through the use of technology. As AI becomes ever more widespread, gender diversity in platform, data and AI governance can offer solutions to gender inequities, protect and empower communities facing gender-related violence, and support diversity in the technology industry.

Some crucial issues which we need to look at are:

The right to the internet

As per a UNESCO report, women have a very low share of advanced technology jobs, which involve the non-routine, cognitive tasks that are in demand in the digital economy. This is partly a result of lack of access to technology, a major problem in India. Data suggests that only 46% of Indian women between the ages of 15 and 65 own a mobile phone, as compared to 56% of Indian men.

One way for Indian law to further the rights of women is by recognising a right to the internet. In 2019, the Kerala High Court recognised that mobile phones, and internet access through them, are part and parcel of day-to-day life and an essential part of the infrastructure of freedom of speech and expression.

The court looked at resolutions adopted by the United Nations Human Rights Council and the General Assembly, which unequivocally assert that internet access plays a key role in access to information and is closely linked to education and knowledge. The court took the view that the right to access the internet falls under Article 21 of the Constitution, which guarantees the right to life and the right to personal liberty.

However, in a subsequent petition filed in the Supreme Court for the restoration of 4G internet services in Jammu & Kashmir, the government argued that the right to access the internet is not a fundamental right. On January 10th, 2020, the Supreme Court reaffirmed that access to the internet is a fundamental right of the people of Jammu and Kashmir. The apex court said that it comes under Article 19 of the Constitution (protection of certain rights regarding freedom of speech) and that an internet ban or orders under Section 144 can be imposed only when unavoidable.


Read more: Helping boys and men become change agents in the gender equality movement


Role of labour

Recent research by the IMF and the Institute for Women’s Policy Research found that women are at a significantly higher risk of displacement than men due to job automation. A further study by the World Economic Forum highlights that over 57% of the jobs set to be displaced by digital automation between now and 2026 belong to women, especially the mid-level, routine, cognitive jobs where women dominate. Workplace culture compounds the problem: in a Gender Balance Workforce Survey of women working in the gaming industry in the UK, 45% felt that their gender was a limiting factor in their career progression, and 33% said they had faced harassment or bullying because of their gender.

This calls for an urgent relook at how women are included in the technology industry, from hiring at the executive level to hiring more non-male programmers. As a first step towards this objective, California passed Senate Bill No. 826, which mandates that a minimum number of women be included on corporate boards. Today in the US, women make up almost half (47%) of the workforce, but they hold less than one-third (28%) of the leadership positions in tech companies.

This emphasises the need for greater participation of women and gender experts in formulating principles at the foundational level. Representation of women in technical roles, globally and in tech companies’ boardrooms, also needs to improve. Companies thus need to create robust gender-inclusive Artificial Intelligence principles, guidelines and codes of ethics to enable this.

Common principles that companies include in their policies are transparency, fairness and responsibility. Take ‘fairness’: a study reveals that, to date, there is no unified definition of algorithmic fairness. And more feminist principles, such as access to the internet, language inclusion, information to make informed decisions, and privacy, are still not common in the world of tech companies.

A well-known example of biased Artificial Intelligence is Amazon’s secret experimental hiring tool, which ended up discriminating against female candidates. The algorithm had been trained to vet applicants using resumes submitted to the company over a 10-year period, most of which came from men. The resulting bias was so skewed in favour of male candidates that the system penalised resumes containing the word “women”. Even after Amazon edited the program to make such terms gender-neutral, the algorithm still allegedly favoured male candidates, and the company eventually scrapped it from its recruiting process.
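The mechanics of such bias are easy to reproduce. Below is a minimal sketch, not Amazon’s actual system, of how a text classifier trained on historically male-skewed hiring outcomes can attach a negative weight to a gendered word; the tiny synthetic resumes and outcome labels are invented purely for illustration.

```python
# A toy resume screener trained on skewed historical outcomes (hypothetical data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "software engineer chess club captain",       # historically hired
    "backend developer rugby team member",        # historically hired
    "data analyst women's chess club captain",    # historically rejected
    "frontend developer women's coding society",  # historically rejected
]
hired = [1, 1, 0, 0]  # past decisions the model learns to imitate

vec = CountVectorizer()
X = vec.fit_transform(resumes)
clf = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women": a negative coefficient
# means the model penalises resumes containing it.
idx = vec.vocabulary_["women"]
print("learned weight for 'women':", clf.coef_[0][idx])
```

With this data the printed weight is negative: the model has learned to penalise the word “women” purely from the historical outcomes, with no explicit rule telling it to do so.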

Will gender equality fall prey as AI drives social change? Representational image. Courtesy: theaiorganisation.com

Data sets, the starting point

A data set is a collection of data that is treated as a single unit by the computer processing it; the separate pieces of data within it are used to train an algorithm to recognise patterns across the whole set. Data sets are the first step in creating any AI model and hence crucial to ensuring the model is free of bias.
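To make the definition concrete, here is a minimal sketch of that data-set-to-prediction pipeline using scikit-learn; the feature, values and relationship are all hypothetical.

```python
# Train a trivial model on a tiny data set and predict a new value.
from sklearn.linear_model import LinearRegression

hours_online = [[1], [2], [3], [4]]  # one piece of data per customer
purchases = [10, 20, 30, 40]         # the quantity we want to predict

model = LinearRegression().fit(hours_online, purchases)

# The model has learned the pattern inside the data set and extrapolates it.
print(model.predict([[5]]))  # approximately [50.]
```

Whatever regularities, or biases, exist in those training rows are exactly what the model will reproduce.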

Yet many AI data sets, even “apparently” women-centric ones, fail to reflect the fact that women are a multifaceted and heterogeneous group facing diverse realities: women living in rural and remote areas, indigenous women, women from ethnic or religious minorities, women living with disabilities or with HIV/AIDS, and so on.

For instance, the word ‘Chamar’, as used on Twitter, is not just a casteist slur; its use is punishable under the Scheduled Castes and the Scheduled Tribes (Prevention of Atrocities) Act, 1989. Yet social spaces today are structurally organised in a manner that women from different castes and classes experience discrimination and violence very differently. This intersectionality, the relationship between gender and other axes of discrimination, is an important aspect that data sets need to take into account to avoid exclusions and biases.
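One practical way to act on this is to audit a training data set for intersectional coverage before any model is built. Below is a minimal sketch using pandas; the column names and rows are hypothetical.

```python
# Cross-tabulate two identity dimensions to surface blind spots in the data.
import pandas as pd

df = pd.DataFrame({
    "gender":   ["woman", "woman", "woman", "man", "man", "man"],
    "location": ["urban", "urban", "urban", "urban", "rural", "rural"],
})

# A zero (or near-zero) cell, here rural women, is an exclusion the trained
# model will inherit and reproduce.
print(pd.crosstab(df["gender"], df["location"], normalize="all"))
```

The same cross-tabulation can be extended to caste, disability or any other dimension along which exclusion is suspected.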


Read more: “Policy changes can help make Bengaluru an equal city for women”


Any AI-generated pattern, prediction or recommendation is a reflection of the accuracy, universality and reliability of the data used, and of the inherent assumptions and biases of the developers of the algorithms that use this data.

Most designers, coders or developers start with a standard user in mind and, in doing so, set in motion patterns of discrimination hidden under the assumption that the system views the user in a neutral manner. Given that only 22% of professionals in AI and data science are women, and that they are more likely to occupy jobs associated with lower status, gender bias can and does creep into AI systems.

For example, if an online platform wants to identify its highest-purchasing customers in order to give them additional benefits, an AI model would be built with details of those customers as its data set. The quality of the result, however, depends on ensuring there are no biases or blind spots in that data. In 2015, Google’s photo-categorisation software was found to be labelling Black people as gorillas. Google corrected this with a band-aid solution, removing the gorilla label altogether, but the incident remains an important example of racial bias in the data sets fed to an AI.
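Returning to the customer example, here is a minimal sketch of how a blind spot in the data silently skews the outcome; the customers and figures are hypothetical (for instance, customers whose purchases are made in cash and never logged by the platform).

```python
# Rank customers by spend recorded on the platform (hypothetical data).
import pandas as pd

customers = pd.DataFrame({
    "customer": ["A", "B", "C", "D"],
    "recorded_spend": [500, 450, 120, 90],  # C and D mostly pay cash offline
})

# Ranking only on recorded data quietly excludes customers the platform
# never measured properly, however much they actually spend.
print(customers.nlargest(2, "recorded_spend"))
```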

A 2019 UNESCO report shows that gender biases are present in Artificial Intelligence (AI) data sets in general and in training data sets in particular. It is important to overcome this built-in bias and to ensure that data sets represent all populations, especially those the AI will affect. At a more granular level, humans generate, collect and label the data that goes into data sets, and humans determine what variables and rules the algorithms learn from these data sets to make predictions. Both stages can introduce biases that then become embedded in AI systems.
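The labelling stage in particular can be checked directly. Here is a minimal sketch with pandas, using hypothetical annotator-assigned labels.

```python
# Compare the positive-label rate across groups before training.
import pandas as pd

labels = pd.DataFrame({
    "gender": ["woman", "woman", "woman", "man", "man", "man"],
    "label":  [0, 0, 1, 1, 1, 0],  # 1 = judged "qualified" by human annotators
})

# A sharp gap between groups suggests the bias entered at the labelling
# stage, before any algorithm was involved.
print(labels.groupby("gender")["label"].mean())
```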

Most designers, coders or developers start with a standard user in mind and, in doing so, set in motion patterns of discrimination. Representational image by Gerd Altmann/Pixabay (CC)

How gender bias creeps in

For instance, “hers” is not recognised as a pronoun by some of the most widely used Natural Language Processing (NLP) technologies, including Amazon Comprehend, the Google Natural Language API, and the Stanford Parser. Another shocking example came in 2019, when Apple’s credit card was found to offer smaller credit lines to women than to men with similar credit scores. The company stated that its algorithm was gender-blind, yet the algorithms used to set limits turned out to be biased against women.
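Claims like the pronoun one are straightforward to audit. Below is a minimal sketch using spaCy (an assumption for illustration, not one of the tools named above; it requires `pip install spacy` and the `en_core_web_sm` model to be downloaded).

```python
# Audit a part-of-speech tagger's treatment of gendered pronouns.
import spacy

nlp = spacy.load("en_core_web_sm")

for sentence in ("The book is his.", "The book is hers."):
    doc = nlp(sentence)
    # A tagger with the bias described above would fail to mark "hers"
    # as PRON while tagging "his" correctly.
    print([(token.text, token.pos_) for token in doc])
```

The same two-sentence probe can be run against any tagger or cloud NLP API to check for asymmetric handling of gendered forms.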

Preventing such gender biases in software applications calls for better corporate governance. This includes diversity in hiring and retention practices, and a work culture where gender equality principles are explicit and accountability is prioritised.

A computer is only as good as the people behind it. That is a fundamental aspect that needs to be kept in mind while training and implementing AI solutions for better gender equality.

Looking inward, during the pandemic, India failed to address the needs of women or the issue of their access to the internet and digital services.  

India must look at creating fulfilling roles for women in digital services. Currently, women’s voices are hardly represented in sectors like cloud computing or digitally deliverable services. Societies must be more receptive to the needs of gender minorities and enable forms of co-ownership, where previously overlooked communities can take a seat at the table and build a better digital future for all.
