Artificial Intelligence and gender equality
A study by the Berkeley Haas Center for Equity, Gender and Leadership analysed 133 AI systems across different industries and found that about 44 per cent of them showed gender bias, and 25 per cent exhibited both gender and racial bias.
Beyza Doğuç, an artist from Ankara, Turkey, encountered gender bias in Generative AI when she was researching for a novel and prompted it to write a story about a doctor and a nurse. Generative AI creates new content (text, images, video, etc.) inspired by similar content and data that it was trained on, often in response to questions or prompts by a user.
The AI made the doctor male and the nurse female. Doğuç continued to give it more prompts, and the AI consistently chose gender-stereotypical roles for the characters, associating certain qualities and skills with male or female characters. When she asked the AI about the bias it exhibited, it explained that the cause was the data it had been trained on and, specifically, “word embedding” – the way words are encoded in machine learning as numerical representations that capture their meaning and their associations with other words; it is how machines learn and work with human language. If an AI is trained on data that associates women and men with different, specific skills or interests, it will generate content reflecting that bias.
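To make the “word embedding” idea concrete, the short Python sketch below is a toy illustration with invented vector values, not real trained embeddings: it shows how similarity between word vectors can encode exactly the doctor/nurse association Doğuç observed.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity of two vectors: 1.0 = same direction, 0.0 = unrelated.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical 3-dimensional embeddings. Real models learn vectors with
    # hundreds of dimensions from large text corpora; these invented values
    # mimic a biased corpus in which "doctor" co-occurs mostly with male
    # words and "nurse" with female words.
    embeddings = {
        "man":    np.array([0.9, 0.1, 0.2]),
        "woman":  np.array([0.1, 0.9, 0.2]),
        "doctor": np.array([0.8, 0.2, 0.6]),
        "nurse":  np.array([0.2, 0.8, 0.6]),
    }

    for occupation in ("doctor", "nurse"):
        for gendered in ("man", "woman"):
            sim = cosine_similarity(embeddings[occupation], embeddings[gendered])
            print(f"similarity({occupation}, {gendered}) = {sim:.2f}")
    # Prints "doctor" closer to "man" and "nurse" closer to "woman" -- the
    # stereotypical pairing a model trained on such data would reproduce.

Widely used embedding models trained on web text have been shown to carry similar occupational gender associations, which is why the composition of the training data matters so much.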
“Artificial intelligence mirrors the biases that are present in our society and that manifest in AI training data,” said Doğuç, in a recent interview with UN Women.
Who develops AI, and what kind of data it is trained on, both have gender implications for AI-powered solutions.
Sola Mahfouz, a quantum computing researcher at Tufts University, is excited about AI, but also concerned. “Is it equitable? How much does it mirror our society’s patriarchal structures and the inherent biases of its predominantly male creators?” she reflected.
Mahfouz was born in Afghanistan, where she was forced to leave school when the Taliban came to her home and threatened her family. She eventually escaped Afghanistan and immigrated to the U.S. in 2016 to attend college.
As companies scramble for more data to feed their AI systems, researchers from Epoch claim that tech companies could run out of the high-quality data used to train AI by 2026.
Natacha Sangwa is a student from Rwanda who participated in the first coding camp organized under the African Girls Can Code Initiative last year. “I have noticed that [AI] is mostly developed by men and trained on datasets that are primarily based on men,” said Sangwa, who saw first-hand how that impacts women’s experience with the technology. “When women use some AI-powered systems to diagnose illnesses, they often receive inaccurate answers, because the AI is not aware of symptoms that may present differently in women.”
If current trends continue, AI-powered technologies and services will continue to lack diverse gender and racial perspectives, a gap that will result in lower-quality services and biased decisions about jobs, credit, health care and more.