By: Matison Miller
Published on: May 4, 2023
Gender biases are deeply ingrained in American society, limiting women’s career options.[1] Educational institutions, employment opportunities, and familial structures reinforce gender stereotypes and impose reductionist assumptions about the characteristics and roles of men and women.[2] Despite decades of work to close the employment gap between men and women, forty percent of individuals surveyed by the United Nations feel that men make better business executives and have more of a right to jobs than women when jobs are scarce.[3] As employers develop and rely more heavily on tools created to streamline the hiring process, like artificial intelligence (“AI”) programs, the threat of perpetuating systemic gender stereotypes increases. Most AI programs in use today learn from data or text examples supplied by programmers and generate responses based on that training. AI therefore reflects the biases of its designers, and its decision-making is limited by the datasets it is given.[4] Without adequate attention, programs created to assist in lengthy employment processes, like ChatGPT,[5] may harm women’s ability to secure viable employment or to sue for sex discrimination.
Studies have already shown that the design and use of AI programs can significantly harm women’s lives.[6] Flawed AI programs may amplify pre-existing biases, preventing women from entering the workforce. For example, the head of the computation and language lab at U.C. Berkeley “got the bot to write code to say only white or Asian men would make good scientists.”[7] Employers could create a program that results in hiring only specific groups of men, discriminating against women and other minority groups. Professor Melanie Mitchell, an artificial intelligence researcher at the Santa Fe Institute, explains that these problems stem from programs relying on data or input[8] that may be racist or sexist.[9] Gender gaps can widen because of misinformed algorithms, which are often trained on limited datasets that underrepresent women from the start.[10] Currently, very few women work in tech, and the working conditions ensure that not many will stay.[11] The exclusion of women from the tech industry creates flawed datasets that lack representative data, resulting in AI that perpetuates implicit bias.[12] A simplified sketch of this dynamic appears below.
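To illustrate how skewed training data produces skewed screening, consider the following minimal sketch. The resumes, feature names, and hiring labels are entirely hypothetical and invented for illustration; the point is only that a standard statistical model trained on biased historical decisions will score otherwise identical candidates differently.

# A minimal, hypothetical sketch of how a hiring model trained on biased
# past decisions reproduces that bias. The data and features are invented;
# this is not any vendor's actual screening system. Requires scikit-learn.
from sklearn.linear_model import LogisticRegression

# Each row: [years_experience, gender_correlated_feature (1 = female-coded)].
# Labels reflect biased historical decisions: equally qualified
# female-coded candidates were mostly rejected.
X_train = [
    [5, 0], [6, 0], [4, 0], [7, 0],
    [5, 1], [6, 1], [4, 1], [7, 1],
]
y_train = [1, 1, 1, 1, 0, 0, 1, 0]

model = LogisticRegression().fit(X_train, y_train)

# Two resumes identical except for the gender-correlated feature:
print(model.predict_proba([[6, 0], [6, 1]])[:, 1])
# The female-coded resume receives a lower score despite identical
# qualifications: the model has learned the historical bias, not merit.

Nothing in this toy model is labeled “gender,” yet the bias survives because the model treats any gender-correlated feature as a proxy, which is precisely why a limited or unrepresentative dataset cannot be cured by simply omitting a gender field.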
Women may attempt to sue their employers for sex discrimination resulting from flawed AI but will likely find it difficult to succeed. Sex-based discrimination during the hiring process takes place when an employer intentionally discriminates against a qualified candidate based on their sex.[13] Title VII of the Civil Rights Act prevents employers from using race, color, religion, sex, or national origin as a motivating factor for any employment practice, such as hiring, firing, or promoting an employee.[14] To bring a claim, a woman must establish that the AI explicitly discriminated against her during the screening process.[15] This may make the already difficult task of succeeding in an employment discrimination claim even harder.[16] An algorithm may analyze tone and vocabulary to determine whether a candidate exhibits the preferred traits compared to the job description.[17] Gender-coded language, words that are commonly associated with either men or women,[18] often creates gender biases that inhibit a woman’s job prospects; thus, when a job description’s language is male-coded, the algorithm may be more likely to choose a male candidate, as the sketch following this paragraph illustrates.[19] A woman challenging sex-based discrimination caused by gender-coded language must then prove to the court the link between that language comparison and explicit discrimination against her. This provides shaky ground for a successful lawsuit and may give corporations room to engage in discriminatory hiring practices without recourse. Therefore, as AI develops, designers should train these systems on diverse, representative datasets to prevent implicit bias against women in the workplace.
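To make the gender-coded-language mechanism concrete, here is a minimal sketch of a screening heuristic. The word lists are short, illustrative samples (hypothetical, not the lexicons used in published research or commercial tools), and real screening systems are far more elaborate.

# A minimal, hypothetical sketch of flagging gender-coded language in a job
# description. The word lists below are invented samples for illustration.
MASCULINE_CODED = {"competitive", "dominant", "aggressive", "decisive", "ambitious"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal", "loyal"}

def classify_language(text: str) -> str:
    """Label text as male-coded, female-coded, or neutral by word counts."""
    words = {w.strip(".,;:()!?").lower() for w in text.split()}
    masc = len(words & MASCULINE_CODED)
    fem = len(words & FEMININE_CODED)
    if masc > fem:
        return "male-coded"
    if fem > masc:
        return "female-coded"
    return "neutral"

posting = "We seek a competitive, decisive self-starter to drive aggressive growth."
print(classify_language(posting))  # -> "male-coded"

An algorithm that ranks resumes against a posting flagged “male-coded” will tend to favor candidates whose own materials mirror that vocabulary, yet nothing in such a comparison produces the kind of explicit, provable link to sex that a Title VII plaintiff must establish.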
[1] U.N. Dev. Program, Almost 90% of Men/Women Globally are Biased Against Women, UNDP (Mar. 5, 2020), https://www.undp.org/press-releases/almost-90-men/women-globally-are-biased-against-women.
[2] See Ayesha Nadeem et al., Gender Bias in AI: A Review of Contributing Factors and Mitigating Strategies, ACIS 2020 Proceedings 1, 6 (2020), https://aisel.aisnet.org/acis2020/27; see also Gender Bias in Search Algorithms Has Effect on Users, New Study Finds, N.Y.U. (July 12, 2022), https://www.nyu.edu/about/news-publications/news/2022/july/gender-bias-in-search-algorithms-has-effect-on-users–new-study-.html (showing that study participants, both male and female, judged members of the professions studied as more likely to be men than women).
[3] See U.N. Dev. Program, supra note 1 (concluding close to ninety percent of men and women hold some sort of bias against women).
[4] Antony Brydon, Why AI Needs Human Input (And Always Will), Forbes (Oct. 30, 2019), https://www.forbes.com/sites/forbestechcouncil/2019/10/30/why-ai-needs-human-input-and-always-will/?sh=11cc4b855ff7.
[5] ChatGPT is a language model developed by OpenAI. ChatGPT gathers information from datasets of text, the internet, and various other sources to generate a response to users’ questions. Users can ask ChatGPT to write a set of code, summarize books, design crochet patterns, or compose a song. Natalie, ChatGPT General FAQ, OpenAI, https://www.help.openai.com/en/articles/6783457-chatgpt-general-faq.
[6] See, e.g., Carmen Niethammer, AI Bias Could Put Women’s Lives at Risk – A Challenge for Regulators, Forbes (Mar. 2, 2020), https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/?sh=17a0d001534f.
[7] See Davey Alba, OpenAI Chatbot Spits Out Biased Musings, Despite Guardrails, Bloomberg L. (Dec. 8, 2022), https://www.bloomberg.com/news/newsletters/2022-12-08/chatgpt-open-ai-s-chatbot-is-spitting-out-biased-sexist-results (noting that OpenAI has since updated ChatGPT to respond “it is not appropriate to use a person’s race or gender as a determinant of whether they would be a good scientist.”).
[8] Data like job descriptions, statistics, or progress reports.
[9] Alba, supra note 7.
[10] See id. (stating only twenty-two percent of AI professionals globally are women).
[11] See Ashton Jackson, 38% of Women in Tech Plan on Leaving their Job Within the Next 2 Years – Here’s Why, CNBC (Nov. 24, 2021), https://www.cnbc.com/2021/11/24/38percent-on-women-in-tech-plan-on-leaving-their-job-in-the-next-2-years.html (reporting that, in a study of 1,000 women working in the tech industry, 38% of participants said they plan to leave their jobs in tech altogether and 46% said their organizations do not actively promote gender equality in hiring and culture).
[12] See generally Gary D. Friedman & Thomas McCarthy, Employment Law Red Flags in the Use of Artificial Intelligence in Hiring, ABA (Oct. 1, 2020), https://www.americanbar.org/groups/business_law/publications/blt/2020/10/ai-in-hiring/ (explaining that employers are gravitating more towards the use of AI algorithms to ease the hiring process by inputting candidate qualifications and hoping that AI will identify qualified candidates).
[13] See, e.g., Sex-Based Discrimination, U.S. Equal Emp. Opportunity Comm’n, https://www.eeoc.gov/sex-based-discrimination (forbidding discrimination regarding any aspect of employment including hiring, firing, pay, job assessments, promotions, layoff, training, fringe benefits, and any other term or condition of employment).
[14] See Civil Rights Act of 1964 tit. VII, 42 U.S.C. § 2000e-2 (requiring the challenging party to provide demonstrative evidence of sex-based discrimination).
[15] See, e.g., McDonnell Douglas Corp. v. Green, 411 U.S. 792, 802 (1973) (establishing that a claimant must establish a prima facie case of discrimination showing that he belongs to a minority group, he applied and was qualified for a job for which the employer was seeking applicants, he was rejected, and after his rejection, the position remained open and the employer continued to seek applicants); see also Price Waterhouse v. Hopkins, 490 U.S. 228, 249 (1989) (holding that the employer must have made the decision because the applicant or employee was a woman and the employer was acting on the basis of a belief that a woman could not fulfill the role due to gender stereotypes).
[16] See Maryam Jameel & Joe Yerardi, Workplace Discrimination is Illegal. But Our Data Shows it’s Still a Huge Problem, Vox (Feb. 28, 2019), https://www.vox.com/policy-and-politics/2019/2/28/18241973/workplace-discrimination-cpi-investigation-eeoc (explaining that because workplace discrimination can often manifest in subtle ways, such as the assignments workers are given, the pay or benefits they receive, and the ways their performance is judged or rewarded, most discrimination claims fail due to insufficient evidence of unequal treatment).
[17] See Friedman, supra note 12 (using HireVue, a video interview platform, as an example of how an AI platform can analyze applicant data).
[18] For example, “fireman” or “policeman.”
[19] See generally Christine Ro, The Coded Language that Holds Women Back at Work, BBC (Aug. 3, 2021), https://www.bbc.com/worklife/article/20210730-the-coded-language-that-holds-women-back-at-work (explaining that the language used in job applications, websites, and marketing materials often deters qualified women from applying for and succeeding in certain positions).