Algorithmic Bias and Gender Representation: Feminist Perspectives on AI-Driven Marketing
Authors
Manav Rachna International Institute of Research Studies Faridabad (India)
Article Information
DOI: 10.47772/IJRISS.2024.916SCO0020
Subject Category: Artificial Intelligence
Volume/Issue: 9/16 | Page No: 206-215
Publication Timeline
Submitted: 2025-10-17
Accepted: 2025-10-22
Published: 2025-11-06
Abstract
The growing use of Artificial Intelligence (AI) and Machine Learning (ML) in marketing enables increasingly personalized content and targeted advertising, but it also raises concerns about algorithmic bias, particularly in gender representation. Trained on biased historical data, AI systems can reinforce gender stereotypes and exclude women and non-binary individuals from marketing campaigns. This paper examines the implications of algorithmic bias in AI-driven marketing from a feminist perspective, drawing parallels with critiques of gender portrayal and discrimination in literature. Feminist theories hold that technology is shaped by the biases of its designers, a dynamic visible in AI-powered marketing, where biased algorithms produce advertisements that reinforce traditional gender stereotypes. The study shows how these biases in digital advertising further marginalize and objectify already marginalized groups, and it draws parallels between the struggle for gender equality in storytelling and the challenges faced in the digital world. Virginia Woolf's A Room of One's Own explores persistent issues of gender discrimination, dominance, and representation that remain relevant today; its emphasis on women's autonomy and accurate portrayal mirrors the current movement toward diverse and empowering AI models. The research highlights the need for a feminist and intersectional approach to addressing bias in marketing algorithms, stressing the importance of diverse training data and transparency in building ethical AI systems. Bringing literature into these discussions is crucial to reshaping societal norms toward a more equitable environment in AI-driven marketing. The paper concludes that collaboration among technologists, policymakers, and feminist scholarship is essential to ensuring fair and diverse gender representation in AI marketing.
Keywords
Artificial Intelligence Marketing, Algorithmic Bias, Machine Learning, Gender Discrimination