
www.rsisinternational.org
INTERNATIONAL JOURNAL OF RESEARCH AND INNOVATION IN SOCIAL SCIENCE (IJRISS)
ISSN No. 2454-6186 | DOI: 10.47772/IJRISS | Volume IX Issue XVI October 2025 | Special Issue on Sociology
resulting in the devaluing of marginalized communities and the perpetuation of systemic injustices. Viewing this
issue through a feminist lens highlights how societal biases are embedded in technology, emphasizing the need
for ethical AI that prioritizes diversity and equitable representation. Virginia Woolf's extended essay A Room
of One's Own offers a compelling comparison to this ongoing discussion. Woolf maintains that for women
to contribute meaningfully to society, they must have financial autonomy and the ability to think and innovate
freely. Similarly, in the field of AI-based marketing, promoting diversity in terms of gender representation
requires breaking down structural barriers that favor dominant perspectives in data collection and algorithmic
decision-making. Just as Woolf criticizes the exclusion of women from literary and intellectual spaces, feminist
critiques of AI reveal the exclusion of marginalized voices in the digital economy. Therefore, addressing
algorithmic bias is not simply a technical issue, but a vital matter of equity, empowerment, and the right to shape
the digital landscape.
Works Cited
1. Gorska, A. M., & Jemielniak, D. (2023). The invisible women: uncovering gender bias in AI-generated
images of professionals. Feminist Media Studies, 23(8), 4370–4375.
https://doi.org/10.1080/14680777.2023.2263659
2. Crenshaw, K. (2021). Demarginalizing the intersection of race and sex: A Black feminist critique of
antidiscrimination doctrine, feminist theory and antiracist politics. Droit et société, 108, 465.
3. Shrestha, S., & Das, S. (2022). Exploring gender biases in ML and AI academic research through
systematic literature review. Frontiers in Artificial Intelligence, 5, 976838.
4. Chouldechova, A., & Roth, A. (2018). The frontiers of fairness in machine learning. ACM Conference
on Fairness, Accountability, and Transparency, 117–122.
5. Kannan, S., Allen, K., Mishra, S., & Patel, J. (2021). Gender classification and intersectional bias in
AI: Review, challenges, and mitigation strategies. Frontiers in Big Data, 4, 33.
6. The Role of Generative AI in Shaping Human Rights and Gender Equity: A Critical Analysis
7. Dahya, N., Jenson, J., & Fong, K. (2017). (En)gendering videogame development: A feminist approach
to gender, education, and game studies. Review of Education, Pedagogy, and Cultural Studies, 39(4),
367–390. https://doi.org/10.1080/10714413.2017.1344508
8. Nyrup, R., Chu, C. H., & Falco, E. (2023). Digital ageism, algorithmic bias, and feminist critical
theory. Oxford University Press.
9. Hanci, Ö. Gender bias of artificial intelligence.
10. Ongena, S., & Popov, A. (2016). Gender bias and credit access.
https://www.researchgate.net/publication/312068282_Gender_Bias_and_Credit_
11. Nadeem, A., et al. (2020). Gender bias in AI: A review of contributing factors and mitigating
strategies. https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1048&context=acis2020
12. Kelly, S., & Mirpourian, M. (2021). Algorithmic bias, financial inclusion, and gender.
https://www.womensworldbanking.org/wp-content/uploads/2021/02/2021_Algorithmic_Bias_Report.pdf
13. Thongtep, K. (2024). The Role of AI in Marketing: Gender Bias Problems in Thailand. Feminist AI.
Retrieved from https://feministai.pubpub.org/pub/qbkkxes2
14. Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. In Ethics
of Data and Analytics (pp. 296–299). Auerbach Publications.
15. Adam, A. (1995). A feminist critique of artificial intelligence. European Journal of Women's Studies,
2(3), 355–377. https://doi.org/10.1177/135050689500200305
16. Adam, A. (1998). Artificial Knowing: Gender and the thinking machine. Routledge.
17. Adams, A., & Berg, J. (2017). When home affects pay: An analysis of the gender pay gap among
crowdworkers. https://ssrn.com/abstract=3048711
18. Smith, R. A. (2002). Race, gender, and authority in the workplace: Theory and research. Annual
Review of Sociology, 28(1), 509–542. https://doi.org/10.1146/annurev.soc.28.110601.141048
19. Stathoulopoulos, K., & Mateos-Garcia, J. C. (2019). Gender diversity in AI research. Nesta - The
Innovation Foundation. https://doi.org/10.2139/ssrn.3428240
20. Thelwall, M. (2018). Gender bias in machine learning for sentiment analysis. Online Information
Review, 42(3), 343–354. https://doi.org/10.1108/OIR-05-2017-0153