International Journal of Research and Innovation in Social Science



Integrating Social Exchange and Objectification Theories in AI-Driven Platforms for Enhancing Gender-Based Violence Reporting and Virtual Psychosocial Support in Zimbabwean Higher and Tertiary Education Institutions.

July Ndemo1; Monica Madyembwa2; Jemitias Mapira3*

1,3Department of Physics, Geography and Environmental Science, School of Natural Science, Great Zimbabwe University, Masvingo, Zimbabwe.

2Department of Mathematics and Computer Science, School of Natural Science, Great Zimbabwe University, Masvingo, Zimbabwe.

*Corresponding author

DOI: https://dx.doi.org/10.47772/IJRISS.2025.903SEDU0454

Received: 29 July 2025; Accepted: 04 August 2025; Published: 03 September 2025

ABSTRACT

Gender-Based Violence in Zimbabwean Higher and Tertiary Education Institutions remains a significant issue, with many survivors reluctant to report incidents due to stigma, fear of retaliation, and institutional inaction. While various interventions exist, there remains a gap in understanding how theoretical frameworks can inform the development of artificial intelligence-powered platforms for secure reporting and virtual psychosocial support. This study addresses this gap by integrating Social Exchange Theory and Objectification Theory to guide the creation of an artificial intelligence-driven platform for Gender-Based Violence (GBV) disclosure. Social Exchange Theory posits that survivors of gender-based violence weigh the potential advantages of reporting against its possible drawbacks. Objectification Theory, in turn, explores how the societal tendency to reduce women to mere physical objects contributes to the internalization of shame, which reduces the likelihood that survivors will seek help. A stratified sampling methodology was used to ensure a comprehensive and representative sample, with strata defined by key demographic variables such as age, gender, academic discipline, and socio-economic background. Data collection involved focus group discussions and in-depth interviews. Preliminary findings indicated that 30% of participants believed that incorporating these theories into the AI model would enhance survivors’ willingness to report GBV by addressing concerns related to cost-benefit analysis and societal objectification. Additionally, 35% emphasized the necessity of cognitive-behavioral therapy (CBT)-based virtual support to improve mental health outcomes, while 25% highlighted the importance of embedding technological diffusion frameworks to ensure effective adoption and sustainability. The study recommends a multidisciplinary approach to AI model development, incorporating sociological, psychological, and technological perspectives to create a scalable and ethical solution for GBV prevention and support.

Key words: Gender-Based Violence (GBV); Higher and Tertiary Education Institutions; Virtual psychosocial support; AI-Driven Platforms; Social stigma

BACKGROUND OF THE STUDY

Over the past decade, the integration of advanced technologies has significantly enhanced efforts to combat gender-based violence (GBV) within higher and tertiary education institutions (HTEIs). Initially, interventions primarily utilized traditional internet-based platforms to provide support and resources to survivors. Advancements in artificial intelligence (AI) have since facilitated the development of sophisticated tools for anonymous reporting and virtual psychosocial support, broadening the scope and efficacy of GBV interventions. In developed countries, AI-driven solutions have been implemented to enhance victim accessibility and provide real-time assistance. These platforms integrate natural language processing and machine learning algorithms to tailor responses to individual users, offering personalized support and legal guidance. The rAInbow chatbot, which used natural language processing, was developed through a collaboration between AI for Good, the Sage Foundation, and the Soul City Institute for Social Justice. The initiative used Facebook Messenger to deliver tailored conversations to women facing domestic abuse. Since its launch, rAInbow has engaged in over one million conversations, assisting approximately 20,000 users.
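
To make the platform mechanics described above concrete, the minimal Python sketch below shows how a support chatbot might classify an incoming message and route it to a tailored response. It is an illustration only, not rAInbow's actual implementation; the training phrases, intent labels, and canned responses are invented, and scikit-learn is assumed as the machine-learning library.

```python
# Minimal, illustrative sketch (not rAInbow's actual code) of intent-based
# routing in a support chatbot. Training phrases and responses are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_phrases = [
    "my partner hits me",            # intent: report_abuse
    "he threatened me again",        # intent: report_abuse
    "where can I find a counselor",  # intent: seek_support
    "I need someone to talk to",     # intent: seek_support
    "what are my legal options",     # intent: legal_info
    "can I get a protection order",  # intent: legal_info
]
intents = ["report_abuse", "report_abuse", "seek_support",
           "seek_support", "legal_info", "legal_info"]

responses = {
    "report_abuse": "You are not alone. Would you like to make a confidential report?",
    "seek_support": "I can connect you with a trained counselor. Is that okay?",
    "legal_info": "Here is general information on protection orders and legal aid.",
}

# TF-IDF features plus logistic regression: a very small stand-in for the
# "natural language processing and machine learning" components described above.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(training_phrases, intents)

def reply(message: str) -> str:
    """Return a tailored response for the predicted intent of a user message."""
    intent = classifier.predict([message])[0]
    return responses[intent]

print(reply("I want to report what happened to me"))
```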

In 2020, ActionAid Arab Region (AAAR), a branch of the international non-governmental organization ActionAid operating primarily in Jordan, launched the Darb Alaman chatbot to provide confidential support to individuals facing gender-based violence (GBV). The chatbot operated through widely used platforms such as Facebook Messenger and WhatsApp, making it accessible to a broad audience. By 2022, Darb Alaman had been accessed by 3,402 women, with 250 receiving direct psychosocial support. AAAR focused on achieving social justice, gender equality, and poverty eradication by prioritizing the leadership of women and young people, especially those living in poverty and exclusion. The organization collaborated with grassroots organizations and civil society actors that emphasized youth and women’s participation in local governance.

In 2020, two significant WhatsApp-based chatbots were developed in Southern Africa to support individuals affected by gender-based violence (GBV). In Botswana, Xavier Africa, an award-winning software development agency headquartered in Gaborone, collaborated with the Botswana Gender-Based Violence Prevention and Support Centre (BGBVC) to create the Ame chatbot, a digital tool that allowed survivors to report GBV incidents anonymously and connect with trained counselors, a feature that proved particularly valuable during the COVID-19 pandemic. Concurrently, in South Africa, the Behavioural Insights Team, in partnership with Praekelt.org and Wits University, launched the ChattyCuz chatbot, which employed interactive, gamified content to engage young women. A study involving 19,643 participants found that reported intimate partner violence fell from 62% in the control group to 56% among ChattyCuz users. These initiatives highlighted the potential of AI-driven digital tools, built on machine learning and natural language processing, to provide accessible support and resources to individuals affected by GBV.

Despite these advancements, several challenges persisted in the deployment of AI-driven GBV interventions. Notably, issues such as limited empathetic engagement, where automated responses may have failed to fully address the nuanced emotional needs of survivors, were identified. Additionally, algorithmic biases could inadvertently reinforce existing gender stereotypes, and ethical concerns regarding data security and privacy complicated the safe handling of sensitive information. These challenges underscored the necessity for continuous evaluation and refinement of AI systems to ensure they effectively and ethically served the needs of GBV survivors.​

The BRAVEMIND system, a virtual reality (VR) exposure therapy tool, was developed in 2004 by researchers at the University of Southern California’s Institute for Creative Technologies (USC ICT) in collaboration with the U.S. Army Research Laboratory. Designed to assess and treat post-traumatic stress disorder (PTSD) in military personnel, BRAVEMIND immerses patients in computer-generated environments that replicate traumatic experiences, facilitating controlled exposure to distressing stimuli. Clinical trials demonstrated significant reductions in PTSD symptoms, with one study reporting a decrease in PTSD checklist scores from 54.4 to 35.6 after eleven sessions, and another finding that 45% of active-duty soldiers no longer met the criteria for PTSD after seven sessions. While VR interventions such as BRAVEMIND showed strong potential in extending therapeutic options beyond traditional modalities, challenges remained in fully personalizing the virtual experience and avoiding cultural stereotyping in environment design. Nonetheless, these systems represented a critical innovation in providing immersive and controlled settings where survivors could safely confront and process their trauma. There is, however, limited explicit information indicating that artificial intelligence (AI) is integrated into the system’s core functionalities.

​The integration of Artificial Intelligence (AI) and Virtual Reality (VR) technologies in addressing Gender-Based Violence (GBV) within Higher and Tertiary Education Institutions (HTEIs) offers promising avenues for support and intervention. Incorporating feminist perspectives in designing AI tools ensures that technology serves the nuanced needs of survivors without perpetuating harmful stereotypes. Efforts are underway to develop standards and methodologies aimed at mitigating algorithmic biases and enhancing transparency in AI systems, reflecting a commitment to refining AI applications to ethically and effectively support GBV survivors. However, it is imperative to address associated challenges to ensure these technological solutions are effective, empathetic, and secure for survivors. Continuous evaluation, ethical considerations, and survivor-centered approaches must guide the development and implementation of these interventions to maximize their efficacy and accessibility. For instance, AI-powered chatbots are being developed to provide immediate support to victims of GBV, offering 24/7 assistance and simplified reporting mechanisms, aiming to make the reporting process quicker, safer, and more efficient while connecting survivors to appropriate support services. Moreover, VR therapy models are being tailored for the rehabilitation and support of GBV victims, utilizing human-centered design processes to create immersive environments that facilitate healing and empowerment for survivors. Additionally, educational programs are being developed to recognize and report manifestations of digital gender-based violence, aiming to protect the rights and safety of women in the digital world.

In developing countries in the Southern African Development Community (SADC) region, including Zimbabwe, resource constraints and infrastructural limitations have led to the adaptation of simpler technologies, such as SMS-based reporting systems and mobile applications, to address gender-based violence (GBV). Initiatives in countries like South Africa and Nigeria have utilized mobile platforms to overcome connectivity challenges, extending support to rural and under-resourced areas (Ekeh, Apeh, Odionu, & Austin-Gabriel, 2025). Despite these innovations, many systems fall short in integrating comprehensive psychosocial support and often neglect local sociocultural dynamics. Furthermore, the absence of sociological perspectives, such as Social Exchange Theory and Objectification Theory, in the design of these technologies limits their effectiveness in addressing the complex nature of GBV (Fredrickson & Roberts, 1997).

In Zimbabwean Higher and Tertiary Education Institutions (HTEIs), these challenges are pronounced. While institutions like the University of Zimbabwe have established gender desks and reporting mechanisms, these measures remain largely disconnected from advanced technological solutions. Government initiatives, such as the National GBV Strategy (2021–2025), and efforts by civil society organizations, including the Institute of Women Social Workers (IWSW), underscore the urgency of addressing GBV (Gatti & Vittoria, 2025; Women, 2013). However, there is a notable gap in the adoption of AI-driven interventions that combine real-time reporting with culturally sensitive psychosocial support. This absence reflects both infrastructural constraints and the need for theoretical refinement to capture the complex interplay between technology, culture, and gendered power relations.

Recent scholars have suggested that integrating Social Exchange Theory and Objectification Theory into AI-driven platforms could provide a more holistic response to GBV in HTEIs. Social Exchange Theory offers insights into how trust and reciprocal benefits can be engineered into digital interventions to promote sustained victim engagement, while Objectification Theory emphasizes the necessity of designing systems attuned to the emotional and psychological realities of women who experience GBV (de Silva de Alwis, 2024). Such theoretically grounded approaches would not only facilitate more accurate and empathetic responses but also help mitigate the risks of perpetuating existing gender biases in AI systems.

Despite promising advances, significant research gaps remain. Notably, there is a dearth of empirical studies examining the long-term effectiveness of AI interventions in mitigating GBV within Zimbabwean HTEIs and similar contexts. Furthermore, there is a pressing need to develop culturally tailored training datasets that reflect the diverse sociocultural landscapes of both developed and developing countries. Bridging this gap requires interdisciplinary collaboration among technologists, gender studies scholars, sociologists, and policymakers to create scalable, context-sensitive solutions that prioritize survivor safety and empowerment. This research aims to address these gaps by proposing an integrated AI-driven model that embeds both Social Exchange and Objectification Theories, thereby offering a robust framework for enhancing GBV reporting and virtual psychosocial support in higher education institutions globally.

RELATED WORK

​ Gender-based violence (GBV) is a critical issue affecting higher and tertiary education institutions (HTEIs) in Zimbabwe, with significant implications for student well-being, academic performance, and institutional reputation. The integration of artificial intelligence (AI) in GBV reporting and virtual psychosocial support systems has been proposed as a transformative solution to enhance accessibility, security, and efficiency in addressing these challenges. However, AI-driven interventions must be designed with a strong theoretical foundation to ensure their effectiveness, acceptability, and ethical integrity. Existing research has explored various social science theories, including Feminist Theory, Routine Activity Theory, Strain Theory, Social Exchange Theory, and Objectification Theory, to understand the dynamics of GBV and how AI-driven solutions can be effectively implemented.

Feminist Theory provides a critical lens for examining the structural and systemic inequalities that underpin GBV in educational settings. Connell (2005) emphasizes the role of patriarchal institutions in maintaining gender hierarchies that perpetuate violence and silence survivors. In Zimbabwean universities, gendered power imbalances manifest through coercive relationships, harassment, and institutional failures in providing safe reporting mechanisms (Matope & Muchabaiwa, 2022). AI-driven reporting platforms offer an alternative by allowing survivors to report cases anonymously and access virtual psychosocial support without fear of retaliation. However, scholars argue that while technology can facilitate reporting, it does not address the root causes of GBV, such as deeply embedded cultural norms and gendered power structures (Jaggar, 2015). Digital barriers, including limited internet access and disparities in technological literacy, further hinder the effectiveness of AI-based interventions, particularly among marginalized students in rural areas (Zhou et al., 2023).

Routine Activity Theory (RAT), proposed by Cohen and Felson (1979), explains crime as a function of three converging factors: a motivated offender, a suitable target, and the absence of capable guardianship. In the context of GBV, weak institutional policies, lack of surveillance, and social tolerance for harassment contribute to an enabling environment for perpetrators (Felson, 2017). AI-driven solutions can function as “digital guardians” by enabling real-time reporting, monitoring unsafe areas, and providing immediate guidance to survivors. Research in South African universities has demonstrated that AI-based chatbots and mobile applications have significantly improved GBV reporting rates by reducing survivor hesitation and streamlining access to legal and psychosocial support (Olaitan, 2024). However, critics of RAT argue that while situational prevention is essential, it does not address the socio-cultural determinants of violence, such as gender norms and economic vulnerabilities, which require broader systemic interventions. AI-driven interventions should therefore be integrated with educational programs and policy reforms that challenge GBV-enabling norms and provide holistic support to survivors.

Strain Theory, developed by Merton (1938), suggests that social and economic pressures contribute to deviant behaviors, including GBV. In Zimbabwean universities, financial strain often creates power imbalances between lecturers and students, leading to exploitative relationships where students are coerced into sexual favors in exchange for academic or financial support (Sambo, 2021). AI-driven platforms can mitigate these risks by providing survivors with secure reporting channels and connecting them to institutional and external support services. However, Strain Theory has been criticized for its failure to account for the gendered nature of violence, as it does not fully explain why women and other marginalized groups disproportionately experience GBV (Agnew, Matthews, Bucher, Welcher, & Keyes, 2008). Moreover, economic vulnerabilities also affect access to digital resources, limiting the reach of AI-driven interventions in rural and low-income communities (Salawu, Molale, Uribe Jongbloed, & Ullah, 2022). Addressing these challenges requires a multi-faceted approach that combines AI technology with economic empowerment programs, gender-sensitive policies, and institutional reforms to eliminate exploitative power dynamics.

Social Exchange Theory, introduced by Homans (1958), posits that individuals make decisions based on cost-benefit analyses, weighing the risks and rewards of their actions. Survivors of GBV often hesitate to report due to perceived costs, such as social stigma, retaliation, or lack of institutional support (Blau, 1964). AI-driven reporting platforms can reduce these perceived costs by ensuring anonymity, offering real-time legal and psychosocial guidance, and facilitating alternative dispute resolution mechanisms. Research has shown that AI-driven chatbots increase reporting rates in environments where survivors fear exposure or institutional inaction (Awasekar & Lobo, 2024). However, despite these advantages, studies indicate that survivors still prefer human interaction over AI-based support systems due to the emotional and psychological complexities involved in GBV cases (Terp, Weis, & Lundqvist, 2021). Therefore, AI-driven interventions should not entirely replace traditional support mechanisms but should instead complement them by streamlining access to trained professionals and support networks.
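
Because the argument repeatedly turns on this cost-benefit calculus, it may help to state it as a simple decision rule. The formalization below is a hedged sketch, not drawn from Homans (1958) or from the study itself; the benefit and cost terms are illustrative labels.

```latex
% Hedged formalization (not from the cited sources) of the cost-benefit
% reasoning that Social Exchange Theory attributes to survivors:
% disclosure is chosen when expected benefits outweigh expected costs.
\[
  \sum_{i} p_i B_i \;-\; \sum_{j} q_j C_j \;>\; 0
\]
% B_i: perceived benefits (safety, counseling, legal aid), with probabilities p_i.
% C_j: perceived costs (stigma, retaliation, exposure), with probabilities q_j.
% Anonymity and confidentiality lower the q_j terms, which is how an AI platform
% is intended to shift the survivor's decision toward disclosure.
```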

Objectification Theory, proposed by Fredrickson and Roberts (1997), examines how societal objectification of women leads to self-objectification, reduced self-esteem, and increased vulnerability to GBV. In many Zimbabwean HTEIs, cultural and media representations of women reinforce harmful stereotypes that normalize harassment and violence (Calogero, 2012). AI-driven platforms must be designed with ethical considerations to avoid reinforcing these biases. For instance, algorithms trained on biased datasets may reproduce discriminatory patterns in GBV response systems, disadvantaging certain groups (Wambua, 2024). Ethical AI development in GBV interventions requires continuous monitoring, diverse data representation, and algorithmic transparency to ensure fairness and inclusivity.

Empirical studies have highlighted the potential of AI-driven GBV interventions in various contexts. In the Southern African Development Community (SADC), research in South Africa has shown that AI-based platforms increased GBV reporting rates by 30%, demonstrating their potential to bridge institutional gaps in survivor support (MacEntee, 2015). In Botswana, the WhatsApp-based chatbot Ame has improved access to legal and psychosocial services, particularly for rural women who previously lacked reporting avenues (Xavier et al., 2024). However, in Zimbabwe, while legal frameworks exist, cultural stigmas and digital infrastructure challenges continue to hinder AI adoption in GBV interventions (Mabvurira, 2020). Comparatively, in India, the AI-powered Safe City platform has successfully mapped GBV hotspots using survivor reports, allowing law enforcement agencies to allocate resources more effectively (Chawki, Basu, & Choi, 2024). Similarly, Brazil’s SOS Mulher application has facilitated real-time reporting and legal assistance for survivors, while in the United States, the National Domestic Violence Hotline has integrated AI chatbots to provide risk assessments and crisis support (Ponnusamy, Bora, Daigavane, & Wazalwar, 2024). These case studies underscore the importance of contextual adaptation, ethical AI design, and integration with human support systems to maximize AI’s impact in GBV interventions.

The integration of AI-driven reporting platforms and virtual psychosocial support systems in Zimbabwean HTEIs must be guided by strong theoretical foundations and empirical evidence. While Feminist Theory, Routine Activity Theory, Strain Theory, Social Exchange Theory, and Objectification Theory provide valuable insights into GBV dynamics and AI interventions, their limitations highlight the need for a multi-disciplinary approach that incorporates technological, sociocultural, and ethical considerations. AI-driven solutions should be complemented by policy reforms, community engagement, and capacity-building initiatives to address the root causes of GBV and ensure long-term sustainability. Future research should explore the role of AI in predictive analytics for GBV prevention, algorithmic fairness in AI-driven interventions, and the intersection of AI with legal and policy frameworks to create a more holistic and effective response to GBV in Zimbabwean HTEIs.

METHODOLOGY

This study employed a mixed-methods approach, strategically integrating Social Exchange Theory (SET) and Objectification Theory (OT) to critically examine the potential of AI-driven platforms in enhancing Gender-Based Violence (GBV) reporting and providing virtual psychosocial support within Zimbabwean higher and tertiary education institutions. A stratified snowball sampling method was employed to recruit a purposively diverse cohort of 150 participants, including students, faculty, and staff across various demographic segments (gender, age, academic discipline, and socio-economic status). This sampling approach ensured a broad spectrum of GBV experiences, including physical, sexual, emotional, and psychological abuse, as well as harassment, stalking, and cyberbullying, facilitating a nuanced understanding of the pervasive nature of GBV within institutional contexts.
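
As an illustration of how a stratified design of this kind can be operationalized, the short Python sketch below allocates a sample of 150 proportionally across strata. The stratum population sizes are invented for the example; the study does not report them.

```python
# Illustrative sketch of proportional stratified allocation (not the study's
# actual recruitment code). Stratum population sizes are invented; the total
# sample of 150 matches the cohort described above.
population_by_stratum = {
    ("female", "18-24", "sciences"): 900,
    ("female", "25-34", "humanities"): 450,
    ("male", "18-24", "sciences"): 600,
    ("male", "35+", "social sciences"): 300,
}
total_population = sum(population_by_stratum.values())
target_sample = 150

# Allocate seats to each stratum in proportion to its population share, then
# round; small rounding drift would be corrected manually in practice.
allocation = {
    stratum: round(target_sample * size / total_population)
    for stratum, size in population_by_stratum.items()
}
print(allocation)  # e.g. {('female', '18-24', 'sciences'): 60, ...}
```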

Data were collected using a triangulated approach, comprising focus group discussions, semi-structured in-depth interviews with key informants such as university administrators, counselors, and GBV experts, and participant observations to capture the real-time dynamics of GBV disclosures and institutional responses. The qualitative data were subjected to thematic analysis, wherein transcripts were inductively coded to uncover recurring patterns and emergent themes, which were subsequently interpreted through the dual theoretical lenses of SET and OT. Specifically, SET provided a framework for understanding the cost-benefit analysis survivors undertook when deciding whether to disclose GBV, while OT offered a critical lens to assess the societal implications of objectification on survivors’ sense of self-worth and vulnerability.

Quantitative survey data were analyzed through descriptive and inferential statistical methods to identify patterns and correlations between demographic variables and participants’ perceptions of the AI platform’s usability, accessibility, and effectiveness. The integration of both qualitative and quantitative paradigms, grounded in pragmatism, which emphasized the practical implications of AI-driven interventions, and interpretivism, which foregrounded the lived, subjective experiences of participants, ensured a robust, contextually rich analysis. Ethical rigor was maintained throughout the study, with informed consent procured from all participants, confidentiality assured, and provisions made for psychological support to mitigate any distress caused by the sensitive nature of the topics discussed. Ultimately, this methodological framework not only quantified the potential impact of AI-based GBV interventions but also provided grounded insights into their acceptability, usability, and ethical ramifications, informing future policy, practice, and research on GBV support systems in Zimbabwean higher education institutions.
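
A hedged sketch of the kind of inferential test described above is shown below: a chi-square test of independence between a demographic variable and a survey response, using SciPy. The contingency counts are invented placeholders, not the study's data.

```python
# Hypothetical example of an inferential test of the type described in the
# methodology: chi-square test of independence between socio-economic status
# and perceived platform usefulness. Counts are invented placeholders.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: socio-economic status (low, middle, high income)
# Columns: perceived platform usefulness (agree, neutral, disagree)
observed = np.array([
    [50, 15, 10],   # low-income
    [28, 10,  7],   # middle-income
    [18,  7,  5],   # high-income
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A p-value below the chosen threshold (commonly 0.05) would suggest that
# perceptions of the platform differ across socio-economic strata.
```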

FINDINGS

The study aimed to explore the integration of Social Exchange Theory and Objectification Theory in the development of an AI-powered platform for enhancing Gender-Based Violence (GBV) reporting and providing virtual psychosocial support in Zimbabwean Higher and Tertiary Education Institutions. A mixed-methods approach involving 150 participants (students, lecturers, and administrative staff) was used to collect data through surveys, focus group discussions (FGDs), and in-depth interviews.

Demographic Profile of Participants

The demographic characteristics of the 150 participants were as follows:

Gender Percentage
Female 60%
Male 40%

Table 1: Demographic Profile of Participants by Gender

The gender distribution of participants reveals a majority of females (60%), which aligns with the heightened prevalence of gender-based violence (GBV) among women in many contexts. The higher female representation is particularly relevant for understanding how GBV disproportionately affects women. The male participants (40%) provide essential insight into the potential role of men in both perpetrating and responding to GBV, which is critical in shaping inclusive interventions.

Age Group Percentage
18-24 45%
25-34 30%
35 and above 25%

Table 2: Demographic Profile of Participants by Age Group

The age distribution indicates that a significant proportion of participants (45%) were within the 18-24 age range, which is commonly associated with university and tertiary education students. This finding reflects the vulnerability of younger individuals to GBV in academic environments. The representation of older age groups (25-34 and 35 and above) suggests that GBV issues extend beyond just students and affect a broader segment of the academic community, including staff and faculty, underlining the need for comprehensive institutional responses to GBV.

Academic Discipline Percentage
Humanities 30%
Sciences 40%
Social Sciences 20%
Others 10%

Table 3: Demographic Profile of Participants by Academic Discipline

This table shows the distribution of participants across different academic disciplines. A substantial proportion (40%) of participants came from the sciences, reflecting the diversity of academic sectors involved in the study. The humanities and social sciences also represented a significant portion (30% and 20%, respectively), fields which often engage directly with social issues like GBV. The inclusion of a small percentage (10%) from other disciplines highlights the study’s inclusivity, capturing a range of perspectives from various fields of study, which can influence the understanding and response to GBV.

Socio-economic Status Percentage
Low-income 50%
Middle-income 30%
High-income 20%

Table 4: Demographic Profile of Participants by Socio-economic Status

The socio-economic status distribution indicates that half of the participants (50%) came from low-income backgrounds, which is important in understanding how socio-economic factors influence experiences of GBV and access to support services. The representation of middle-income (30%) and high-income (20%) groups allows for a more nuanced understanding of how individuals from different socio-economic strata might perceive or interact with AI-based GBV interventions. It is likely that individuals from lower-income backgrounds may face additional barriers to accessing resources and reporting GBV, highlighting the need for interventions to be sensitive to these disparities.

Types of GBV Experienced Percentage
Physical Abuse 15%
Sexual Abuse 25%
Emotional/Psychological Abuse 40%
Harassment 30%
Stalking 20%
Cyberbullying 10%

Table 5: Types of GBV Experienced

The table reveals that emotional and psychological abuse was the most prevalent form of GBV reported by participants (40%), followed by harassment (30%) and sexual abuse (25%). This finding emphasizes the complex and varied nature of GBV in academic settings, where non-physical forms of abuse, such as emotional and psychological manipulation, are equally as pervasive and damaging. The relatively lower prevalence of physical abuse (15%) and cyberbullying (10%) suggests that these forms of GBV, while significant, are less frequently experienced compared to emotional or sexual forms. This indicates the need for AI platforms to address both the overt and subtle forms of GBV, ensuring comprehensive support for all survivors.

Statement Frequency Percentage(%)
Strongly Agree 45 30%
Agree 52 34.7%
Neutral 25 16.7%
Disagree 20 13.3%
Strongly Disagree 8 5.3%

Table 6: Perceptions on the Use of Social Exchange Theory in AI-Driven GBV Reporting

The findings indicated that 30% of the respondents strongly believed that embedding Social Exchange Theory into the AI system would increase survivors’ willingness to report GBV. Participants mentioned that emphasizing benefits, such as confidentiality, safety, and access to services, while minimizing costs, such as stigma or retaliation, would motivate disclosure. Qualitative data from FGDs supported this, as participants noted that anonymity and perceived safety were key motivators.

Preference for CBT Support Frequency Percentage (%)
Strongly Preferable 53 35%
Preferable 40 26.7%
Neutral 28 18.7%
Not Preferable 21 14%

Table 7: Preference for CBT-Based Virtual Therapy in AI System

A significant proportion of participants (35%) strongly supported the integration of Cognitive Behavioral Therapy (CBT)-based support features into the AI platform. This aligns with Objectification Theory, which argues that the internalization of objectifying experiences leads to emotional distress. Participants emphasized that CBT-based tools, such as thought restructuring and self-compassion exercises, would be instrumental in helping survivors cope with trauma.

Concern Type Gender Frequency
Fear of Retaliation Female 60
Lack of Gender-Sensitive Reporting Female 60
Awareness and Sensitization Needs Male 30

Table 8: Gender-Based Concerns in GBV Reporting

Female participants (40%) expressed concern over retaliation, societal judgment, and the lack of gender-sensitive reporting systems. On the other hand, 20% of male participants saw the platform as a tool to promote gender sensitivity among males, calling for awareness campaigns and educational components targeting bystander intervention and respect.

HTEIs Stakeholder View Frequency Percentage (%)
Support Integration 60 40%
Require Training & Resources 45 30%
Prefer Pilot Roll-Out First 30 20%

Table 9: Institutional Stakeholder Support

Among institutional stakeholders, such as university administrators, GBV officers, and counselors, 40% believed the AI platform could fill existing reporting gaps. However, a significant number (30%) emphasized the need for infrastructure, staff training, and resource mobilization, while 20% proposed pilot implementation before a full rollout.

Taken together, these findings suggest strong support for integrating Social Exchange Theory and Objectification Theory into the development of AI-based Gender-Based Violence (GBV) interventions within Higher and Tertiary Education Institutions. Participants emphasized several key priorities, including the need to ensure safe, confidential, and benefit-oriented reporting mechanisms that encourage survivors to come forward without fear of stigma or retaliation. There was also significant support for the integration of virtual psychosocial support services, particularly those grounded in Cognitive Behavioral Therapy (CBT), to address the emotional and psychological needs of survivors. Additionally, respondents stressed the importance of designing user-friendly and institutionally compatible platforms that align with existing technological infrastructures and are accessible to all users. Addressing gender-specific concerns and ensuring institutional preparedness through training and resource allocation were also highlighted as critical for successful implementation. These insights collectively provide a solid foundation for developing a survivor-centered, theory-driven AI platform that is culturally sensitive, technologically feasible, and psychologically empowering.

DISCUSSION OF FINDINGS

This study highlights the integration of Social Exchange Theory (SET) and Objectification Theory (OT) in the development of an AI-powered platform for enhancing Gender-Based Violence (GBV) reporting and virtual psychosocial support within Zimbabwean Higher and Tertiary Education Institutions (HTEIs). The findings from this study provide valuable insights into how these theories can inform the design of interventions aimed at addressing GBV, emphasizing the need for a comprehensive approach that incorporates psychological, social, and technological interventions to effectively combat GBV.

Integration of Social Exchange Theory (SET) in GBV Reporting.

One of the key findings of this study is the significant support for embedding Social Exchange Theory (SET) into the AI platform. A majority of participants (64.7%), consisting of 30% strongly agreeing and 34.7% agreeing, believed that the integration of SET would enhance survivors’ willingness to report GBV. This aligns with research on social exchange and technology adoption, which posits that individuals are more likely to engage in behaviors that offer perceived benefits and minimize perceived costs (Homans, 1958; Robinson, 2025). One participant echoed this sentiment, stating, “As a participant, I believe that confidentiality, safety, and the availability of support services are crucial motivators for survivors to disclose incidents of GBV. Without these elements, survivors may feel too vulnerable or at risk to come forward.” This highlights the importance of creating an environment where survivors feel secure in seeking help, a factor that directly influences their decision to report incidents of GBV.

Furthermore, Orr et al. (2022) explored the use of technology in combating GBV within academic settings, finding that anonymity and the mitigation of social costs were critical factors influencing survivors’ willingness to report incidents. Their findings suggest that educational institutions must address institutional barriers, such as fear of retaliation and social stigma, to make reporting more accessible. This aligns directly with the current study’s findings, where participants emphasized that the safety of reporting is a significant concern, especially when considering the societal backlash survivors often face. As one participant shared, “If reporting GBV means facing judgment or retaliation from peers or the institution, many survivors will hesitate to speak up. Creating a platform that guarantees anonymity and protects the survivor’s identity could make all the difference in encouraging reporting.” This underscores the importance of integrating Social Exchange Theory (SET) into AI platforms, which offer clear benefits such as confidentiality and professional support. Such features address these concerns and significantly reduce the perceived costs of disclosing GBV incidents, encouraging more survivors to come forward for support.

Preference for Cognitive Behavioral Therapy (CBT) and Virtual Psychosocial Support.

Another prominent finding from this study was the overwhelming support for integrating Cognitive Behavioral Therapy (CBT)-based tools into the AI platform, with 35% of participants strongly preferring it and 61.7% overall supporting its inclusion. This preference underscores the recognition of the psychological impact of GBV and the importance of addressing the emotional well-being of survivors. Cognitive Behavioral Therapy is a well-established method for helping trauma survivors process their emotions, and its integration into AI platforms is consistent with the work of Beg and Verma (2025), who developed an AI-driven chatbot incorporating CBT principles. Their research demonstrated that survivors of trauma, including GBV, benefited significantly from self-guided CBT tools, which allowed them to process their emotions in a confidential and supportive environment. This was supported by one of the respondents, who said, “… I feel that including Cognitive Behavioral Therapy (CBT)-based tools in the AI platform is crucial for survivors of GBV. It offers them a way to address their emotional challenges in a private, secure space. Many survivors may feel reluctant to seek face-to-face support due to stigma or fear of judgment, so having access to CBT within the AI platform gives them the opportunity to work through their trauma at their own pace, in a way that feels safer and more comfortable.”
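
To illustrate what a self-guided CBT tool of this kind might look like inside a chat interface, the sketch below walks through a minimal thought-restructuring exchange. It is hypothetical: the prompts and example responses are invented, and any real implementation would be designed with clinicians and include crisis-escalation paths.

```python
# Hypothetical sketch of a CBT-style "thought restructuring" exercise of the
# kind discussed above. Prompts are illustrative only, not clinical guidance.
RESTRUCTURING_STEPS = [
    "What happened, in your own words?",
    "What thought went through your mind at that moment?",
    "What evidence supports that thought, and what evidence does not?",
    "If a friend had this thought, what would you say to them?",
    "Try writing a more balanced version of the thought.",
]

def guided_exercise(answers: list[str]) -> dict[str, str]:
    """Pair each restructuring prompt with the survivor's written response."""
    return dict(zip(RESTRUCTURING_STEPS, answers))

# Example session: responses would normally be gathered one step at a time
# inside the chat interface, and stored only with the user's consent.
session = guided_exercise([
    "A classmate keeps making comments about my body.",
    "I thought it must somehow be my fault.",
    "Nothing I did invited it; the comments say more about him than me.",
    "I would tell a friend that harassment is never their fault.",
    "This is not my fault, and I can report it confidentially.",
])
for prompt, answer in session.items():
    print(f"{prompt}\n -> {answer}\n")
```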

Furthermore, Fraser (2024) conducted research indicating that survivors of GBV who received virtual CBT interventions showed significant reductions in symptoms of depression, anxiety, and post-traumatic stress disorder (PTSD). These findings provide further validation for the study’s results, confirming that CBT-based virtual support is a valuable means of helping survivors rebuild emotional resilience. However, Oo (2024) cautions that CBT may not be universally accepted by all survivors, particularly those from communities with cultural or structural barriers to mental health support. This indicates the need for flexible, culturally sensitive interventions to ensure wider acceptance and effectiveness of AI-based mental health tools. One participant echoed this concern during the interview, stating, “As someone who has seen how different communities in Zimbabwe approach mental health, I believe that while CBT can be beneficial, it may not be universally accepted, especially for those from rural or traditional backgrounds. For instance, some people in my community may prefer traditional forms of support, like talking to elders or community leaders, rather than a structured therapy like CBT. If the platform could incorporate these culturally familiar options alongside CBT, it would feel more accessible and supportive for everyone, ensuring that more survivors feel comfortable seeking help.”

Gender-Specific Concerns and Institutional Readiness.

Gender-specific concerns were also a significant factor in the findings, particularly the concerns raised by female participants (40%) regarding retaliation and the lack of gender-sensitive reporting mechanisms. These concerns echo findings from Charles (2024), who emphasized that women in academic institutions often hesitate to report GBV due to fear of retaliation, judgment, and inadequate support mechanisms. That research advocates for gender-sensitive platforms that not only ensure anonymity but also provide tailored support that addresses the unique experiences and challenges faced by women in academic settings. One of the participants said, “As a female participant, I can relate to the concerns raised about retaliation and the lack of gender-sensitive reporting mechanisms. In my experience, many women hesitate to speak up about GBV because they fear being judged or even facing repercussions, especially in an academic environment. It’s crucial that any reporting platform takes these fears into account and offers a space where women can report incidents anonymously, without the fear of retaliation or social stigma. Gender-sensitive support is key to ensuring that survivors receive the help they need, addressing their unique emotional and psychological challenges.”

Stakeholders from the current study emphasized the importance of adequate resources, training, and infrastructure to support the platform’s integration into academic institutions. One of the stakeholders said, “As a stakeholder, I believe that for the platform to be truly effective within academic institutions, we must prioritize adequate resources, comprehensive training, and robust infrastructure. This includes not only having the technical tools to implement the AI system but also ensuring that staff and administrators are well-trained to handle sensitive cases and provide the necessary support to survivors. Without these foundational elements, the platform’s integration could fall short of its potential impact, leaving both the technology and the survivors who rely on it underserved.”

These concerns align with the research of Dumitru and Dragomir (2025), who found that the successful integration of AI-powered interventions in academic settings requires institutional buy-in and support from administrators and staff. They recommend pilot programs and gradual rollouts of AI systems, allowing institutions to gather feedback, refine the platform, and ensure it meets the needs of survivors and the academic community. Similarly, Ekeh et al. (2025) stress that institutional readiness and staff training are critical for the success of digital interventions in GBV prevention and support, as well-trained staff are more likely to implement AI-driven systems successfully. In addition to gender-specific concerns, institutional readiness therefore emerged as a key factor influencing the successful implementation of the AI platform.

While the findings indicate strong support for integrating SET and OT into AI platforms, some studies highlight challenges in implementing such systems. Henry, Witt, and Vasil (2024) raised concerns about the limitations of AI platforms in fully addressing the emotional complexity of GBV cases. They argue that while AI-driven interventions can be valuable, they may lack the nuanced understanding of human emotions and the complex, diverse needs of survivors. Their research suggests that AI systems should complement, rather than replace, human interaction, such as counseling and peer support groups, to provide a more holistic approach.

Additionally, Kraft-Buchman and Peralta (2024) found that while AI-powered reporting tools can increase the number of GBV disclosures, concerns about data security and privacy remain a significant barrier. In regions with weak digital infrastructures, the protection of sensitive information becomes even more critical. These concerns were also reflected in the current study, where participants emphasized the importance of ensuring data protection and security when using AI platforms for sensitive reporting. To address these issues, Sharma, Sharma, and Gupta (2024) recommend implementing robust encryption and secure data management practices to protect users’ confidentiality and prevent the misuse of sensitive information.
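
As an illustration of the encryption practice recommended here, the sketch below encrypts a report before it is stored or transmitted, using the third-party Python `cryptography` package (an assumption; the cited authors do not prescribe a specific library). Key management is deliberately simplified.

```python
# Hedged sketch of client-side encryption for a sensitive report.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, kept in a secure key vault
cipher = Fernet(key)

report = "Confidential GBV disclosure submitted via the platform."
token = cipher.encrypt(report.encode("utf-8"))   # ciphertext safe to store

# Only a service holding the key can recover the plaintext.
assert cipher.decrypt(token).decode("utf-8") == report
print("Encrypted record:", token[:32], "...")
```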

Several successful AI-driven models developed between 2023 and 2025 exemplify the practical application of SET and OT in addressing GBV. One such example is the Zizu AI chatbot developed by the University of Cape Town in 2024. This platform has successfully integrated SET by providing survivors with secure, confidential reporting channels and access to real-time support, thereby reducing the perceived costs of reporting. Similarly, KNUST’s 2024 AI platform for IPV and GBV victims also utilizes SET by offering tailored support and maintaining confidentiality, while addressing OT concerns by focusing on the emotional and psychological well-being of survivors. Additionally, the SUKHSANDESH platform developed in rural India provides an example of how AI can deliver culturally sensitive sexual education and mental health support, in line with both SET and OT. These models demonstrate the transformative potential of AI in providing survivor-centered, psychologically empowering interventions that reduce stigma, offer support, and address the unique needs of GBV survivors.

CONCLUSION

In conclusion, the findings of this study underscore the value of integrating Social Exchange Theory (SET) and Objectification Theory (OT) in the development of AI-powered platforms for Gender-Based Violence (GBV) reporting and psychosocial support. The results not only align with current global trends but also contribute to the growing body of knowledge on the role of AI in addressing GBV. Successful models developed between 2023 and 2025 provide strong evidence of the transformative potential of AI in supporting survivors by reducing the psychological and social costs of reporting while offering tailored support that meets survivors’ emotional and psychological needs. This study emphasized the need for survivor-centered, culturally sensitive, and institutionally compatible interventions, highlighting the importance of addressing both the logistical and emotional barriers to reporting. As AI systems continue to evolve, it is crucial that ongoing research, stakeholder feedback, and cultural considerations inform their design and implementation to ensure they remain effective, inclusive, and ethical. The integration of SET and OT into AI systems represents a powerful framework for addressing the multifaceted challenges of GBV, particularly within educational institutions, and further reinforces the potential of AI-driven solutions to overcome barriers to reporting and provide meaningful psychological support. Ultimately, this study contributes to a more comprehensive and effective response to GBV in Zimbabwean higher education institutions, paving the way for future advancements in the integration of AI technologies for social good.

RECOMMENDATIONS

The researchers recommended that:

The AI GBV system should highlight the tangible and psychological benefits of reporting gender-based violence (GBV), such as access to confidential support services and legal assistance. At the same time, it must work to reduce perceived costs such as fear of stigma or retaliation. Applying Social Exchange Theory in this manner can enhance user engagement and promote disclosure.

The integration of Cognitive Behavioral Therapy (CBT) into the AI GBV platform is essential for offering trauma-informed virtual support, particularly in resource-limited settings where traditional mental health services are inaccessible. Embedding CBT in the system promises to help survivors process trauma, improve mental well-being, and regain a sense of control.

The researchers recommend that the AI data-driven GBV system be adopted through integration with existing institutional structures within higher and tertiary education institutions. Institutions must demonstrate commitment by training staff, aligning the platform with internal reporting and support policies, and cultivating a culture that embraces technology-enabled mechanisms for GBV prevention and response.

The platform must be inclusive: mobile-friendly, language-sensitive, and intuitive, while upholding strong data protection standards. Prioritizing these features ensures safe and equitable access, especially for vulnerable populations.

A dynamic evaluation framework should be implemented to track platform usage, effectiveness, and user experience. Incorporating participatory feedback and data analytics will support continuous refinement and ensure that the platform remains responsive to the evolving needs of GBV survivors.
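
A lightweight sketch of what such an evaluation framework might track is shown below; the event types and summary indicators are illustrative assumptions, not prescriptions from the study.

```python
# Illustrative sketch (not a prescribed implementation) of the dynamic
# evaluation framework recommended above: aggregating anonymised usage events
# into simple indicators reviewed alongside participatory feedback.
from collections import Counter
from dataclasses import dataclass

@dataclass
class UsageEvent:
    kind: str        # e.g. "report_submitted", "counselor_referral", "cbt_session"
    resolved: bool   # whether the event reached a support outcome

def summarise(events: list[UsageEvent]) -> dict[str, float]:
    """Compute headline indicators from anonymised platform events."""
    counts = Counter(e.kind for e in events)
    resolved = sum(e.resolved for e in events)
    return {
        "total_events": len(events),
        "reports_submitted": counts["report_submitted"],
        "cbt_sessions": counts["cbt_session"],
        "resolution_rate": resolved / len(events) if events else 0.0,
    }

# Hypothetical month of activity
events = [
    UsageEvent("report_submitted", True),
    UsageEvent("cbt_session", True),
    UsageEvent("report_submitted", False),
    UsageEvent("counselor_referral", True),
]
print(summarise(events))
```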

REFERENCES

  1. Agnew, R., Matthews, S. K., Bucher, J., Welcher, A. N., & Keyes, C. (2008). Socioeconomic status, economic problems, and delinquency. Youth & Society, 40(2), 159-181.
  2. Awasekar, D. D., & Lobo, L. M. R. J. (2024). Empowering women through AI: A comprehensive chatbot for domestic violence awareness and legal support in India. Paper presented at the International Conference on Interactive Collaborative Learning.
  3. Beg, M. J. K., & Verma, M. K. (2025). Revolutionizing Mental Healthcare Through Artificial Intelligence. In Chatbots and Mental Healthcare in Psychology and Psychiatry (pp. 21-54): IGI Global Scientific Publishing.
  4. Blau, P. M. (1964). Justice in social exchange. Sociological inquiry, 34(2).
  5. Calogero, R. M. (2012). Objectification theory, self-objectification, and body image.
  6. Charles, D. (2024). Students Speak: A Qualitative Examination of the Intersection of Identity and Sexual Violence in a University Setting. Rowan University,
  7. Chawki, M., Basu, S., & Choi, K.-S. (2024). Redefining boundaries in the metaverse: Navigating the challenges of virtual harm and user safety. Laws, 13(3), 33.
  8. Cohen, L. E., & Felson, M. (1979). Social change and crime rate trends: A routine activity approach. American sociological review, 588-608.
  9. Connell, R. W. (2005). Change among the gatekeepers: Men, masculinities, and gender equality in the global arena. Signs: journal of women in culture and society, 30(3), 1801-1825.
  10. de Silva de Alwis, R. (2024). From Critical Mass to Critical Parity in Women’s Leadership. University of Pennsylvania Journal of Law and Public Affairs, 10(1), 3.
  11. Dumitru, M., & Dragomir, V. D. (2025). Assessment-Focused Pedagogical Methods for Improving Student Learning Process and Academic Outcomes in Accounting Disciplines. Education Sciences, 15(3).
  12. Ekeh, A. H., Apeh, C. E., Odionu, C. S., & Austin-Gabriel, B. (2025). Data analytics and machine learning for gender-based violence prevention: A framework for policy design and intervention strategies. Gulf Journal of Advance Business Research, 3(2), 323-347.
  13. Felson, M. (2017). Linking criminal choices, routine activities, informal control, and criminal outcomes. In The reasoning criminal (pp. 119-128): Routledge.
  14. Fraser, K. E. (2024). Determinants of Sexual and Gender-Based Violence Against Refugee Women and Girls. In Global Happiness and Humanitarian Assistance: Systemic Solutions (pp. 97-113): Springer.
  15. Fredrickson, B. L., & Roberts, T. A. (1997). Objectification theory: Toward understanding women’s lived experiences and mental health risks. Psychology of women quarterly, 21(2), 173-206.
  16. Gatti, R., & Vittoria, A. (2025). What measures for what politics? Gender-Based Violence Protection and the RDL Policy Gap for Women in Italy (2021-2024). Partecipazione e conflitto, 18(1), 176-197.
  17. Henry, N., Witt, A., & Vasil, S. (2024). A ‘design justice’ approach to developing digital tools for addressing gender-based violence: exploring the possibilities and limits of feminist chatbots. Information, Communication & Society, 1-24.
  18. Homans, G. C. (1958). Social behavior as exchange. American journal of sociology, 63(6), 597-606.
  19. Jaggar, A. M. (2015). Just methods: An interdisciplinary feminist reader: Routledge.
  20. Kraft-Buchman, C., & Peralta, J. S. (2024). Advancing research on feminist artificial intelligence to advance gender equality and inclusion: final technical report.
  21. Mabvurira, V. (2020). Making sense of African thought in social work practice in Zimbabwe: Towards professional decolonisation. International Social Work, 63(4), 419-430.
  22. MacEntee, K. (2015). Using cellphones in participatory visual research to address gender-based violence in and around rural South African schools: Reflections on research as intervention. Agenda, 29(3), 22-31.
  23. Matope, N., & Muchabaiwa, W. (2022). Redefining the gender narrative: sexual harassment and intimate partner violence in selected institutions of higher learning in Zimbabwe’s tertiary institutions. The Dyke, 16(2), 1-16.
  24. Merton, R. K. (1938). Science and the social order. Philosophy of science, 5(3), 321-337.
  25. Olaitan, Z. M. (2024). Using digital technology to address gender-based violence in South Africa. In African women in the fourth industrial revolution (pp. 226-240): Routledge.
  26. Oo, P. P. (2024). Gender-based violence. In Oxford Research Encyclopedia of Global Public Health.
  27. Orr, N., Chollet, A., Rizzo, A. J., Shaw, N., Farmer, C., Young, H., . . . Melendez‐Torres, G. (2022). School‐based interventions for preventing dating and relationship violence and gender‐based violence: A systematic review and synthesis of theories of change. Review of Education, 10(3), e3382.
  28. Ponnusamy, S., Bora, V., Daigavane, P. M., & Wazalwar, S. S. (2024). AI Tools and Applications for Women’s Safety: IGI Global.
  29. Robinson, T. Y. (2025). Social Exchange and Information Technology Workforce Preparation of Previously Incarcerated Individuals: A Mixed Methods Approach. North Carolina Agricultural and Technical State University,
  30. Salawu, A., Molale, T. B., Uribe Jongbloed, E., & Ullah, M. S. (2022). Indigenous language for development communication in the Global South: Lexington Books.
  31. Sambo, B. N. (2021). The relationship between domestic violence and development in Zimbabwe. University of Pretoria (South Africa),
  32. Sharma, S., Sharma, A., & Gupta, P. K. (2024). Developing Cloud Based Secure Data Monitoring System.
  33. Terp, K., Weis, J., & Lundqvist, P. (2021). Parents’ views of family-centered care at a pediatric intensive care unit—a qualitative study. Frontiers in Pediatrics, 9, 725040.
  34. Wambua, R. N. (2024). A systematic review of digital innovation in Higher Education Institutions in developing countries. African Journal of Education, Science and Technology (AJEST), 7(4), Pg 154-164
  35. Women, U. (2013). Ending violence against women and girls. How we make a difference: Advocacy. URL: http://www2.unwomen.org/~/media/headquarters/attachments/sections/library/publications/20, 13, 12.
  36. Xavier, S. P., Mahoche, M., Rondó, P. H., da Silva, A. M., Flores-Ortiz, R., & Victor, A. (2024). Addressing inequalities in vaccination coverage among children aged 12 to 23 months in ten Sub-Saharan African countries: Insights from DHS and MIS Data (2017-2022). MedRxiv, 2024.2012. 2013.24318976.
  37. Zhou, F., Soremekun, O., Chikowore, T., Fatumo, S., Barroso, I., Morris, A. P., & Asimit, J. L. (2023). Leveraging information between multiple population groups and traits improves fine-mapping resolution. Nature Communications, 14(1), 7279.
