Page 1763
www.rsisinternational.org
INTERNATIONAL JOURNAL OF RESEARCH AND INNOVATION IN SOCIAL SCIENCE (IJRISS)
ISSN No. 2454-6186 | DOI: 10.47772/IJRISS | Volume IX Issue X October 2025
The Effect of Generative AI Usage on Academic Engagement: A
Mixed-Methods Case Study at a College of Education in Ghana
Aduo Frank¹, Frank Blessed Amenyeke², Frederick Akosah Sekyere³, Emmanuel Adjei⁴
¹ Department of Integrated Science Education, University of Education, Winneba
² Department of Integrated Science Education, University of Education, Winneba
³ Sampson Mensah Akrosumah, Science Department, St. Ambrose College of Education, Dormaa Akwamu
⁴ Science Department, St. Teresa's College of Education, Hohoe; Department of Integrated Science Education, University of Education, Winneba
DOI: https://dx.doi.org/10.47772/IJRISS.2025.910000149
Received: 02 October 2025; Accepted: 10 October 2025; Published: 06 November 2025
ABSTRACT
The study was conducted to investigate the impact of generative AI usage on the academic engagement of students in a selected college of education in Ghana, exploring the relationship between the use of generative AI in learning and students' academic engagement. A sequential-explanatory mixed-methods research design was applied to provide an in-depth investigation and thorough understanding of the generative AI tools frequently used by students and how they affect academic engagement. This approach was selected because it offers a more comprehensive understanding of the relationship between generative AI use and students' academic engagement. Ninety-eight (98) respondents served as participants for the quantitative phase of the study, and fifteen (15) interviewees for the qualitative phase. The respondents were selected using convenience sampling, a non-probability sampling technique. The instrument for the quantitative data was a survey questionnaire designed to examine the types of generative AI most used by students and to assess the effect of generative AI on students' academic engagement. An interview guide, on the other hand, was used to gather rich qualitative data that could not be explored using the quantitative data alone.
Findings showed that the use of generative AI positively affected students' academic engagement and improved their learning environment; generative AI has been a great aid in enhancing students' learning and engagement. Using the Pearson correlation, the researchers found a significant, high positive correlation between the use of generative AI and students' academic engagement (r(98) = 0.785, p < 0.00001).
INTRODUCTION
Background
The use of Artificial Intelligence (AI) in education has attracted considerable attention in recent years, with
researchers highlighting its transformative effects on teaching and learning (Okoye & Mante, 2024). Generative Artificial Intelligence (GenAI), a branch of artificial intelligence that focuses on machine-generated content, has significant potential for delivering personalized and context-aware learning experiences, particularly in out-of-classroom settings (Norman & Fraenkel, 2000).
GenAI tools refer to students’ usage of generative AI technologies to support personalized learning, receive
instant feedback, and access adaptive content (Bulawan, 2023). Generative AI helps students simplify academic
tasks, fostering collaboration, and promoting critical thinking. It enables deeper engagement with content by automating routine tasks and offering adaptive learning, contributing to enhanced academic achievement, particularly in pursuit of SDG 4 (Baidoo-Anu & Owusu Ansah, 2023; Bulawan, 2023).
Generative AI applies transformer-based deep learning architectures to produce novel content such as text,
images, and audio, exemplified by tools like ChatGPT, Microsoft Copilot, and Canva (Vaswani et al., 2017; Lim
et al., 2023).
Globally, generative AI tools have been adopted in higher education, reshaping learning behaviors. Students
perceive GenAI as an accessible tutor, available anytime, while institutions continue to grapple with challenges
such as academic dishonesty (Liu, 2024; Oravec, 2023). In some contexts, authorities have even banned
ChatGPT, citing fears of plagiarism and reduced critical thinking (Estrellado, 2023; Johnson, 2023). Meanwhile,
other studies have revealed the benefits of responsible use, such as improved writing, learning engagement, and
personalized learning opportunities (Bulawan, 2023; Lim, 2023; Sok & Heng, 2023).
Statement of the Problem
The growing integration of artificial intelligence (AI) and machine learning in education offers opportunities to
personalize and enhance learning experiences; however, their full potential and limitations remain insufficiently
understood. Although generative AI promotes efficiency, creativity, and active learning, it also presents
challenges such as dependency, plagiarism, and diminished student initiative (Hagendorff, 2020; Halaweh, 2023;
Rodrigues, 2024). In Ghanaian Colleges of Education, students increasingly utilize tools like ChatGPT, Quillbot,
and Grammarly to support academic tasks (Bulawan, 2023), yet little empirical research has examined how such
use influences student engagement.
To fill this gap, the present study investigates the impact of generative AI usage on students’ academic
engagement and explores its relationship with academic achievement. Insights from this research will guide
educators, policymakers, and curriculum developers in promoting the responsible integration of AI technologies
while preserving meaningful student engagement.
To what extent does the use of generative AI influence the academic engagement of students, and how do
students perceive and describe their experiences of engagement when using generative AI in their learning?
What is the relationship between the use of generative AI and students’ academic engagement, and how do
students explain their experiences and perspectives of this relationship in their learning?
LITERATURE REVIEW
Technology continues to shape the educational landscape. Liang et al. (2023) posited that although the emergence and increasing utilization of GenAI are recent developments, scholars have long been intrigued by the probable applications of AI in education. Earlier research has demonstrated how leveraging AI can enhance assessment feedback and streamline administrative tasks (Crompton & Burke, 2023; Popenici & Kerr, 2017; Brown et al., 1978; Garito, 1991). Both Liang et al. (2023) and Yang et al. (2024) emphasize
how generative artificial intelligence (GenAI) improves student learning engagement and accomplishment.
Hidayat-ur-Rehman (2024) investigates the association between students’ engagement in a GenAI atmosphere
through smartphone usage, formal digital learning activity, and digital skills. Taken together, these findings
further demonstrate GenAI’s effectiveness in enhancing student learning and engagement and extend the scope
of GenAI in many learning contexts.
GenAI is a special and powerful case of artificial intelligence. Hashmi and Bal (2024) describe generative artificial intelligence (GenAI) as a machine learning tool that generates new text, video, and image content; its prevalence has risen sharply in recent years. Like a digital tutor that knows a learner's preferences and adapts accordingly, GenAI tailors content to individual learners, provides swift feedback, and generates thought-provoking prompts for formative assessments and interactive human-to-machine learning scenarios (Liang et al., 2023). GenAI could be more interactive compared with the traditional types of
learning resources, such as textbooks and exercises, and could promote students' engagement, which may be
attributable to the reward mechanism.
Motivation to learn with AI
The potential of AI to transform education has been widely acknowledged. As highlighted in a recent systematic
review of AI in education by Chiu (2023a), AI technologies are demonstrating their capacity to reshape teaching
and learning. AI technologies such as chatbots, intelligent tutoring systems, and automatic grading systems are
increasingly used in schools (Chiu et al., 2023b; Yim & Su, 2024). For example, chatbots could facilitate
personalized learning based on students’ competence, intelligent tutoring systems could offer timely feedback
on students’ inquiries, and automatic grading systems could provide detailed and effective grading (Vázquez-
Cano et al., 2021). These advancements represent a significant stride toward a technologically advanced and
efficient educational environment. However, the question of the extent to which university students are
motivated to interact with and apply AI technologies in their learning process remains underexplored. Students
may demonstrate different types of motivation to interact and learn with AI technologies (Scherer & Siddiq,
2019). Some may be largely inspired by inherent interest in and the enjoyment of learning with AI, whereas
others might be motivated by external factors such as the pressure to avoid lagging behind others or the benefits
of learning with AI. In general, those with higher autonomous motivation to use AI in learning are likely to
devote more time to using, engage more with, and benefit more from AI technologies (Al Shamsi et al., 2022;
Esiyok et al., 2025; Kelly et al., 2023; Lai et al., 2023).
Impact of AI on Teaching and Learning Processes
The use of GenAI is increasingly growing, and it is influencing the teaching and learning processes in higher
education. Immediate support is provided to students through the use of chatbots and AI-powered virtual
assistants. Any queries related to assignments or other course contents can be addressed with 24/7 accessibility
(Ocaña-Fernández, et al., 2019, p. 561). There are various learning analytics tools that help in tracking the
progress of students and predicting their performances in the future. Such tools can also prove to be helpful for
refining teaching strategies and optimizing curriculum designs. Thus, the information discussed in this section emphasizes the benefits, challenges, potential applications, and ethical considerations of using AI in education. If used effectively, AI could bring immense improvement to teaching and learning methods, although this requires proper management so that the related risks are avoided. Overall, artificial intelligence can make the learning environment better and the learning process more student-centered and inclusive.
Theoretical framework
A widely recognized framework used to explain learners’ adoption of new technologies or software applications
is the Technology Acceptance Model (TAM) (Davis et al., 1989; Venkatesh & Davis, 1996). TAM was designed
to predict and clarify the factors influencing users’ acceptance and utilization of emerging technological systems.
Over time, it has been extensively applied, refined, and expanded (Chuttur, 2009; Yousafzai et al., 2007a). The
model emphasizes four primary constructs: perceived ease of use, perceived usefulness, intention to use, and
actual system use.
Perceived ease of use reflects the degree to which individuals find a technology straightforward and effortless
to operate; technologies that are simpler to handle are typically adopted more readily, as they minimize the effort
required for learning and interaction (Venkatesh & Davis, 2000). Perceived usefulness, on the other hand, relates
to the belief that using a technology enhances one’s effectiveness or performance (Opoku & Enu-Kwesi, 2019).
Intention to use concerns an individual’s motivation or willingness to employ the technology, while actual
system use refers to the observable frequency and extent of its application.
According to the extended version of TAM (Venkatesh & Davis, 1996), both perceived usefulness and perceived
ease of use directly influence an individual’s intention to use, which subsequently predicts actual use behavior.
Various external factors, such as users' experience with technology, educational background, digital self-efficacy, and age, can shape perceptions of ease of use and usefulness. However, studies have not reached a
consensus on which factors exert the greatest influence, as findings vary across contexts (Chuttur, 2009;
Yousafzai et al., 2007a). Davis et al. (1989) further noted that the effect of perceived ease of use on behavioral
intention tends to decrease as users become more familiar with a system, a trend supported by subsequent research indicating that this influence is strongest during early adoption and lessens over time (Adams et al.,
1992; Chau, 1996; Gefen & Straub, 2000; Igbaria et al., 1996).
In the present study, perceived usefulness refers to students’ beliefs that generative AI tools such as ChatGPT,
Gemini, and Copilot enhance their learning effectiveness, efficiency, and academic outcomes. Perceived ease of
use describes the extent to which students view these tools as user-friendly and requiring minimal effort to
operate. When students hold positive perceptions of both usefulness and ease of use, they are more likely to
adopt and consistently engage with generative AI tools in their academic activities.
Self-Determination Theory
This study employs Self-Determination Theory (SDT) as the foundational framework to understand student
motivation in relation to generative AI usage and academic engagement. SDT is well-supported by extensive
theoretical and empirical research (Bureau et al., 2022; Howard et al., 2021) and is widely recognized as a
powerful lens for examining motivation across diverse settings, including education (Bureau et al., 2022;
Howard et al., 2021), healthcare (Ntoumanis et al., 2021), and business environments (Van den Broeck et al.,
2021). SDT distinguishes motivation based on the extent to which it is internalized by the individual. At the
highest level of internalization is intrinsic motivation, which drives engagement purely for the inherent interest,
enjoyment, or personal satisfaction derived from the activity itself. Meanwhile, extrinsic motivation refers to
actions triggered by external factors and is further classified into types based on internalization levels: integrated
regulation (fully embracing the value of the behavior), identified regulation (recognition of the behavior’s
importance), introjected regulation (participation driven by self-esteem maintenance and avoidance of guilt), and
external regulation (behavior controlled by external rewards or pressure). Amotivation, in contrast, describes a
lack of motivation or intention to engage in an activity (Ryan & Deci, 2000, 2020).
Autonomous motivation, comprising intrinsic motivation, integrated regulation, and identified regulation, is
generally associated with more positive and adaptive outcomes compared to controlled motivation, which
includes introjected and external regulation (Ryan & Deci, 2000, 2020; Vansteenkiste et al., 2009). Applying
this theory in the context of generative AI use, it suggests that students who internalize the value of AI as a
learning tool and engage with it for autonomous reasons are likely to experience higher academic engagement
and better educational outcomes. Controlled motivation or amotivation toward AI use, on the other hand, may
limit these benefits.
METHODOLOGY
Integration
The integration of quantitative and qualitative data greatly enhances mixed methods research, offering several
advantages (Bryman, 2006; Creswell & Plano Clark, 2011). For example, qualitative data can validate
quantitative results, while quantitative data can help inform qualitative sample selection or clarify qualitative
findings. Qualitative inquiry can also guide the development or refinement of quantitative tools or generate
hypotheses for testing quantitatively (O’Cathain, Murphy, & Nicholl, 2010). Despite these potential benefits,
many mixed methods studies underutilize integration (Bryman, 2006; Lewin, Glenton, & Oxman, 2009).
Nevertheless, established approaches exist to integrate qualitative and quantitative methods and data at various
stages, including design, data collection, analysis, and reporting (O’Cathain, Murphy, & Nicholl, 2010;
Creswell & Plano Clark, 2011).
Integration at the study design Level
The study adopted a mixed-method research approach, which involves gathering and analyzing both quantitative
and qualitative data to provide a comprehensive understanding of the relationship between generative AI use
and students’ academic engagement. A sequential-explanatory design was specifically employed, beginning with
the collection and analysis of quantitative data, followed by qualitative data to build on and explain the initial
results. This design was chosen because it allows for richer data collection and deeper insights by encouraging
active respondent participation and addressing research gaps more effectively (Creswell, 2018).
Integration at the interpretation and reporting Level
The sequential-explanatory design allowed for data integration where qualitative findings enriched the
interpretation of quantitative results. Quantitative analysis yielded a significant positive correlation (r = 0.785, p
< 0.00001) between generative AI use and academic engagement. Qualitative themes such as active learning,
motivation, and improved problem-solving helped explain the mechanisms underlying this correlation. The
integrated reporting presents a comprehensive narrative combining numerical trends with detailed student
perspectives, thereby enhancing the explanatory power and practical relevance of the findings for educators and
policymakers.
Population
In research, the term “population” refers to the entire aggregation of individuals, objects, or elements that share
common characteristics relevant to a particular study and from which the researcher intends to draw conclusions
(Norman, 2000). The population therefore represents the broad group about which the researcher seeks to gain
insight or make generalizations.
For the purpose of this study, the population consisted of all students enrolled in a selected College of Education
located in the Central Region of Ghana. This group was chosen because the college has integrated various digital
learning approaches and has witnessed increasing exposure of students to generative artificial intelligence (AI)
tools. The total student population of the institution at the time of the study was approximately 320, covering
different academic levels and specializations. This population was deemed appropriate for the investigation
because teacher trainees at the College of Education are expected to develop strong digital literacy and innovative
learning habits, both of which align closely with the objectives of exploring generative AI usage and academic
engagement. By focusing on this specific population, the study sought to understand how future teachers in
training are adopting and interacting with generative AI tools and how these interactions may shape their
engagement, learning outcomes, and preparedness for teaching in technology-enhanced educational
environments.
Sample and sampling techniques
The study involved a sample of ninety-eight (98) Level 300 students from the College of Education. Participants
were selected through a convenience non-probability sampling technique, which was considered appropriate due
to the accessibility and availability of the students during the data collection period. This approach allowed the
researcher to gather responses from students who were readily available and willing to participate in the study.
The selection of Level 300 students was intentional because they were in their final semester, preparing for the
mandatory internship program. This group was considered suitable for the study since they had extensive
academic exposure and were actively using generative AI tools such as ChatGPT, Gemini, and Copilot to support
their coursework and professional preparation. The researcher aimed to explore the extent to which these students
utilize generative AI technologies and how such use influences their academic engagement and learning
experiences.
Research instrument
The study adopted a mixed-methods approach, employing a survey questionnaire, individual interviews, and
focus group discussions to examine the relationship between the use of generative artificial intelligence (AI) and
the level of academic engagement among students in a College of Education. The purpose was to gain both
quantitative and qualitative insights into how generative AI influences students’ learning behaviors,
participation, and overall academic involvement. The survey instrument consisted of structured items designed
to measure students’ perceptions and experiences with generative AI tools. Responses were captured using a
five-point Likert scale, ranging from “Strongly Disagree” to “Strongly Agree.” This scale allowed participants
to express varying degrees of agreement with statements related to their use of AI tools and their corresponding
engagement levels. Complementing the survey, face-to-face interviews and focus group discussions provided a
deeper understanding of students’ perspectives, attitudes, and lived experiences regarding generative AI. These
qualitative methods enriched the quantitative findings by revealing nuanced views on how AI tools affect
motivation, participation, and learning outcomes within the academic environment.
Validity and Reliability of the Study
The validity of this study was ensured through a rigorous pilot testing process prior to the main data collection.
Specifically, the questionnaire and interview schedule were piloted with a smaller sample of students to assess
their clarity, relevance, and comprehensiveness. This pilot phase facilitated the identification and correction of
ambiguous or misleading items, thereby enhancing content validity, the degree to which the instruments
accurately capture the constructs related to generative AI usage and academic engagement. Additionally, expert
reviewers with extensive experience in digital technology integration reviewed the instruments to ensure they
comprehensively covered all relevant aspects of the research.
Regarding reliability, the study employed a test-retest reliability method to assess the consistency and stability
of the responses over time. The same questionnaire was administered to the same group of students on two
separate occasions spaced appropriately to minimize recall bias while assuming no significant change in their
perceptions. The responses from both administrations were then correlated to quantify the degree of agreement,
reflecting temporal stability. The Pearson correlation coefficient obtained was 0.785, indicating acceptable to
good reliability. This value suggests that the instrument yields stable and consistent results over time, with
minimal measurement error.
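As a hedged illustration of the test-retest computation described above, the sketch below implements the Pearson correlation coefficient in plain Python. The score lists are hypothetical stand-ins for the two questionnaire administrations; the study's actual data are not reproduced here.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical total scale scores for five students on the two administrations
first_admin = [32, 41, 28, 36, 45]
second_admin = [34, 40, 30, 35, 44]
r = pearson_r(first_admin, second_admin)  # close to +1 when responses are stable
```

In the study itself this coefficient was 0.785, interpreted as acceptable to good temporal stability.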
Ethical Considerations
This study was conducted in accordance with ethical principles to protect the rights and welfare of participants.
Prior to data collection, participants were fully informed about the purpose of the research, the procedures
involved, and how their information would be used. Additionally, the researcher protected participant privacy and confidentiality, minimized the risk of harm, and guarded against bias to ensure fairness (Kumar, 2014).
informed consent is a fundamental principle, where the researcher must fully inform participants about the
purpose of the test, procedures, and potential risks and benefits (Creswell, 2017). Anonymity and confidentiality
were strictly maintained by excluding names or other personal identifiers from data collection, analysis, and
reporting. All information provided was treated with discretion to ensure privacy and security. The research
instruments, including tests, Likert-scale questionnaires, and evaluation surveys, were carefully designed to be
fair, unbiased, and respectful. Measures were taken to minimize any potential risks, and participants were treated
with dignity and fairness throughout the study. Debriefing and feedback were provided as appropriate to foster
transparency and trust. Ethical approval for this study was obtained from the University Research Ethics
Committee before data collection commenced.
Limitations of the study
This study centers on the specific generative AI tools students employ, how they integrate these technologies
into their learning practices, and the overall effects on their academic engagement within a selected college of
education in Ghana. While the findings illuminate significant aspects of AI adoption and student engagement
in this localized setting, caution is warranted in extending these conclusions to other higher education contexts.
Differences in institutional infrastructure, curricular frameworks, cultural attitudes toward technology, and
access to digital resources likely impact the deployment and influence of generative AI across various
educational environments. Consequently, the results contribute valuable localized insights while highlighting
the need for further studies to explore generative AI’s role and impact in other institutional and cultural
landscapes.
RESULTS AND DISCUSSION
Data analysis procedure
Data analysis was done using IBM SPSS 24, including cleaning for completeness and consistency. Descriptive statistics were used to analyse the quantitative survey data, whilst thematic analysis was used to analyse the qualitative data. The reliability of the survey instrument was confirmed using Cronbach's alpha (Cronbach, 1951) to ensure internal consistency of the items. We obtained a Cronbach's alpha value of 0.785, which indicates a high level of reliability, suggesting that the survey items consistently measure the same underlying concept. Construct validity, which assesses the accuracy with which a survey measures the theoretical construct it intends to measure, was confirmed through factor analysis, following the method outlined in Hair et al. (2009). This verifies that the questionnaire items are appropriately grouped under their respective constructs, further affirming the validity of the survey instrument used in this study.
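The internal-consistency check reported above can be sketched as follows. This is a minimal illustration, not the study's analysis: the item matrix is hypothetical, and the formula is the standard Cronbach's alpha computed from item and total-score variances (the same statistic SPSS reports in its Reliability Analysis).

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score lists (one inner list per
    questionnaire item, aligned by respondent); uses population variances."""
    k = len(items)
    n_resp = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items
    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical responses: three Likert items, four respondents each
alpha = cronbach_alpha([[4, 3, 5, 2], [4, 4, 5, 2], [3, 3, 4, 2]])
```

Items that move together across respondents yield alpha near 1; uncorrelated items drive it toward 0.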
Table 1: The Effect of Generative AI on Students' Academic Engagement

Dimension of Engagement | Generative AI Effect on Academic Engagement | M | Interpretation
Cognitive Engagement | Generative AI supports me in developing critical thinking and problem-solving skills. | 3.59 | High
Cognitive Engagement | Using generative AI encourages me to reflect on and improve my learning strategies. | 3.98 | High
Cognitive mean | | 3.785 | High
Participatory Engagement | Using generative AI in group work helps me collaborate more effectively with peers. | 3.56 | High
Participatory Engagement | Generative AI enhances my capacity to engage thoughtfully in academic conversations. | 3.49 | Moderate
Participatory mean | | 3.525 | High
Behavioral Engagement | Generative AI motivates me to dedicate more time to studying. | 3.57 | High
Behavioral Engagement | The assistance I get from generative AI leads me to engage more actively in class discussions. | 2.67 | Moderate
Behavioral mean | | 3.120 | Moderate
Emotional Engagement | Generative AI helps reduce my anxiety when faced with difficult assignments. | 3.93 | High
Emotional Engagement | Using generative AI boosts my confidence in my academic work. | 3.18 | Moderate
Emotional mean | | 3.555 | High
Total Weighted Mean | | 3.496 | High

NB: 1.00-1.49 = very low/strongly disagree; 1.50-2.49 = low/disagree; 2.50-3.49 = moderate/neutral; 3.50-4.49 = high/agree; 4.50-5.00 = very high/strongly agree
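The verbal interpretations in Table 1 follow the banding given in the NB note. A small sketch of that lookup, assuming (as the table's totals suggest, e.g. 3.496 interpreted as High) that means are rounded to two decimals before banding:

```python
def interpret_mean(m):
    """Map a 5-point Likert mean to the verbal band used in Table 1.
    Assumption: means are rounded to two decimals before banding."""
    m = round(m, 2)
    if m < 1.50:
        return "very low / strongly disagree"
    if m < 2.50:
        return "low / disagree"
    if m < 3.50:
        return "moderate / neutral"
    if m < 4.50:
        return "high / agree"
    return "very high / strongly agree"
```

For example, interpret_mean(2.67) falls in the moderate band, matching the table's reading of the class-discussion item.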
Table 1 shows the effect of using generative AI on students' academic engagement. Based on the overall result of Table 1, it is evident that generative AI has highly affected students' academic engagement.
Among the four dimensions of engagement, the highest mean value is in the cognitive domain, with a mean value of 3.785. Under cognitive engagement, the statement “Using generative AI encourages
me to reflect on and improve my learning strategies” and the statement “Generative AI supports me in developing critical thinking and problem-solving skills” have mean values of 3.98 and 3.59, respectively, both interpreted as high agreement.
Emotional engagement is the second highest dimension, with a mean value of 3.555. The statements “Using generative AI boosts my confidence in my academic work” and “Generative AI helps reduce my anxiety when faced with difficult assignments” have mean values of 3.18 and 3.93, indicating moderate and high agreement, respectively.
The next dimension of engagement was participatory engagement, with a mean value of 3.525. The statement “Using generative AI in group work helps me collaborate more effectively with peers” had a mean value of 3.56 (high), whilst the statement “Generative AI enhances my capacity to engage thoughtfully in academic conversations” had a mean value of 3.49 (moderate).
The lowest dimension of engagement, with a moderate interpretation and a mean value of 3.120, was behavioral engagement. The statements “The assistance I get from generative AI leads me to engage more actively in class discussions” and “Generative AI motivates me to dedicate more time to studying” had mean values of 2.67 and 3.57, indicating moderate and high agreement, respectively.
To gather qualitative data, focus group discussions and follow-up interviews with students were conducted. The
first theme that emerged was active engagement.
Excerpts from face-to-face interview and focus group discussion
Respondent 2: “GenAI has inspired me to dedicate more time to academic tasks. With motivation from GenAI,
I’m able to complete any assigned task given to me.”
Respondent 8: “I personally feel motivated using GenAI in all my searches.”
Respondent 13: “Generative AI has definitely helped me to become more active in class, especially in class contribution.”
Respondent 4: “GenAI has changed my passive mood in class. I’m now active when it comes to classroom interaction.”
The last theme that emerged was problem-solving.
Respondent 1 explained that GenAI has made solving complex questions much easier, noting that while some
responses are difficult to grasp, others provide clear understanding. Similarly, Respondent 5 affirmed that
entering challenging questions into GenAI enables effective problem-solving through critical thinking.
The study revealed that generative AI improved students’ problem-solving, critical thinking, and comprehension
skills, aligning with modern educational goals of developing twenty-first-century competencies. This finding
supports Buchanan et al. (2022), who noted that GenAI use promotes active participation in collaborative
problem-solving tasks.
Table 2: Significant Relationship between the Use of Generative AI and Students’ Academic Engagement

Indicators                                                           P value    Remarks                        Decision
The effect of Generative AI usage on students’ academic engagement   0.00001    Highly positive correlation    Null hypothesis rejected
Significant at 0.05
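For readers who wish to see how such a test is computed, the correlation and its significance check can be sketched in a few lines of Python. The scores below are purely hypothetical illustrations, not the study's data, and Pearson's r is assumed as the correlation statistic since the paper reports only the p-value:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t statistic for testing H0: no correlation, with df = n - 2."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# Hypothetical Likert-scale indices for six students (illustration only):
usage      = [2.1, 2.8, 3.0, 3.4, 3.7, 3.9]   # generative AI usage
engagement = [2.5, 2.9, 3.2, 3.3, 3.8, 4.0]   # academic engagement

r = pearson_r(usage, engagement)
t = t_statistic(r, len(usage))
# The null hypothesis is rejected at the 0.05 level when |t| exceeds the
# critical value of the t distribution with n - 2 degrees of freedom.
```

A p-value as small as the one reported (0.00001) corresponds to a t statistic far beyond the critical value, hence the rejection of the null hypothesis in Table 2.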
As shown in Table 2 above, for the relationship between the use of generative AI and students' academic
engagement in the selected college of education, a p-value of 0.00001 was obtained, which was lower than the
significance level of 0.05. This indicates a significant relationship between the use of generative AI and students'
academic engagement in the selected college of education. This means that the two variables have a highly
positive correlation. In the qualitative stage of the study, the researchers assessed how the utilization of
generative AI has affected the respondents' academic engagement. The first theme that emerged from the
experiences of most of the respondents was Eager Learners. The majority stated that their use of generative AI
has affected their academic engagement in school: by using AI, they have been able to recite, participate in
class discussions, and ask and answer questions. The respondents shared the following:
Focus Group Discussion Excerpt:
One participant shared that AI has boosted their academic engagement by offering support when they encounter
difficult lessons, which helps them respond confidently and participate actively in class activities. Another
commented that AI tools like ChatGPT have significantly aided their preparation for class presentations, making
them more interactive and engaged during discussions. Students’ engagement with generative AI positively
affects their academic involvement by making challenging lessons easier to understand, boosting their
willingness to participate, and helping them answer questions confidently. This aligns with the study by
Kurniati and Fithriani (2022), which found favorable student attitudes toward generative AI use in learning.
Conclusions drawn from the findings are as follows:
Usage of GenAI helps reduce students’ anxiety when faced with challenging questions.
Students also utilized generative AI to improve their understanding of grammar and challenging vocabulary
encountered online. The suggestions, corrections, and feedback from AI tools helped them refine their grammar
skills and grasp unfamiliar words.
Generative AI tools have fostered greater creativity and academic engagement among students by helping them
generate ideas, although some students may rely too heavily on AI, which can reduce engagement.
The use of generative AI among college of education students shows a strong positive correlation with their
academic engagement. These tools supported students in participating and interacting more effectively in class,
making it easier for them to learn and encouraging active involvement.
Study Implications
Generative AI holds considerable promise to revolutionize higher education by making learning experiences
more tailored, engaging, and interactive, particularly in settings with limited resources.
The strong positive link between generative AI use and academic engagement indicates that these technologies
can increase student motivation, participation, and confidence.
Nonetheless, the dual effects of generative AI require vigilant oversight to avoid dependency and to uphold
academic standards and integrity.
Educational institutions must revise curricula and teaching approaches to effectively incorporate AI tools while
also fostering students' digital literacy and responsible AI use skills.
Policymakers and curriculum developers should promote responsible AI adoption that benefits learners and
addresses the distinct challenges present in various educational contexts.
RECOMMENDATIONS
Educators and institutions should adopt generative AI tools responsibly to promote students’ cognitive,
emotional, behavioral, and participatory engagement
Generative AI should be used as a supportive aid to ease anxiety over complex tasks, improve problem-solving,
and stimulate creativity.
Future research should explore the long-term effects of generative AI on students’ academic skills and
engagement through longitudinal or experimental designs to establish clearer causal relationships.
REFERENCES
1. Adams, D. A., Nelson, R. R., & Todd, P. A. (1992). Perceived usefulness, ease of use, and usage of
information technology: A replication. MIS Quarterly, 16(2), 227–247. https://doi.org/10.2307/249577
2. Al Shamsi, J. H., Al-Emran, M., & Shaalan, K. (2022). Understanding key drivers affecting students’
use of artificial intelligence-based voice assistants. Education and Information Technologies, 27(6),
8071–8091. https://doi.org/10.1007/s10639-022-10947-3
3. Baidoo-Anu, D., & Owusu Ansah, L. (2023). Education in the era of generative Artificial Intelligence
(AI): Understanding the potential benefits of ChatGPT in promoting teaching and learning. SSRN
Electronic Journal. https://doi.org/10.2139/ssrn.4337484
4. Brown, J. S., Collins, A., & Harris, G. (1978). Artificial intelligence and learning strategies. In F. O.
Harold (Ed.), Learning strategies (pp. 107–139). Academic Press. https://doi.org/10.1016/B978-0-12-526650-5.50010-1
5. Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done? Qualitative
Inquiry, 6(1), 97–113.
6. Buchanan, R., Burridge, A., & Bowen, L. (2022). Generative artificial intelligence in education:
Enhancing critical thinking through collaborative problem-solving. Journal of Educational Technology
& Society, 25(3), 81–90.
7. Bulawan, A. A., Tilos, F. G., Bulawan, A., Samonte, J., Alejo, K., & Carasicas, M. (2023). The lived
experiences of students in learning with technology: A descriptive phenomenological research study.
International Journal of Advanced Multidisciplinary Research and Studies, 3(4), 438–441.
https://www.multiresearchjournal.com/arclist/list3.4/id-1445
8. Bureau, J. S., Howard, J. L., Chong, J. X., & Guay, F. (2022). Pathways to student motivation: A meta-
analysis of antecedents of autonomous and controlled motivations. Review of Educational Research,
92(1), 46–72. https://doi.org/10.3102/00346543211042426
9. Chau, P. Y. (1996). An empirical assessment of a modified technology acceptance model. Journal of
Management Information Systems, 13(2), 185–204. https://doi.org/10.1080/07421222.1996.11518128
10. Chiu, T. K. (2023a). Future research recommendations for transforming higher education with generative
AI. Computers and Education: Artificial Intelligence, 6, 100197.
https://doi.org/10.1016/j.caeai.2023.100197
11. Chiu, T. K., Ismailov, M., Zhou, X., Xia, Q., Au, C. K., & Chai, C. S. (2023a). Using self-determination
theory to explain how community-based learning fosters student interest and identity in integrated STEM
education. International Journal of Science and Mathematics Education, 21(S1), 109–130.
https://doi.org/10.1007/s10763-023-10382
12. Chiu, T. K., Moorhouse, B. L., Chai, C. S., & Ismailov, M. (2023b). Teacher support and student
motivation to learn with artificial intelligence (AI) based chatbot. Interactive Learning Environments,
32(7), 1–17. https://doi.org/10.1080/10494820.2023.2172044
13. Chuttur, M. (2009). Overview of the Technology Acceptance Model: Origins, developments, and future
directions. Sprouts: Working Papers on Information Systems, 9(37), 290.
https://aisel.aisnet.org/sprouts_all/290
14. Creswell, J. W. (2018). Research design: Qualitative, quantitative, and mixed methods approaches (5th
ed.). Thousand Oaks, CA: SAGE Publications.
16. Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research.
Thousand Oaks, CA: SAGE Publications.
17. Crompton, H., & Burke, D. (2023). Artificial intelligence in higher education: The state of the field.
International Journal of Educational Technology in Higher Education, 20(1), 1–22.
https://doi.org/10.1186/s41239-023-00392-8
18. Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A
comparison of two theoretical models. Management Science, 35(8), 982–1003.
https://doi.org/10.1287/mnsc.35.8.982
19. Deci, E. L., Olafsen, A. H., & Ryan, R. M. (2017). Self-determination theory in work organizations: The
state of a science. Annual Review of Organizational Psychology and Organizational Behavior, 4(1), 19–43.
https://doi.org/10.1146/annurev-orgpsych-032516-113108
20. Esiyok, E., Gokcearslan, S., & Kucukergin, K. G. (2025). Acceptance of educational use of AI chatbots
in the context of self-directed learning with technology and ICT self-efficacy of undergraduate students.
International Journal of Human–Computer Interaction, 41(1), 641–650.
https://doi.org/10.1080/10447318.2024.2303557
21. Estrellado, C. J. (2023). Artificial intelligence in the Philippine educational context: Circumspection and
future inquiries. International Journal of Scientific and Research Publications, 13(4), 1–6.
https://doi.org/10.29322/IJSRP.13.04.2023.p13704
22. Garito, M. A. (1991). Artificial intelligence in education: Evolution of the teaching–learning relationship.
British Journal of Educational Technology, 22(1), 41–47.
https://doi.org/10.1111/j.1467-8535.1991.tb00050.x
23. Gefen, D., & Straub, D. (2000). The relative importance of perceived ease of use in IS adoption: A study
of e-commerce adoption. Journal of the Association for Information Systems, 1(8), 1–30.
https://doi.org/10.17705/1jais.00008
24. Hagendorff, T. (2020). The ethics of AI ethics: An evaluation of guidelines. Minds and Machines, 30(1),
99–120. https://doi.org/10.1007/s11023-020-09517-8
25. Halaweh, M. (2023). ChatGPT in education: Strategies for responsible implementation. Contemporary
Educational Technology, 15(2), 421. https://doi.org/10.30935/cedtech/13036
26. Hashmi, N., & Bal, A. S. (2024). Generative AI in higher education and beyond. Business Horizons.
https://doi.org/10.1016/j.bushor.2024.05.005
27. Hidayat-ur-Rehman, I. (2024). Digital competence and students’ engagement: a comprehensive analysis
of smartphone utilization, perceived autonomy and formal digital learning as mediators. Interactive
Technology and Smart Education. https://doi.org/10.1108/itse-09-2023-0189
28. Howard, J. L., Bureau, J. S., Guay, F., Chong, J. X., & Ryan, R. M. (2021). Student motivation and
associated outcomes: A meta-analysis from self-determination theory. Perspectives on Psychological
Science, 16(6), 1300–1323. https://doi.org/10.1177/1745691620966789
29. Igbaria, M., Parasuraman, S., & Baroudi, J. (1996). A motivational model of microcomputer usage.
Journal of Management Information Systems, 13(1), 127–143.
https://doi.org/10.1080/07421222.1996.11518115
30. Johnson, B. (2023). NYC schools ban ChatGPT over academic dishonesty fears. New York Times.
31. Kelly, S., Kaye, S. A., & Oviedo-Trespalacios, O. (2023). What factors contribute to the acceptance of
artificial intelligence? A systematic review. Telematics and Informatics, 77, 101925.
https://doi.org/10.1016/j.tele.2022.101925
32. Kumar, R. (2014). Research methodology: A step-by-step guide for beginners (3rd ed.). Thousand Oaks,
CA: SAGE Publications.
33. Kurniati, E. Y., & Fithriani, R. (2022). Postgraduate students’ perceptions of QuillBot utilization in
English academic writing class. Journal of English Language Teaching and Linguistics, 7(3), 437.
https://doi.org/10.21462/jeltl.v7i3.852
34. Lai, C. Y., Cheung, K. Y., & Chan, C. S. (2023). Exploring the role of intrinsic motivation in ChatGPT
adoption to support active learning: An extension of the technology acceptance model. Computers and
Education: Artificial Intelligence, 5, 100178. https://doi.org/10.1016/j.caeai.2023.100178
35. Lewin, S., Glenton, C., & Oxman, A. D. (2009). Use of qualitative methods alongside randomised
controlled trials of complex healthcare interventions: Methodological study. British Medical Journal,
339, b3496.
36. Liang, J., Wang, L., Luo, J., Yan, Y., & Fan, C. (2023). The relationship between student interaction with
generative artificial intelligence and learning achievement: Serial mediating roles of self-efficacy and
cognitive engagement. Frontiers in Psychology, 14, 1285392. https://doi.org/10.3389/fpsyg.2023.1285392
37. Lim, W. M., Gunasekara, A., Pallant, J. L., Pallant, J. I., & Pechenkina, E. (2023). Generative AI and the
future of education: Ragnarök or reformation? A paradoxical perspective from management educators.
International Journal of Management in Education, 21(2), 100790.
https://doi.org/10.1016/j.ijme.2023.100790
38. Liu, G. L., Darvin, R., & Ma, C. (2024). Exploring AI-mediated informal digital learning of English (AI-
IDLE): A mixed-method investigation of Chinese EFL learners’ AI adoption and experiences. Computer
Assisted Language Learning, 1–29.
39. Norman, E. W., & Fraenkel, J. R. (2000). How to design and evaluate research in education. New York,
NY: McGraw-Hill.
40. O’Cathain, A., Murphy, E., & Nicholl, J. (2010). Three techniques for integrating data in mixed-methods
studies. British Medical Journal, 341, c4587.
41. Ocaña-Fernández, Y., Valenzuela-Fernández, L. A., & Garro-Aburto, L. L. (2019). Artificial intelligence
and its implications in higher education. Journal of Educational Psychology – Propositos y
Representaciones, 7(2), 553–568.
42. Okoye, M., & Mante, D. (2024). The nexus between artificial intelligence and STEM education:
Research on AI applications in higher education. Educational Technology Research and Development,
68(4), 1851–1861.
43. Opoku, M. O., & Enu-Kwesi, F. (2019). Relevance of the technology acceptance model (TAM) in
information management research: A review of selected empirical evidence. Research Journal of
Business and Management, 6(1), 55–62. https://doi.org/10.17261/Pressacademia.2019.1028
44. Oravec, J. (2023). Artificial intelligence implications for academic cheating: Expanding the dimensions
of responsible human-AI collaboration with ChatGPT and Bard. Journal of Interactive Learning
Research, 34(2), 213–237.
45. Popenici, S. A., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and learning
in higher education. Research and Practice in Technology Enhanced Learning, 12(1), 1–13.
https://doi.org/10.1186/s41039-017-0062-8
46. Rodrigues, M., Silva, R., Franco, M. A. P. B., & Oliveira, C. (2024). Artificial intelligence: Threat or
asset to academic integrity? A bibliometric analysis. Kybernetes. https://doi.org/10.1108/k-09-2023-1666
47. Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new
directions. Contemporary Educational Psychology, 25(1), 54–67. https://doi.org/10.1006/ceps.1999.1020
48. Ryan, R. M., & Deci, E. L. (2020). Intrinsic and extrinsic motivation from a self-determination theory
perspective: Definitions, theory, practices, and future directions. Contemporary Educational Psychology,
61, 101860. https://doi.org/10.1016/j.cedpsych.2020.101860
49. Scherer, R., & Siddiq, F. (2019). The relation between students’ socioeconomic status and ICT literacy:
Findings from a meta-analysis. Computers & Education, 138, 13–32.
https://doi.org/10.1016/j.compedu.2019.04.011
50. Sok, P., & Heng, C. (2023). ChatGPT and academic writing: Student perceptions in Cambodian
universities. Asian Journal of Education, 14(3), 245–259.
51. Van den Broeck, A., Howard, J. L., Van Vaerenbergh, Y., Leroy, H., & Gagné, M. (2021). Beyond
intrinsic and extrinsic motivation: A meta-analysis on self-determination theory’s multidimensional
conceptualization of work motivation. Organizational Psychology Review, 11(3), 240–273.
https://doi.org/10.1177/20413866211006173
52. Vansteenkiste, M., Sierens, E., Soenens, B., Luyckx, K., & Lens, W. (2009). Motivational profiles from
a self-determination perspective: The quality of motivation matters. Journal of Educational Psychology,
101(3), 671–688. https://doi.org/10.1037/a0015083
53. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin,
I. (2017). Attention is all you need. In Proceedings of the 31st Conference on Neural Information
Processing Systems (pp. 5998–6008). Long Beach, CA, USA.
54. Venkatesh, V., & Davis, F. D. (1996). A model of the antecedents of perceived ease of use: Development
and test. Decision Sciences, 27(3), 451–481. https://doi.org/10.1111/j.1540-5915.1996.tb01822.x
55. Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four
longitudinal field studies. Management Science, 46(2), 186–204.
https://doi.org/10.1287/mnsc.46.2.186.11926
56. Yang, Y., Luo, J., Yang, M., Yang, R., & Chen, J. (2024). From surface to deep learning approaches with
generative AI in higher education: An analytical framework of student agency. Studies in Higher
Education, 49, 817–830.
57. Yim, I. H. Y., & Su, J. (2024). Artificial intelligence (AI) learning tools in K-12 education: A scoping
review. Journal of Computers in Education, 12, 93–131. https://doi.org/10.1007/s40692-023-00304-9
59. Yousafzai, S. Y., Foxall, G. R., & Pallister, J. G. (2007a). Technology acceptance: A meta-analysis of
the TAM: Part 1. Journal of Modelling in Management, 2(3), 251–280.
https://doi.org/10.1108/17465660710834453