INTERNATIONAL JOURNAL OF RESEARCH AND INNOVATION IN SOCIAL SCIENCE (IJRISS)  
ISSN No. 2454-6186 | DOI: 10.47772/IJRISS | Volume IX Issue XII December 2025  
Exploring the Psychological Mechanisms behind AI Chatbot Adoption
Mariem Sboui  
Department of Marketing, University of Tunis  
Received: 14 December 2025; Accepted: 21 December 2025; Published: 31 December 2025  
ABSTRACT  
Objective: This study examines the fundamental determinants influencing the intention to adopt AI chatbots in  
the telecommunication services context in an emerging market. Specifically, it investigates the influence of  
perceived ease of use (PEOU), perceived usefulness (PU), initial trust, and technology anxiety on chatbot usage  
intention.  
Methodology: Data were collected via an online survey administered to 321 Tunisian higher education students. The hypotheses were tested using partial least squares structural equation modeling (PLS-SEM) with SmartPLS 4.
Results: The findings reveal that initial trust and perceived usefulness (PU) have a significant and positive effect  
on intention to use chatbots. In contrast, perceived ease of use (PEOU) and technology anxiety were found to be  
non-significant predictors.  
Managerial/societal implications: Managers should prioritize incorporating initial trust-building elements in chatbot design, such as privacy protection mechanisms and transparent conversations. They must also reinforce the perceived usefulness of chatbots through their ability to address user needs efficiently.
Originality: This study enriches the growing literature on the Technology Acceptance Model (TAM), shedding  
light on the psychological factors that shape users' decisions to engage with AI-chatbot services.  
Keywords: Chatbot services, perceived ease of use (PEOU), perceived usefulness (PU), initial trust, technology  
anxiety.  
INTRODUCTION  
Artificial Intelligence (AI) has undergone remarkable advancements in recent years, transforming societal behaviors and organizational processes. Its widespread incorporation into business operations has reshaped the functioning of various industries, enhancing automation and efficiency and driving customer-centric innovation (Rashid and Kausik, 2024). One of the most rapidly evolving areas is AI-enabled marketing communication, where the adoption of advanced technologies, particularly interactive messaging services known as “chatbot e-services”, has experienced substantial growth (Desaulniers, 2016). A growing number of brands are shifting from traditional customer service to chatbot-powered digital solutions, which offer scalability, consistency, and cost-efficiency (Forbes, 2017). The emergence of AI-based marketing has empowered organizations to integrate conversational agents as a pivotal component of their customer service strategies, significantly improving customer experience (Mehta et al., 2022; Wang, 2024).
Chatbots represent a fascinating field of communication between humans and AI (Liu et al., 2024). These programs communicate with users via natural language, employing voice, text, or both, simulating a human interaction (Liu et al., 2024). By serving as a 24/7 touchpoint for customer responses, chatbots offer significant cost-reduction potential by reducing the need for human agents (Bakhshi et al., 2018). According to a recent industry report, the global chatbot market is expected to grow to around $4.9 billion by 2032¹.
Despite the many benefits that conversational agents offer for improving services, customers have expressed frustration with chatbots owing to ambiguous questions and inadequate responses (Elliott, 2018). Trust represents a key determinant in fostering the adoption of chatbots (Jyothsna and Kryvinska, 2024). Trust holds significant importance in shaping user behavior and has been incorporated into the technology acceptance model to predict future actions (Kelly et al., 2023). While several studies have investigated the role of trust across various technologies, little research has centered on initial trust in AI chatbots (Jyothsna and Kryvinska, 2024). This highlights the need for additional investigation of this topic in Tunisia, where individuals express strong distrust of new technologies (Mostafa and Kasamani, 2022).
Additionally, prior studies show that, even for young users, rapidly transforming digital technologies generate  
technostress (Prior and Dwyer, 2023; Liu et al., 2021). Hence, digital natives may experience apprehension or  
unease when interacting with new technology, specifically in emerging markets. Following the suggestion of  
Park et al. (2019), we examined “technology anxiety” as an individual characteristic that may obstruct the  
adoption of AI chatbots. Therefore, examining the impact of technology-related anxiety on the adoption of AI  
chatbots is important in the context of developing markets (Khanfar et al., 2024).  
To better predict Tunisian students’ acceptance of AI chatbots, this study employed the widely established Technology Acceptance Model (TAM) (Davis, 1989). TAM remains a dominant framework for predicting users’ behavioral intentions to accept and use a new technology or system. Perceived ease of use (PEOU) and perceived usefulness (PU), two key elements of TAM, have been supported by extensive research demonstrating significant predictive power for users’ behavioral intention. Notably, these functional constructs may be less prominent among users familiar with digital tools, thereby calling into question the generalizability of TAM assumptions in such contexts (Bayaga and Du Plessis, 2023). Furthermore, one of the notable limitations of TAM lies in its premise that users engage in a fully rational decision-making process when adopting a novel technology (Ahn, 2023). However, it is increasingly recognized that user decisions are often driven by emotional as well as rational factors (Ahn, 2022). This highlights the crucial need to examine users’ psychological states in the adoption of new technology (Ahn, 2023).
This study addresses this research gap by integrating technology anxiety and initial trust into an extended TAM to examine the determinants of chatbot adoption among students in an emerging market. By focusing on this underexplored segment, the study contributes to a more contextually grounded understanding of chatbot adoption in the telecommunication sector, offering theoretical insights and practical implications for marketers and service designers.
THEORETICAL BACKGROUND  
Chatbot services  
Chatbots, also referred to as conversational agents (CAs), are artificial intelligence (AI)-based systems designed to communicate with users through natural language processing (NLP). As defined by Ciechanowski et al. (2019, p. 540), a chatbot “is a software program that interacts with users using natural language”. Because chatbots can interact with customers continuously, 24 hours a day and seven days a week, irrespective of standard working hours, companies are rapidly integrating them to improve efficiency, competitiveness, and customer service (Sharma et al., 2022; Hyun et al., 2022; Troshani et al., 2021).
Chatbots have become increasingly popular in various service sectors, including banking (Lubbe and Ngoma, 2021; Pillai and Sivathanu, 2020), hospitality (Alotaibi and Hidayat-ur-Rehman, 2025), and e-commerce (Araújo and Casais, 2020). However, the literature on chatbot adoption in telecommunication services is relatively scant, particularly in the specific context of emerging countries (Fallaque, 2024; Sboui et al., 2024). Previous research within the telecommunications sector has investigated the role of chatbot interactions in improving customer satisfaction (Shin et al., 2023; Fallaque, 2024), yet a significant gap remains in understanding how these agents stimulate the intention to use chatbots in this sector within emerging markets.
The Technology Acceptance Model (TAM)  
The Technology Acceptance Model (TAM), developed by Davis (1989), is grounded in the Theory of Reasoned Action (TRA) and provides a valuable framework for investigating users’ acceptance of information technologies. This model has been widely applied in the field of information systems (IS) and in chatbot-related studies (De Cicco et al., 2022).
The TAM posits that user engagement with a specific information technology application, such as a chatbot, depends on behavioral intention, which refers to the extent to which a user is inclined to perform a particular behavior (Qatawneh et al., 2024). The model holds that perceived ease of use (PEOU) and perceived usefulness (PU) are the two crucial determinants of user acceptance (Marikyan and Papagiannidis, 2024). PEOU represents users' anticipation of an uncomplicated and intuitive interaction (Na et al., 2022), while PU refers to an individual's belief that adopting the technology offers significant advantages and enhanced performance (Bailey et al., 2022). According to this framework, users are likely to adopt a chatbot when they find it user-friendly and perceive it as highly performant.
In the era of AI, TAM is widely acknowledged as the most prevalent theoretical framework for assessing usage  
intentions and customer acceptance across multiple technological applications (Kelly et al., 2023; Ibrahim et al.,  
2025). A broad range of empirical studies has consistently confirmed its effectiveness in explaining and  
predicting users' intentions to adopt and utilize various IT systems and applications (King and He, 2006).  
By leveraging TAM, this study can comprehensively assess the factors driving the intention to use the chatbot  
services, thereby providing actionable insights to enhance customer engagement in emerging markets.  
Hypothesis Development  
Perceived ease of use and usage intention  
Perceived ease of use represents “the degree to which a person believes that using a particular system would be free of effort” (Davis, 1989). It refers to the extent to which a system operates smoothly without
specialized knowledge or specific expertise from users (Amin et al., 2014; Jo, 2022). Ease of use is a determinant of technology acceptance, insofar as consumers are likely to adopt technology that is easily understood and user-friendly (Davis, 1989). Previous studies show that when users experience minimal effort in operating technologies, their overall adoption increases, leading to higher engagement (Chen et al., 2022; Iqbal and Campbell, 2023).
In the context of chatbot adoption, the concept of perceived ease of use has been widely examined (Araújo and Casais, 2020; De Cicco et al., 2022; Gopinath and Kasilingam, 2023; Li et al., 2023). Specifically, prior studies have investigated its effect on usage intention (Pillai and Sivathanu, 2020; Maduku et al., 2024; Pan et al., 2025). The greater the PEOU of an innovative technology, the more likely consumers are to adopt it (Alotaibi and Hidayat-ur-Rehman, 2025). Hence, the following hypothesis:
H1: Perceived ease of use positively influences chatbot usage intention.
Perceived usefulness and usage intention  
Perceived Usefulness is related to the degree to which a technological system enhances user efficiency and aligns  
with their preferences (Gani et al., 2024). When customers consider a system or application an effective means  
of facilitating their tasks, they are more likely to use it again (Muslichah, 2018). Any service that saves customers  
time and offers personalized services and flexibility improves users' attitude toward the service provider (Eger et al., 2021).
Previous research has highlighted the relevance of PU to customers' intention to use emergent technologies (Gümüş and Çark, 2021; Pan et al., 2025). In the context of chatbots, when customers recognize the tangible benefits of using a chatbot, such as smoother interactions or process automation, their intention to adopt the technology increases (Zhang et al., 2023). Hence, favorable evaluations of a chatbot’s utility improve users' willingness to interact with it (Song et al., 2024), as people are more predisposed to adopt technologies that demonstrate clear advantages (Portz et al., 2019). Chatbot systems that offer personalized suggestions and anticipate user needs enhance the perception of usefulness.
Given these insights, users who recognize the chatbot as useful in facilitating the completion of various tasks are  
more likely to intend to use it in the future. Therefore, PU is a crucial determinant of intention in chatbot services.  
Hence, we posit the following hypothesis:  
H2: Perceived usefulness positively influences chatbot usage intention.
Initial trust and usage intention  
Trust was identified as the most significant factor in predicting attitude and the intention to use an AI system  
(Zarifis and Cheng, 2023). This concept is defined as the willingness of one party to be vulnerable to the actions  
of another based on positive expectations about the latter's motivations or behavior (Hong and Cha, 2013). Trust  
manifests in two main forms: initial trust and ongoing trust (Pena et al., 2021). Trust building is a dynamic  
process marked by a gradual transition from initial trust to ongoing trust development (Chakraborty et al., 2022;  
Siau and Wang, 2018). Initial trust assumes that users have no information before the preliminary encounter  
(Talwar et al., 2020).  
Initial trust refers to "trust in an unknown party or without prior use" (Fan et al., 2020). It represents a form of  
trust that arises without prior experience and is distinct from experiential trust due to its temporal dimension  
(Koufaris and Hampton-Sosa, 2004). In the realm of emerging technologies, numerous studies have highlighted the importance of investigating initial trust, particularly when consumers are confronted with uncertainty before utilizing new technologies (Li et al., 2008; Mostafa and Kasamani, 2022; Talwar et al., 2020). Specifically, in the case of chatbots, the existing literature has emphasized the pivotal role of initial trust in customer usage intention (Jyothsna and Kryvinska, 2024; Mostafa and Kasamani, 2022; Kaabachi et al., 2019). Consequently, our study suggests that initial trust toward chatbots can enhance consumers' intention to adopt and use them. Hence, we posit the following hypothesis:
H3: Initial trust in chatbots positively influences chatbot usage intention.  
Anxiety and usage intention  
Technological anxiety represents the extent to which a customer feels apprehensive or uneasy when using a  
technology (Meuter et al., 2005; Pillai et al., 2024). This emotional response engenders trepidation and fear  
toward adopting new technological tools (Venkatesh, 2000). Recognized as a substantial psychological barrier, technological anxiety has been shown to reduce individuals' willingness to engage with technology and to cause confusion (Meuter et al., 2005). The literature has shown that technological anxiety leads to avoidance of novel technology, becoming a major impediment to adoption (Lam et al., 2008; Mani and Chouk, 2018; Wang et al., 2020). It even risks overshadowing the advantages of technology and diverting consumers' engagement (Man et al., 2024). This implies that individuals with high levels of technological anxiety are less inclined to use or adopt technology (Alboqami, 2023; Pillai and Sivathanu, 2020), hence the following hypothesis:
H4: Technology anxiety negatively affects the intention to adopt chatbots.  
Figure 1. Conceptual Framework
Source(s): The authors’ own work
METHODOLOGY  
Data Collection and Sample  
To investigate the adoption of chatbot technologies among digital-native users in an emerging market context,  
data were collected with an online survey administered via Google Forms. The target population consisted of  
university students in Tunisia, who are generally considered digital natives (Dinh and Park, 2023). A convenience
sampling method was used, given its appropriateness for reaching the intended population efficiently and  
effectively (Jager et al., 2017).  
Before the data collection phase, the questionnaire underwent a readability check. A pilot test with five students  
was conducted to ensure item clarity. The final questionnaire, available in English and Arabic, was distributed  
through email and social media. The data collection process spanned eight weeks. To simulate real chatbot  
interaction, participants were first invited to interact with "Djingo Damdoum", a customer service chatbot  
developed by the telecommunications company Orange Tunisia. Notably, this locally developed chatbot
supports Tunisian dialect, French, and Modern Standard Arabic, enhancing linguistic accessibility. Following  
this interaction, participants were asked to pose between three and six questions to the chatbot on topics like  
services, promotions, or problem resolution in the language of their choice. The final sample is composed of  
321 respondents, including 77 males (24%) and 244 females (76%).  
Measures  
The constructs of the current study were measured using validated scales, slightly modified and adapted from  
the literature. The items for the constructs perceived ease of use (PEOU) and perceived usefulness (PU) were taken from Davis’s study (1989). The chatbot initial trust scale was adapted from Oliveira et al. (2014). The items developed by Meuter et al. (2005) were used to measure technology anxiety. All items were measured
on seven-point Likert scales, with 1 “strongly disagree” and 7 “strongly agree.” The study constructs and their  
sources are presented in Table 1.  
Table 1. Variable measures

Ease of Use (Davis, 1989)
EU1. My interaction with chatbots would be clear and understandable
EU2. I would find chatbots easy to use
EU3. Learning to operate chatbots would be easy for me

Perceived Usefulness (Davis, 1989)
PU1. I find chatbot services useful in the purchasing process
PU2. Using chatbot services enables me to accomplish the purchasing process quickly
PU3. Using chatbot services increases my efficiency in the purchasing process
PU4. Overall, I find chatbots useful to me

Technology anxiety (Meuter et al., 2005)
TA1. I feel apprehensive about using technology
TA2. Technical terms sound like confusing jargon to me
TA3. I have avoided technology because it is unfamiliar to me
TA4. I hesitate to use most forms of technology for fear of making mistakes I cannot correct

Chatbot initial trust (Oliveira et al., 2014)
CIT1. Chatbots seem dependable
CIT2. Chatbots seem secure
CIT3. Chatbots were created to help the client
CIT4. Chatbots seem trustworthy

Usage Intention (Venkatesh et al., 2003)
UI1. I intend to use chatbot services the next time I make an online purchase
UI2. I will probably use chatbot services the next time I make an online purchase
UI3. I have decided to use chatbot services the next time I make an online purchase
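Because all constructs are measured with multi-item seven-point Likert scales, the analyses reported in the next section can be illustrated on item-level data. The sketch below is only a hypothetical setup (the file name and the equally weighted construct scores are assumptions; SmartPLS 4 uses optimally weighted composites), and the later illustrative snippets in this paper build on it.

```python
import pandas as pd

# Hypothetical item-level survey data: one row per respondent, one column per item
# (EU1..EU3, PU1..PU4, TA1..TA4, CIT1..CIT4, UI1..UI3), each coded 1 ("strongly
# disagree") to 7 ("strongly agree"). The file name is an assumption.
survey = pd.read_csv("chatbot_survey.csv")

blocks = {
    "EU":  ["EU1", "EU2", "EU3"],
    "PU":  ["PU1", "PU2", "PU3", "PU4"],
    "TA":  ["TA1", "TA2", "TA3", "TA4"],
    "CIT": ["CIT1", "CIT2", "CIT3", "CIT4"],
    "UI":  ["UI1", "UI2", "UI3"],
}

# Equally weighted construct scores (a simplification of PLS composite scores).
scores = pd.DataFrame({name: survey[items].mean(axis=1) for name, items in blocks.items()})
print(scores.describe())
```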
RESULTS  
Common Method Bias Analysis
Data were analyzed using the PLS-SEM estimation method in SmartPLS 4, which is widely applied in the social sciences. To examine common method variance (CMV), the method recommended by Podsakoff et al. (2003) was used. Harman's single-factor test showed that a single factor explained 19.47% of the total variance, which is lower than 50%, indicating the absence of significant common method bias (Fuller et al., 2016).
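As an illustration of this Harman-type check, a single unrotated principal component can be extracted and its share of variance compared against the 50% threshold. The sketch below assumes the hypothetical survey DataFrame and item blocks defined earlier and only approximates the procedure; it does not reproduce the authors' exact analysis.

```python
from sklearn.decomposition import PCA

# Pool all items and standardize them so the PCA reflects the correlation structure.
items = survey[sum(blocks.values(), [])]
z = (items - items.mean()) / items.std()

# Share of total variance captured by a single unrotated factor.
single_factor_share = PCA(n_components=1).fit(z).explained_variance_ratio_[0]
print(f"Variance explained by one factor: {single_factor_share:.2%}")
# The paper reports 19.47%, well below the 50% threshold (Fuller et al., 2016).
```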
Exploratory Factor Analysis  
To assess the reliability and validity of the measurement scales, we applied exploratory factor analysis using principal component analysis. Item CIT3 was excluded because its communality (0.429) was below 0.5. All remaining constructs demonstrated satisfactory reliabilities (above 0.7).
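A communality screen of the kind described above can be sketched as follows, again on the hypothetical data; components are retained with the Kaiser criterion, which is an assumption, since the authors' exact extraction settings are not reported.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

pca = PCA().fit(z)                                  # z: standardized items from above
retained = pca.explained_variance_ > 1              # Kaiser criterion (assumption)

# Item loadings on the retained components, and communalities as their squared sums.
loadings = pca.components_[retained].T * np.sqrt(pca.explained_variance_[retained])
communalities = pd.Series((loadings ** 2).sum(axis=1), index=items.columns)

# Items with communality below 0.5 (CIT3 = 0.429 in the paper) are candidates for removal.
print(communalities[communalities < 0.5])
```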
Measurement Model  
The results presented in Table 2 confirm the reliability and validity of the measurement model. All Cronbach’s α values are above 0.7, indicating strong internal consistency. All outer loadings were above 0.7 (Hair et al., 2019), indicating good indicator reliability, except for item TA4, which was eliminated due to its outer loading of 0.617 (i.e., below the recommended threshold of 0.7). The composite reliability (CR) values exceed 0.7, validating the internal consistency of each construct (Hair et al., 2019). Additionally, the Average Variance Extracted (AVE) values confirm convergent validity, as they exceed 0.5 (Henseler et al., 2016). As shown in Table 3, all HTMT values are below 0.9, indicating discriminant validity (Henseler et al., 2015).
Table 2. Reliability and Convergent Validity Assessment

Construct | Cronbach's alpha | Composite reliability (rho_a) | Composite reliability (rho_c) | Average variance extracted (AVE)
CIT | 0.965 | 0.965 | 0.977 | 0.935
EU | 0.813 | 0.842 | 0.886 | 0.721
PU | 0.930 | 0.934 | 0.950 | 0.828
TA | 0.824 | 0.833 | 0.894 | 0.738
UI | 0.955 | 0.956 | 0.971 | 0.918
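The indices in Table 2 follow standard formulas, sketched below with illustrative inputs; this is not the SmartPLS 4 output itself, and the example loadings are hypothetical.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(block: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of the summed scale)."""
    k = block.shape[1]
    return k / (k - 1) * (1 - block.var(ddof=1).sum() / block.sum(axis=1).var(ddof=1))

def composite_reliability(loadings) -> float:
    """rho_c = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    lam = np.asarray(loadings, dtype=float)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized outer loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).mean())

# Hypothetical usage with the Usage Intention block and illustrative loadings:
# print(cronbach_alpha(survey[["UI1", "UI2", "UI3"]]))
# print(composite_reliability([0.96, 0.95, 0.96]))        # ~0.97, cf. Table 2
# print(average_variance_extracted([0.96, 0.95, 0.96]))   # ~0.91, cf. Table 2
```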
Table 3. Discriminant Validity (HTMT Criterion)

    | CIT   | EU    | PU    | TA    | UI
CIT |       |       |       |       |
EU  | 0.266 |       |       |       |
PU  | 0.350 | 0.486 |       |       |
TA  | 0.055 | 0.065 | 0.154 |       |
UI  | 0.481 | 0.288 | 0.590 | 0.42  |
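The HTMT criterion of Henseler et al. (2015) is the average correlation between items of two different constructs divided by the geometric mean of the average within-construct item correlations. A minimal sketch, assuming the hypothetical survey DataFrame and item blocks introduced earlier:

```python
import numpy as np
import pandas as pd

def htmt(data: pd.DataFrame, block_i: list[str], block_j: list[str]) -> float:
    """Heterotrait-monotrait ratio of correlations for two item blocks."""
    corr = data[block_i + block_j].corr().abs()
    heterotrait = corr.loc[block_i, block_j].to_numpy().mean()

    def mean_within(block):
        c = corr.loc[block, block].to_numpy()
        return c[np.triu_indices_from(c, k=1)].mean()   # off-diagonal correlations only

    return heterotrait / np.sqrt(mean_within(block_i) * mean_within(block_j))

# Hypothetical usage: discriminant validity of PU versus UI
# (values below 0.90 support discriminant validity, as in Table 3).
# print(htmt(survey, blocks["PU"], blocks["UI"]))
```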
Structural Model  
First, we assessed multicollinearity. Following the recommendations of MacKenzie et al. (2012), the item CIT4 was eliminated because its variance inflation factor exceeded 10 (VIF = 25.738). The structural model is shown in Figure 2.
Figure 2: Structural Model  
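Variance inflation factors of the kind used for this screen can be computed with statsmodels; the sketch below is illustrative and applies the check to the hypothetical initial-trust indicators, where the paper reports that CIT4 exceeded the threshold of 10.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(frame: pd.DataFrame) -> pd.Series:
    """VIF of each column when regressed on all the others (with an intercept)."""
    X = sm.add_constant(frame)
    return pd.Series({col: variance_inflation_factor(X.values, i)
                      for i, col in enumerate(X.columns) if col != "const"})

# Hypothetical usage at the indicator level for the initial-trust block
# (CIT3 was already dropped in the EFA step); values above 10 flag redundancy.
# print(vif_table(survey[["CIT1", "CIT2", "CIT4"]]))
```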
Subsequently, we analyzed the hypothesized relationships between the constructs. The results illustrated in Table 4 show that hypotheses H2 and H3 are supported, while H1 and H4 are rejected. Among the variables that influence chatbot usage intention, perceived usefulness was found to have the most significant effect (β = 0.468, t = 8.931), followed by chatbot initial trust (β = 0.309, t = 6.336).
Table 4. Results

Hypothesis | Path | Original sample (O) | T statistics | P values | Remarks
H1 | EU -> UI | 0.001 | 0.018 | 0.985 | Not supported
H2 | PU -> UI | 0.468 | 8.931 | 0.000 | Supported
H3 | CIT -> UI | 0.309 | 6.336 | 0.000 | Supported
H4 | TA -> UI | -0.088 | 1.495 | 0.135 | Not supported
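The t statistics in Table 4 come from bootstrapping the PLS path coefficients in SmartPLS 4. As a rough approximation only, the sketch below bootstraps standardized OLS coefficients of usage intention on the four equally weighted construct scores defined earlier; it illustrates the logic behind Table 4 rather than reproducing the PLS-SEM estimates.

```python
import numpy as np
import pandas as pd

def bootstrap_paths(scores: pd.DataFrame, outcome: str, predictors: list[str],
                    n_boot: int = 5000, seed: int = 123) -> pd.DataFrame:
    """Standardized regression weights with bootstrap standard errors and t values."""
    rng = np.random.default_rng(seed)
    z = (scores - scores.mean()) / scores.std(ddof=1)   # standardize -> beta weights
    X = np.column_stack([np.ones(len(z)), z[predictors].to_numpy()])
    y = z[outcome].to_numpy()

    full = np.linalg.lstsq(X, y, rcond=None)[0][1:]      # point estimates (drop intercept)
    boot = np.empty((n_boot, len(predictors)))
    for b in range(n_boot):
        idx = rng.integers(0, len(z), size=len(z))       # resample respondents
        boot[b] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0][1:]

    return pd.DataFrame({"beta": full, "t": full / boot.std(axis=0, ddof=1)},
                        index=predictors)

# Hypothetical usage on the construct scores defined earlier:
# print(bootstrap_paths(scores, "UI", ["EU", "PU", "CIT", "TA"]))
```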
DISCUSSION  
Drawing on the TAM, this study investigated the impact of perceived ease of use, perceived usefulness, initial trust, and technology anxiety on chatbot usage intention.
The results of this research confirm that initial trust has a significant influence on the intention to use chatbots. This result aligns with the majority of previous research findings (Sboui et al., 2024; Rajaobelina et al., 2021; Kasilingam, 2020; Moussawi et al., 2021), confirming that when individuals have high initial trust in chatbots, they are more likely to intend to use them. Thus, increased initial trust in chatbots fosters the intention to engage with these systems and reduces behavioral ambiguity towards them (Sboui et al., 2024).
Moreover, the findings highlight the critical role of perceived usefulness (PU) in shaping the intention to use AI chatbots. Existing research (Ismatullaev and Kim, 2024; Norzelan et al., 2024; Davis et al., 1989; Rafique et al., 2020) reveals that users’ perceptions of a technology's usefulness have a significant effect on their intention to adopt it. As users recognize the benefits of chatbot functionality for their needs, their adoption of the technology increases (Topsakal, 2024).
In contrast, the results showed that perceived ease of use was not a significant predictor of the intention to adopt chatbots. This result diverges from the traditional TAM framework, which consistently highlights PEOU as a key determinant of intention. Our results align with prior studies suggesting that as users develop proficiency in digital tools, ease of use becomes less impactful in their adoption of technologies (Liu et al., 2016; Van Eeuwen, 2017; Gopinath and Kasilingam, 2023; Topsakal, 2024). In digital environments, highly educated Internet users frequently employ AI systems for various tasks; these interactions reduce the perceived skills required to interact with chatbots (Gopinath and Kasilingam, 2023). The participants in our study seem to be relatively familiar with AI chatbots and did not perceive ease of use as an essential determinant of chatbot adoption. They seek to obtain the knowledge they need rather than requiring ease of use from an AI system (Chen and Barnes, 2007). Thus, the willingness to adopt a novel technology such as chatbots is driven primarily by its perceived usefulness rather than by the ease of use of the system (Topsakal, 2024).
Furthermore, technology anxiety does not play a significant role in shaping customers’ intention to use AI chatbots, contrary to our hypothesis and previous research (Li et al., 2021; Melián-González et al., 2021). However, this outcome aligns with the results of Foroughi et al. (2025). This finding may be explained by the fact that students tend not to perceive chatbots as risky or overly complex, likely due to their habitual and frequent interaction with digital technologies and chatbot services. As a demographic characterized by high digital literacy and the effortless incorporation of technology into daily life (Ayanwale and Ndlovu, 2024; Tian et al., 2024; Ragheb et al., 2022), students are generally more at ease when engaging with digital innovations. Moreover, as noted by Foroughi et al. (2025), interaction with conversational agents is often voluntary rather than obligatory, which may alleviate the anxiety commonly associated with new technologies. Indeed, prior research shows that older people are more anxious about novel technologies than younger people (Mariano et al., 2022; Wang et al., 2020), offering a perspective for future research into older people's intention to use chatbots.
Theoretical Implications  
Our study offers several significant theoretical contributions to the existing body of literature. First, it integrates key research variables to develop a holistic framework for forecasting user adoption of AI-driven customer chatbot services. The strength of our model lies in extending the Technology Acceptance Model (TAM) with two additional factors: technology anxiety, as suggested by Dinh and Park (2023) and Ko and Chang (2019), and initial trust, as recommended by Mostafa and Kasamani (2022). To the best of our knowledge, the present work is among the first to explore the combined impact of these two psychological factors alongside the cognitive determinants of the TAM, providing novel theoretical insights into the mechanisms driving technology adoption. Thus, the integrated conceptual model of the present study serves as a guidepost for future research on the adoption of chatbots, specifically among higher education students in Tunisia.
Second, although our study focuses on tech-savvy digital natives, perceived ease of use and technology anxiety are not significant predictors of chatbot usage intention. These results contradict the traditional assumptions of TAM, which generally position ease of use as a key determinant and anxiety as a psychological inhibitor in extended TAM models. Indeed, the younger generation in emerging markets has greater exposure to technology and digital expertise (Dinh and Park, 2023) and therefore regards ease of use as a less important determinant, while instrumental benefits such as perceived usefulness gain prominence. Similarly, the lack of a significant relationship between technology anxiety and intention highlights the limited relevance of this construct for digitally native users familiar with AI tools. These findings open new avenues for expanding TAM theories in future research by underscoring the need to tailor current theoretical frameworks to specific technological contexts.
Finally, the effect of initial trust on the intention to adopt chatbots has received minimal attention from scholars (Ameen et al., 2021; Prakash et al., 2023). Hence, following prior studies' perspectives (Foroudi et al., 2018; Van den Broeck et al., 2019), our study contributes to the literature by demonstrating that initial trust emerged as a strong determinant, reinforcing the theoretical argument that early perceptions of a system’s trustworthiness and credibility are critical in shaping technology adoption, particularly when users have limited familiarity with the system.
Managerial Implications
From a managerial standpoint, the study offers valuable insights for marketing managers on the strategic  
implementation of conversational agent services within the emerging market context. First, building initial trust
with the chatbot is identified as a priority. Companies should embed design features that improve perceptions of  
credibility, such as privacy protection mechanisms, transparent conversations, and empathetic communication
styles. These strategies contribute to a more trustworthy user experience. Second, communication strategies  
should emphasize the functional advantages of chatbot use, particularly their ability to offer rapid support, which can reinforce perceived usefulness (Arce-Urriza et al., 2025). Specifically, managers should focus on bolstering
the positive factors that influence adoption, such as the utility of chatbots in providing effective assistance. By  
reinforcing the perceived usefulness of chatbots through their ability to address user needs efficiently,  
organizations can enhance the willingness to adopt chatbot services, particularly among the young generation.  
Finally, while ease of use and technological anxiety may be less significant among students, these determinants  
could play an important role in other demographic segments. Thus, tailored strategies based on users’ digital  
proficiency are recommended.  
Limitations and Future Research Directions  
Despite its significant contributions, this study has several limitations that should be taken into account in future  
research. First, the survey was conducted with university students only, thereby limiting the generalizability of the findings to other populations. Future studies should broaden the sample to include a more representative population across different contexts.
Second, this study examined technology anxiety as an exogenous variable. Future research could explore its moderating effect to gain deeper insights into its influence on behavioral intention. In addition, the inclusion of other
determinants like chatbot anthropomorphism and social influence would strengthen the prediction of the research  
model.  
Finally, another limitation is the use of cross-sectional data. Future studies could adopt a longitudinal approach  
to capture changes in customers' experiences with chatbots over time.  
REFERENCES  
1. Ahn, H. (2022). Emotional intelligence as a personality trait that predicts consumption behavior: The role of consumer emotional intelligence in persuasive communication. Sustainability, 14, 15461.
2. Ahn, H. (2023). Unrevealing voice search behaviors: Technology acceptance model meets anthropomorphism in understanding consumer psychology in the US market. Sustainability, 15(23), 16455.
3. Alboqami, H. (2023). Factors affecting consumers' adoption of AI-based chatbots: The role of anthropomorphism. American Journal of Industrial and Business Management, 13(4), 195-214.
4. Alotaibi, M., & Hidayat-ur-Rehman, I. (2025). An empirical analysis of user intention to use chatbots for airline tickets consultation. Journal of Science and Technology Policy Management, 16(1), 204-228.
5. Ameen, N., Tarhini, A., Reppel, A., & Anand, A. (2021). Customer experiences in the age of artificial intelligence. Computers in Human Behavior, 114, 106548.
6. Amin, M., Rezaei, S., & Abolghasemi, M. (2014). User satisfaction with mobile websites: The impact of perceived usefulness (PU), perceived ease of use (PEOU) and trust. Nankai Business Review International, 5(3), 258-274.
7. Araújo, T., & Casais, B. (2020). Customer acceptance of shopping-assistant chatbots. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and smart technologies (Vol. 167, pp. 278-287).
8. Arce-Urriza, M., Chocarro, R., Cortiñas, M., & Marcos-Matás, G. (2025). From familiarity to acceptance: The impact of generative artificial intelligence on consumer adoption of retail chatbots. Journal of Retailing and Consumer Services, 84, 104234.
9. Ayanwale, M. A., & Ndlovu, M. (2024). Investigating factors of students' behavioral intentions to adopt chatbot technologies in higher education: Perspective from expanded diffusion theory of innovation. Computers in Human Behavior Reports, 14, 100396.
10. Bailey, D. R., Almusharraf, N., & Almusharraf, A. (2022). Video conferencing in the e-learning context: Explaining learning outcome with the technology acceptance model. Education and Information Technologies, 27(6), 7679-7698. https://doi.org/10.1007/s10639-022-10949-1
11. Bakhshi, N., van den Berg, H., Broersen, S., de Vries, D., Bouazzaoui, H. E., & Michels, B. (2018). Chatbots point of view. Deloitte. Retrieved December 20, 2022.
12. Bayaga, A., & Du Plessis, A. (2023). Ramifications of the Unified Theory of Acceptance and Use of Technology (UTAUT) among developing countries' higher education staffs. Education and Information Technologies.
13. Chen, Y.-H., & Barnes, S. (2007). Initial trust and online buyer behaviour. Industrial Management & Data Systems, 107(1), 21-36.
14. Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human-chatbot interaction. Future Generation Computer Systems, 92.
15. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340. https://doi.org/10.2307/249008
16. De Cicco, R., Iacobucci, S., Aquino, A., Romana Alparone, F., & Palumbo, R. (2021, November). Understanding users' acceptance of chatbots: An extended TAM approach. In International Workshop on Chatbot Research and Design (pp. 3-22).
17. Desaulniers, S. (2016). Chatbots rise, and the future may be 're-written'. Development, 82(2), 1330.
18. Dinh, C. M., & Park, S. (2024). How to increase consumer intention to use chatbots? An empirical analysis of hedonic and utilitarian motivations on social presence and the moderating effects of fear across generations. Electronic Commerce Research, 24(4), 2427-2467.
19. Eger, L., Komárková, L., Egerová, D., & Mičík, M. (2021). The effect of COVID-19 on consumer shopping behaviour: Generational cohort perspective. Journal of Retailing and Consumer Services, 61.
20. Elliott, C. (2018). Chatbots are killing customer service. Here's why. European Conference on Knowledge Management, Vol. 25, pp. 1115-1122.
21. Fallaque, C. A. H. (2024). Impact of chatbots on satisfaction and loyalty in Lima's telecom sector.
22. Følstad, A., Araujo, T., Papadopoulos, S., Law, E. L. C., Granmo, O. C., Luger, E., & Brandtzaeg, P. B. (2020). Chatbot research and design. Amsterdam: Springer International Publishing.
23. Forbes. (2017). How chatbots improve customer experience in every industry: An infographic.
24. Foroudi, P., Gupta, S., Sivarajah, U., & Broderick, A. (2018). Investigating the effects of smart technology on customer dynamics and customer experience. Computers in Human Behavior, 80, 271-282.
25. Fuller, C. M., Simmering, M. J., Atinc, G., Atinc, Y., & Babin, B. J. (2016). Common methods variance detection in business research. Journal of Business Research, 69(8), 3192-3198.
26. Gani, M. O., Rahman, M. S., Bag, S., & Mia, M. P. (2024). Examining behavioural intention of using smart health care technology among females: Dynamics of social influence and perceived usefulness. Benchmarking: An International Journal, 31(2), 330-352.
27. Gopinath, K., & Kasilingam, D. (2023). Antecedents of intention to use chatbots in service encounters: A meta-analytic review. International Journal of Consumer Studies, 47(6), 2367-2395.
28. Gümüş, N., & Çark, Ö. (2021). The effect of customers' attitudes towards chatbots on their experience and behavioral intention in Turkey. Interdisciplinary Description of Complex Systems, 19(3), 420-436.
29. Hair, J. F., Risher, J. J., Sarstedt, M., & Ringle, C. M. (2019). When to use and how to report the results of PLS-SEM. European Business Review, 31(1), 2-24.
30. Henseler, J., Hubona, G., & Ray, P. A. (2016). Using PLS path modeling in new technology research: Updated guidelines. Industrial Management & Data Systems, 116(1), 2-20.
31. Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43(1), 115-135.
32. Hong, I. B., & Cha, H. S. (2013). The mediating role of consumer trust in an online merchant in predicting purchase intention. International Journal of Information Management, 33(6), 927-939. https://doi.org/10.1016/j.ijinfomgt.2013.08.007
33. Ibrahim, F., Münscher, J.-C., Daseking, M., & Telle, N.-T. (2025). The technology acceptance model and adopter type analysis in the context of artificial intelligence. Frontiers in Artificial Intelligence, 7.
34. Ismatullaev, U. V. U., & Kim, S. H. (2024). Review of the factors affecting acceptance of AI-infused systems. Human Factors, 66(1), 126-144. https://doi.org/10.1177/00187208211064707
35. Jager, J., Putnick, D. L., & Bornstein, M. H. (2017). II. More than just convenient: The scientific merits of homogeneous convenience samples. Monographs of the Society for Research in Child Development, 82(2), 13-30.
36. Jo, H. (2022). Continuance intention to use artificial intelligence personal assistant: Type, gender, and
37. Jyothsna, M., & Kryvinska, N. (2024). Exploring the chatbot usage intention: A mediating role of chatbot initial trust. Heliyon, 10, e33028.
38. Kaabachi, S., Ben Mrad, S., & O'Leary, B. (2019). Consumer's initial trust formation in IOB's acceptance: The role of social influence and perceived compatibility. International Journal of Bank Marketing.
39. Kasilingam, D. L. (2020). Understanding the attitude and intention to use smartphone chatbots for
40. Kelly, S., Kaye, S.-A., & Oviedo-Trespalacios, O. (2023). What factors contribute to the acceptance of artificial intelligence? A systematic review. Telematics and Informatics, 77, 101925.
41. Khanfar, A. A., Kiani Mavi, R., Iranmanesh, M., & Gengatharen, D. (2024). Determinants of artificial intelligence adoption: Research themes and future directions. Information Technology and Management.
42. King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management.
43. Koufaris, M., & Hampton-Sosa, W. (2004). The development of initial trust in an online company by new customers. Information & Management, 41(3), 377-397. https://doi.org/10.1016/j.im.2003.08.004
44. Lam, S. Y., Chiang, J., & Parasuraman, A. (2008). The effects of the dimensions of technology readiness on technology acceptance: An empirical analysis. Journal of Interactive Marketing, 22(4), 19-39.
45. Li, B., Chen, Y., Liu, L., & Zheng, B. (2023). Users' intention to adopt artificial intelligence-based chatbot: A meta-analysis. The Service Industries Journal, 43(15-16), 1117-1139.
46. Li, L., Lee, K. Y., Emokpae, E., & Yang, S.-B. (2021). What makes you continuously use chatbot services? Evidence from Chinese online travel agencies. Electronic Markets, 31, 575-599.
47. Li, X., Hess, T. J., & Valacich, J. S. (2008). Why do we trust new technology? A study of initial trust formation with organizational information systems. The Journal of Strategic Information Systems, 17(1).
48. Liu, M., Yang, Y., Ren, Y., Jia, Y., Ma, H., Luo, J., Fang, S., Qi, M., & Zhang, L. (2024). What influences consumer AI chatbot use intention? An application of the extended technology acceptance model. Journal of Hospitality and Tourism Technology, 15(4), 667-689.
49. Liu, Z., Shan, J., & Pigneur, Y. (2016). The role of personalized services and control: An empirical evaluation of privacy calculus and technology acceptance model in the mobile context. Journal of Information Privacy and Security, 12(3), 123-144.
50. Maduku, D. K., Rana, N. P., Mpinganjira, M., Thusi, P., Mkhize, N. H.-B., & Ledikwe, A. (2024). Do AI-powered digital assistants influence customer emotions, engagement and loyalty? An empirical investigation. Asia Pacific Journal of Marketing and Logistics, ahead-of-print.
51. Makridakis, S. (2017). The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms. Futures, 90, 46-60.
52. Man, S. S., Ding, M., Li, X., Chan, A. H. S., & Zhang, T. (2024). Acceptance of highly automated vehicles: The role of facilitating condition, technology anxiety, social influence and trust. International Journal of Human-Computer Interaction.
53. Mani, Z., & Chouk, I. (2018). Consumer resistance to innovation in services: Challenges and barriers in the Internet of Things era. Journal of Product Innovation Management, 35(5), 780-807.
54. Marikyan, D., & Papagiannidis, S. (2024). Technology acceptance model: A review. TheoryHub Book.
55. Mehta, P., Jebarajakirthy, C., Maseeh, H. I., Anubha, A., Saha, R., & Dhanda, K. (2022). Artificial intelligence in marketing: A meta-analytic review. Psychology & Marketing, 39(11), 2013-2038.
56. Melián-González, S., Gutiérrez-Taño, D., & Bulchand-Gidumal, J. (2021). Predicting the intentions to use chatbots for travel and tourism. Current Issues in Tourism, 24(2), 192-210.
57. Meuter, M. L., Bitner, M. J., Ostrom, A. L., & Brown, S. W. (2005). Choosing among alternative service delivery modes: An investigation of customer trial of self-service technologies. Journal of Marketing, 69(2), 61-83.
58. Mostafa, R. B., & Kasamani, T. (2022). Antecedents and consequences of chatbot initial trust. European Journal of Marketing, 56(6), 1748-1771.
59. Moussawi, S., Koufaris, M., & Benbunan-Fich, R. (2021). How perceptions of intelligence and anthropomorphism affect adoption of personal intelligent agents. Electronic Markets, 31(2), 343-364.
60. Muslichah, M. (2018). The effect of self-efficacy and information quality on behavioral intention with perceived usefulness as intervening variable. Journal of Account Business and Management, 25(1), 21-34.
61. Na, S., Heo, S., Han, S., Shin, Y., & Roh, Y. (2022). Acceptance model of artificial intelligence (AI)-based technologies in construction firms: Applying the Technology Acceptance Model (TAM) in combination with the Technology-Organisation-Environment (TOE) framework. Buildings, 12(2), 90.
62. Norzelan, N. A., Mohamed, I. S., & Mohamad, M. (2024). Technology acceptance of artificial intelligence (AI) among heads of finance and accounting units in the shared service industry. Technological Forecasting and Social Change, 198, 123022.
63. Oliveira, T., Faria, M., Thomas, M. A., & Popovič, A. (2014). Extending the understanding of mobile banking adoption: When UTAUT meets TTF and ITM. International Journal of Information Management, 34(5), 689-703.
64. Pan, L., Luo, H., & Gu, Q. (2025). Incorporating AI literacy and AI anxiety into TAM: Unraveling Chinese scholars' behavioral intentions toward adopting AI-assisted literature reading. IEEE Access.
65. Pillai, R., & Sivathanu, B. (2020). Adoption of AI-based chatbots for hospitality and tourism. International Journal of Contemporary Hospitality Management, 32(10), 3199-3226.
66. Pillai, R., Ghanghorkar, Y., Sivathanu, B., Algharabat, R., & Rana, N. P. (2024). Adoption of artificial intelligence (AI) based employee experience (EEX) chatbots. Information Technology & People, 37(1).
67. Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879-903. https://doi.org/10.1037/0021-9010.88.5.879
68. Portz, J. D., Bayliss, E. A., Bull, S., Boxer, R. S., Bekelman, D. B., Gleason, K., & Czaja, S. (2019). Using the technology acceptance model to explore user experience, intent to use, and use behavior of a patient portal among older adults with multiple chronic conditions: Descriptive qualitative study. Journal of Medical Internet Research, 21(4), e11604.
69. Prakash, A. V., Joshi, A., Nim, S., & Das, S. (2023). Determinants and consequences of trust in AI-based customer service chatbot. The Service Industries Journal, 43(9-10), 642-675.
70. Qatawneh, N., Aljaafreh, A., Allymoun, O., & Aladaileh, R. (2024). Critical success factors influencing the behavioral intention to adopt smart home technologies. IEEE Access.
71. Rafique, H., Almagrabi, A. O., Shamim, A., Anwar, F., & Bashir, A. K. (2020). Investigating the acceptance of mobile library applications with an extended technology acceptance model (TAM). Computers & Education, 145, 103732.
72. Ragheb, M. A., Tantawi, P., Farouk, N., & Hatata, A. (2022). Investigating the acceptance of applying chatbot (artificial intelligence) technology among higher education students in Egypt. International Journal of Higher Education Management, 8(2). https://doi.org/10.24052/ijhem/v08n02/art-1
73. Rajaobelina, L., Prom Tep, S., Arcand, M., & Ricard, L. (2021). Creepiness: Its antecedents and impact on loyalty when interacting with a chatbot. Psychology & Marketing, 38(12), 2339-2356.
74. Rashid, A. B., & Kausik, M. A. K. (2024). AI revolutionizing industries worldwide: A comprehensive overview of its diverse applications. Hybrid Advances, 7, 100277.
75. Sboui, M., Baati, O., & Sfar, N. (2024). Influencing factors and consequences of chatbot initial trust in AI telecommunication services: A study on Generation Z. The TQM Journal.
76. Song, X., Gu, H., Li, Y., Leung, X. Y., & Ling, X. (2024). The influence of robot anthropomorphism and perceived intelligence on hotel guests' continuance usage intention. Information Technology &
77. Tian, W., Ge, J., Zhao, Y., & Zheng, X. (2024). AI chatbots in Chinese higher education: Adoption, perception, and influence among graduate students: An integrated analysis utilizing UTAUT and ECM models. Frontiers in Psychology, 15, 1268549.
78. Topsakal, Y. (2024). How familiarity, ease of use, usefulness, and trust influence the acceptance of generative artificial intelligence (AI)-assisted travel planning. International Journal of Human-Computer Interaction.
79. Van den Broeck, E., Zarouali, B., & Poels, K. (2019). Chatbot advertising effectiveness: When does the message get through? Computers in Human Behavior, 98, 150-157.
80. Van Eeuwen, M. (2017). Mobile conversational commerce: Messenger chatbots as the next interface between businesses and consumers [Master's thesis, University of Twente].
81. Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion into the technology acceptance model. Information Systems Research, 11(4), 342-365.
82. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.
83. Wang, C. L. (2024). Editorial: What is an interactive marketing perspective and what are emerging research areas? Journal of Research in Interactive Marketing, 18(2), 161-165.
84. Zhang, B., Zhu, Y., Deng, J., Zheng, W., Liu, Y., Wang, C., & Zeng, R. (2023). "I am here to assist your tourism": Predicting continuance intention to use AI-based chatbots for tourism. Does gender really matter? International Journal of Human-Computer Interaction, 39(9), 1887-1903.

Note 1: Precedence Research chatbot market report, https://www.precedenceresearch.com/chatbot-market