International Journal of Research and Innovation in Social Science

Affective and Cognitive Engagement with Political Conversational AI Agents: Evidence from the Rubens Jr Case

José Jance Marques Grangeiro*

University of Brasília (UnB); Master’s in Law, UnB; Master’s in Strategic Communication, University of Tokyo; lawyer, journalist, and public relations specialist.

*Corresponding author

DOI: https://dx.doi.org/10.47772/IJRISS.2025.913COM0025

Received: 05 June 2025; Accepted: 09 June 2025; Published: 08 July 2025

ABSTRACT

This experimental study aimed to evaluate the affective and cognitive engagement of users when interacting with a conversational artificial intelligence (AI) agent using a humanized voice—specifically, that of Brazilian congressman Rubens Pereira Jr. Twenty participants were divided into two groups: one familiar with the politician and another without prior familiarity. During five minutes of free interaction with the AI agent, neurophysiological data were collected via a portable EEG device (Brainlink Dual), recording signals across delta, theta, alpha, beta, and gamma bands. The findings reveal that the humanized voice produced a significant increase in high-beta and low-gamma activity—markers of cognitive engagement—especially among younger participants and those unfamiliar with the politician. Conversely, participants who recognized the voice exhibited greater theta and low-alpha activation, indicating higher affective engagement and memory recall. Gender differences were also observed: women showed higher affective activation, while men displayed greater cognitive focus. The results suggest that voice personalization in AI agents fosters emotional bonds and enhances attention, making it a promising strategy for public communication, political marketing, and personalized education. The study advances the field by integrating continuous EEG data with qualitative self-reports, offering a multidimensional perspective on human-AI interaction. In practical terms, humanizing AI agents with familiar and expressive voices can strengthen user trust, empathy, and engagement, including among older participants. These findings highlight the value of human-like voice strategies for more effective digital communication and citizen engagement.

Keywords: neuroscience, artificial intelligence, political marketing, affective engagement, cognitive engagement.

LITERATURE REVIEW

Conversational agents powered by artificial intelligence (AI) have gained significant relevance in digital communication and marketing strategies, with applications ranging from customer service and mental health to education and entertainment. As new AI tools emerge, they drive innovative applications in communication, reflecting a profound shift in communication theories. Technologies are no longer seen merely as intermediaries between human subjects but are increasingly recognized as autonomous communicative agents themselves [1].

A growing number of these tools employ advanced natural language processing (NLP), machine learning, and speech recognition techniques to simulate increasingly realistic human interactions [2]. The use of synthetic voices based on real people—especially public figures such as politicians—adds an extra layer of identification and affective connection, strengthening emotional bonds between users and AI agents [3]. Schuetzler et al. have demonstrated that the presence of a recognizable voice enhances perceptions of humanization, reinforcing affective ties through empathy, trust, and familiarity [4].

A key framework for understanding this phenomenon is Media Naturalness Theory, which suggests that the closer a conversational agent comes to mimicking face-to-face interaction—including elements like intonation and vocal expressiveness—the greater the user’s engagement. Empirical research, such as that of Chandra et al. and other recent studies, confirms that personalized synthetic voices, especially those derived from individuals known to the target audience, amplify both affective and cognitive engagement during interactions [3,5,6].

But can this phenomenon be measured using neuroscientific tools? To what extent is such engagement reflected when the emulated subject is a political figure? These are the two main questions addressed in the present study, which analyzes the results of an experiment measuring brainwave activity during interactions with an artificial intelligence agent emulating the voice of a federal congressman.

Indeed, the application of neuroscientific technologies, such as electroencephalography (EEG), has proven to be an efficient tool for real-time assessment of the emotional and cognitive impact of these interactions. Previous studies have used this method to precisely measure different types of engagement during human–AI interactions [7,8]. Other researchers have applied EEG technology to identify specific changes in brain activity associated with empathy and emotional engagement, which are fundamental for understanding how users respond to AI agents [9,10].

To clarify what this study intends to explore, it is useful to revisit the literature and highlight some key categories, such as cognitive and affective engagement. The former refers to the mental effort, attention, memory retrieval, and problem-solving disposition that users mobilize when interacting with technologies, especially conversational agents based on artificial intelligence. For example, Wang et al. have demonstrated that the ability of an agent to understand and respond to the specific needs of its interlocutors positively influences the depth of interaction, promoting more elaborate reasoning and more active participation in problem-solving [11].

From a neurophysiological perspective, this engagement is reflected in the predominance of beta waves (13–30 Hz), and to a lesser extent, gamma waves (>30 Hz), particularly in the frontal regions of the brain. These frequencies are associated with active attention, working memory, and problem-solving effort [12,13]. Beta band activation directly reflects alertness and mental concentration, serving as a strong indicator of cognitive engagement in tasks involving language and critical analysis. Gamma activity—although more difficult to detect with low-density EEG devices—can also be linked to information integration and pattern recognition [14].

Affective engagement, in turn, relates to the emotional aspects of interaction, including feelings of empathy, trust, motivation, and a subjective sense of connection with the agent. Vocal familiarity triggers auditory recognition processes and autobiographical memory recall, often mediated by alpha (8–12 Hz) and theta (4–7 Hz) wave activity [15,16]. A reduction in alpha power in the frontal and temporal regions is a typical marker of emotional involvement and affective attention, while increased theta activity indicates episodic memory mobilization and empathic processing [17].
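To make these frequency-band definitions concrete, the sketch below shows one conventional way to estimate per-band power from a raw EEG signal, using Welch’s method. This is an illustrative computation, not the pipeline used in this study; the band edges follow the definitions above, while the 50 Hz gamma ceiling and the 512 Hz sampling rate are assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

# Band edges in Hz, following the definitions in the text. The upper
# gamma bound (50 Hz) is an illustrative assumption, since gamma is
# only bounded below (>30 Hz).
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 7),
    "alpha": (8, 12),
    "beta": (13, 30),
    "gamma": (30, 50),
}

def band_powers(signal, fs):
    """Estimate absolute power per band from one EEG channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)  # 2-second segments
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the power spectral density over the band.
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers

# Example with synthetic data: five minutes of one channel at 512 Hz.
fs = 512
eeg = np.random.randn(5 * 60 * fs)
print(band_powers(eeg, fs))
```

In practice, relative band power (each band divided by total power) is often preferred, since absolute EEG amplitudes vary widely across individuals and electrode placements.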

Studies such as Gazzaniga et al. [13] indicate that the recognition of familiar voices activates neural circuits associated with self-identity and social belonging. This phenomenon is frequently accompanied by frontal theta oscillations and sustained parietal alpha reduction, suggesting emotional comfort and activation of social memory. Based on this, we hypothesized that participants already familiar with the congressman would exhibit these patterns during interaction, reflecting both affective engagement and the activation of positive personal memories [13].

Therefore, by integrating EEG data with qualitative reports of interaction, it is possible to identify distinct neuroelectric signatures for both types of engagement [13,17]. Beta and gamma activity reveal mental effort and logical reasoning during interaction with the agent’s content, while alpha and theta activity indicate emotional connection, voice recognition, and subjective appreciation of the experience. The combination of these patterns suggests that conversational agents employing voices of well-known public figures can simultaneously enhance both cognitive and affective engagement, increasing the impact of the interaction.

From this brief literature review, it becomes clear that the effectiveness of AI-based conversational agents cannot be assessed solely by subjective satisfaction or usability metrics. Instead, direct observation of neurophysiological indicators is required to reveal how users engage—both cognitively and emotionally—with these systems. Given that the literature highlights the central role of personalization, responsiveness, and human-like voice in intensifying engagement [3,4], the use of electroencephalography (EEG) emerges as an appropriate methodological strategy. EEG allows for the identification of brain activation patterns associated with both cognitive effort and affective response during interactions with AI agents.

The following section presents the experimental design conducted to investigate these aspects, considering the differences in the user experience of interacting with a conversational agent using the real voice of a parliamentarian—either previously known or unknown to the participants.

METHODOLOGY

This study adopted an exploratory experimental design to evaluate the impact of using a humanized voice in digital conversational agents, specifically analyzing the influence of reproducing the voice of federal congressman Rubens Pereira Jr. on participants’ affective and cognitive engagement. The research was conducted between May 12 and 19, 2025, in a controlled environment, using an electroencephalography (EEG) device to collect real-time neurophysiological data.

Twenty individuals participated in the research, divided into two equal groups of ten participants each. The first group consisted of individuals who were already familiar with federal congressman Rubens Pereira Jr.; the second group included participants with no prior knowledge of him. To ensure representativeness, balance, and diversity, each group was equally divided between men and women, covering participants aged 18–25, 26–35, 36–45, 46–60, and over 65.

The conversational agent used in the experiment was created through the ElevenLabs platform[1], based on a synthetic voice trained with real samples from congressman Rubens Pereira Jr. This agent was configured to perform empathetic, active, and adaptive interactions, offering personalized courses on demand according to each user’s interests. Communication with the agent occurred via natural speech in Portuguese, with content organized into short thematic modules.

Each participant completed the experiment individually. Before the interaction began, the researcher positioned the Brainlink Dual EEG device on the participant’s head. This is a portable device with two channels, dry sensors, and Bluetooth connection, capable of recording brain signals in the alpha (8–12 Hz), beta (13–30 Hz), theta (4–7 Hz), gamma (>30 Hz), and delta (<4 Hz) frequency bands. The choice of this device was based on its ease of use, accuracy in capturing the relevant frequencies for the study, and suitability for short sessions.

After the EEG was positioned, each participant was informed that they would interact with a digital conversational agent based on artificial intelligence. They were instructed to initiate conversations with the agent on any topic of interest, without limitations regarding creativity or complexity. During the five-minute interaction, the EEG continuously collected brain signals related to attention, concentration, emotions, and memory.
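Since the analysis that follows refers to band activity at specific moments of the five-minute session (for example, theta elevations between seconds 100 and 300), the continuous recording must be reduced to a time-resolved band-power series. The sketch below shows one simple way to do this with non-overlapping windows; the 512 Hz sampling rate and the 2-second window are assumptions for illustration, not the device’s documented processing.

```python
import numpy as np
from scipy.signal import welch

def band_power_series(signal, fs, lo, hi, win_s=2):
    """Power in the [lo, hi) Hz band over consecutive win_s-second
    windows, yielding one value per window to plot against time."""
    step = win_s * fs
    series = []
    for start in range(0, len(signal) - step + 1, step):
        freqs, psd = welch(signal[start:start + step], fs=fs, nperseg=step)
        mask = (freqs >= lo) & (freqs < hi)
        series.append(np.trapz(psd[mask], freqs[mask]))
    return np.array(series)

# Illustration: theta (4-7 Hz) across a simulated 5-minute session.
fs = 512
eeg = np.random.randn(5 * 60 * fs)
theta = band_power_series(eeg, fs, 4, 7)
# With 2-second windows, indices 50-149 cover seconds 100-300.
print(theta[50:150].mean())
```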

After the interaction ended, each participant responded to a brief qualitative questionnaire. The questions aimed to identify: (a) whether they recognized the agent’s voice; (b) the perceived similarity between the agent’s voice and the real voice of congressman Rubens Pereira Jr.; (c) the credibility attributed to the voice; and (d) their perception of the agent’s effectiveness as a tool for public communication and political marketing. These qualitative data allowed for a complementary analysis to the neurophysiological results, providing a broader and more narrative perspective on the impacts of the humanized voice in digital interactions.

The main empirical findings of the experiment are presented below, organized according to brainwave frequency, intergroup differences, and correlations with self-reported qualitative data.

RESULTS

The analysis of neurophysiological data revealed clear patterns of brain activation during the five-minute sessions in which participants interacted with the conversational agent reproducing the voice of federal congressman Rubens Pereira Jr. As illustrated in Figure 1, there was a predominance of delta waves, followed by theta and high-beta waves, throughout most of the experimental session. Although elevated delta activity generally indicates resting states or low cognitive activity, its persistent presence in this context should be interpreted with caution. It may indicate not only lower cortical activity but also automatic processes or a stable emotional condition, as suggested by Sanei and Chambers [12].

On the other hand, theta waves showed recurrent elevations, especially between seconds 100 and 300. These peaks suggest emotional involvement and the activation of episodic memories, supporting the idea that participants may have recognized or been emotionally influenced by the agent’s humanized voice [15]. Such affective engagement was reinforced by simultaneous peaks in alpha waves, particularly low-alpha, which are associated with relaxed alertness, empathy, and attentive listening.

Beta waves, especially in the high-beta range, also showed significant increases at various moments during the interaction, indicating that participants were cognitively active, with heightened attention and reasoning while engaging with the content generated by the agent [12,13]. This specific combination of brainwaves clearly suggests that participants experienced both affective engagement—possibly due to voice recognition and generated empathy—and cognitive engagement, stimulated by the personalized nature of the content.

Among participants who were not previously familiar with the congressman’s voice, there was a significant increase in high-beta (20–30 Hz) and low-gamma (30–50 Hz) waves during the second and third minutes of the interaction. These frequencies are directly associated with sustained attention, logical reasoning, and linguistic processing. The scientific literature confirms that such activities are indicative of active cognitive effort and critical analysis of the presented content [13,12].
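An intergroup contrast of this kind can be tested formally. The sketch below illustrates one reasonable choice, a one-sided Mann–Whitney U test, which suits two independent groups of ten without normality assumptions; the paper does not state which statistical test the authors applied, and the per-participant values below are hypothetical placeholders, not the study’s data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical per-participant mean high-beta (20-30 Hz) power during
# seconds 120-180 of the interaction (placeholder values, not real data).
familiar = np.array([4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 3.7, 4.4, 4.1, 3.9])
unfamiliar = np.array([5.2, 4.9, 5.6, 5.1, 4.8, 5.4, 5.0, 5.3, 4.7, 5.5])

# One-sided test of whether the unfamiliar group shows higher power;
# a rank-based test avoids normality assumptions that n = 10 per group
# cannot support.
stat, p = mannwhitneyu(unfamiliar, familiar, alternative="greater")
print(f"U = {stat:.1f}, p = {p:.4f}")
```

The same comparison applies directly to the gender-segmented contrasts reported below, substituting theta power over the first 150 seconds as the dependent variable.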

In contrast, participants who recognized the voice showed these peaks with less intensity and greater variability, suggesting less cognitive effort and greater fluency in comprehension, likely due to familiarity with the agent’s speech patterns and style. In the group familiar with the voice, there was a greater activation of theta waves (4–7 Hz) and low-alpha waves (8–10 Hz), especially during the first 90 seconds of interaction. The literature associates theta waves with memory recall and affective introspection [15,16], while alpha waves are linked to relaxation and empathic listening.

The simultaneous activation of theta and alpha waves, accompanied by an early increase in attention levels, suggests a phenomenon of “autobiographical echo,” in which neural networks related to personal memory and social recognition are activated [14]. This reinforces the hypothesis that familiar voices can facilitate mental receptivity, reduce cognitive and emotional barriers, and increase trust in digital agents. In addition, studies such as that by Habicht et al. [18] indicate that agent personalization and the use of empathetic intonation accelerate emotional engagement and reduce the number of interactions required to generate a meaningful bond.

The qualitative responses from participants reinforced these neurophysiological findings. Many reported feeling more engaged, welcomed, and interested in the conversation, particularly emphasizing the humanized voice of the agent as a central element in building this bond. This positive response was observed even among older participants, who are traditionally expected to show lower technology adoption, suggesting that empathetic artificial intelligence, with a recognizable voice and accessible language, can be a powerful tool for inclusion and engagement among the elderly.

The segmental analysis by gender also revealed differences: women, on average, showed greater activation in the theta band, indicating more intense emotional involvement, especially during the first 150 seconds of interaction. Men, on the other hand, exhibited more pronounced peaks in high-beta, suggesting greater cognitive effort or focus on understanding the informational content. These differences are consistent with previous findings that indicate faster and more intense affective responses among women in interactions mediated by a humanized voice [18].

DISCUSSION

The analysis of neurophysiological data obtained via EEG during the interactions with the humanized conversational agent revealed robust patterns of affective and cognitive engagement among participants, with relevant variations according to gender, age group, and prior recognition of the voice of federal congressman Rubens Pereira Jr.

Overall, participants showed significant activation in the high-beta (20–30 Hz) and low-gamma (30–50 Hz) bands, especially between the second and third minute of interaction. These frequency bands are associated with sustained attention, logical reasoning, and language integration, serving as clear evidence of active cognitive engagement [13,12]. This indicates that the content generated by the agent—personalized, contextual, and responsive—was able to mobilize users for an intellectually elaborate interaction.

At the same time, there was strong activation in the theta (4–7 Hz) and low-alpha (8–10 Hz) bands, particularly among participants who recognized the agent’s voice as similar to the parliamentarian’s. The theta band is related to access to episodic memory and emotional introspective states [16], while the alpha band is traditionally associated with empathic listening and the formation of affective bonds, suggesting that the presence of a familiar and humanized voice was effective in provoking affective engagement, favoring states of comfort and emotional proximity with the agent [14].

Qualitative data obtained through the questionnaire reinforce this finding: 90% of participants who declared that they recognized the agent’s voice also attributed high credibility to the communication and reported a sense of welcome and empathy. This positive affective response is compatible with the studies of Dev and Camp [19], which demonstrate that the use of natural and expressive voices in digital agents significantly increases the perception of social presence and conversational relevance.

The gender-segmented analysis showed that, on average, women presented higher activation in the theta band, indicating more intense emotional involvement, particularly during the first 150 seconds of interaction. Men, in turn, exhibited more pronounced peaks in high-beta, which suggests greater cognitive effort or a focus on understanding the informational content. These differences are consistent with previous findings indicating faster and more intense affective responses among women in interactions mediated by humanized voice [18].

CONCLUSION

This study aimed to investigate the impact of using a humanized and recognizable voice—in this case, the voice of federal congressman Rubens Pereira Jr.—in interactions with digital conversational agents, focusing on the measurement of affective and cognitive engagement through electroencephalography (EEG). The results indicate that integrating a familiar vocal identity into the AI agent has significant effects on the quality of the interaction experience, both from a neurophysiological and a subjective perspective.

Pronounced activation in the high-beta and gamma bands throughout the interaction indicated that participants were cognitively engaged, mobilizing attention, language, and analytical thinking [13,12]. This type of response was especially evident among younger users (26–35 years old) and among those who did not previously know the congressman, signaling that the personalized and responsive nature of the agent’s content is sufficient to capture interest and stimulate cognition, even in the absence of a prior affective bond.

On the other hand, participants who recognized the agent’s voice showed intensified patterns in the theta and low-alpha bands, associated with autobiographical memory, empathic listening, and emotional involvement [16,15]. These findings confirm that vocal familiarity not only generates greater comfort and trust but also enhances the perception of social presence and the sense of relational bonding with the agent [14,19].

From a methodological standpoint, this study advances the field by integrating continuous neurophysiological data with qualitative self-reports, offering a rich and multidimensional view of the experience of interacting with AI.

In practical terms, the findings suggest that strategic humanization of voice in AI agents—especially when anchored to public figures with whom the audience already has some bond or positive image—is a promising resource for contexts such as public communication, education, political marketing, and citizen services. By combining affective familiarity with cognitive responsiveness, the conversational agent goes beyond the role of a mere information provider and becomes a relational mediator—a channel for listening, guidance, and influence.

This experiment appears to confirm that vocal personalization of conversational agents is more than an aesthetic enhancement: it is an evidence-based engagement strategy that can be objectively measured and leveraged to foster more meaningful and effective digital relationships. For future studies, it is recommended to expand the sample size, introduce additional behavioral metrics, and employ multichannel EEG to deepen the analysis of the brain dynamics involved in these interactions.

Regarding age groups, the data indicate that participants over 60 years old exhibited the highest combined levels of theta and high-beta, suggesting a more intense simultaneous affective and cognitive engagement than other groups. The sustained activation of these bands throughout the interaction reveals not only attention and mental effort, but also the evocation of memories and emotional involvement.

Qualitative reports corroborate this finding: elderly participants reported feeling more connected, welcomed, and interested in the conversation, highlighting the humanized voice of the agent as a central element in building this bond. This positive response is surprising, as it challenges the recurring expectation of low technology adoption among this audience, and suggests that empathetic artificial intelligence—with a recognizable voice and accessible language—may represent a powerful tool for inclusion and engagement among the elderly population.

Based on these findings, it can be concluded that agent voice personalization—especially when recognized—has a positive and significant effect on the quality of interaction, both emotionally and rationally. The use of a humanized voice, in this case specific and politically identifiable, reinforces trust, empathy, and attention, favoring more effective public communication strategies.

REFERENCES

  1. Araújo, T. (2020). Conversational Agent Research Toolkit: An alternative for creating and managing chatbots for experimental research. Communication and Cognition Review, 2(1), 35–51. https://doi.org/10.5117/CCR2020.1.002.ARAU
  2. Bhatia, P., & Sisodia, S. (2024). Conversational AI: Enhancing Customer Engagement and Support. In Proceedings of the International Conference on Artificial Intelligence (pp. 36–44). https://doi.org/10.58532/v3bflt6p1ch405
  3. Chandra, S., Ranade, A., & Srivastava, S. C. (2022). To Be or Not to Be…Human? Theorizing the Role of Human-Like Competencies in Conversational Artificial Intelligence Agents. Journal of Management Information Systems, 39(4), 969–1005. https://doi.org/10.1080/07421222.2022.2127441
  4. Schuetzler, R. M., Grimes, G. M., & Giboney, J. S. (2018). An Investigation of Conversational Agent Relevance, Presence, and Engagement. In Americas Conference on Information Systems. https://dblp.uni-trier.de/db/conf/amcis/amcis2018.html#SchuetzlerGG18
  5. Oh, Y. H., Chung, K., & Ju, D. Y. (2020). Differences in Interactions with a Conversational Agent. International Journal of Environmental Research and Public Health, 17(9), 3189. https://doi.org/10.3390/IJERPH17093189
  6. Xygkou, A., Ang, C. S., Siriaraya, P., Kopecki, J. P., Covaci, A., Kanjo, E., & She, W. J. (2024). MindTalker: Navigating the Complexities of AI-Enhanced Social Engagement for People with Early-Stage Dementia. In Proceedings of the ACM on Human-Computer Interaction. https://doi.org/10.1145/3613904.3642538
  7. Choi, G. Y., Shin, J. G., Lee, J., & Park, J. Y. (2022). EEG Dataset for the Recognition of Different Emotion States. Data in Brief, 42, 108306. https://doi.org/10.1016/j.dib.2022.108306
  8. Kumar, M., Delaney, C., & Krusienski, D. J. (2022). Estimation of Affective States in Virtual Reality with Portable EEG Headsets. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 30, 1431–1440. https://doi.org/10.1109/TNSRE.2022.3182167
  9. Alimardani, M., Hermans, A., Lim, A., & Doya, K. (2020). Assessment of Empathy in an Affective VR Environment Using EEG. Frontiers in Human Neuroscience, 14, 228. https://doi.org/10.3389/fnhum.2020.00228
  10. Staffa, M., & D’Errico, L. (2022). EEG-Based Machine Learning Models for Emotion Recognition. Sensors, 22(6), 2200. https://doi.org/10.3390/s22062200
  11. Wang, K., Pan, Z., & Lu, Y. (2024). From general AI to custom AI: the effects of generative conversational AI’s cognitive and emotional conversational skills on user’s guidance. Kybernetes. https://doi.org/10.1108/k-04-2024-0894
  12. Sanei, S., & Chambers, J. A. (2022). EEG Signal Processing and Machine Learning (2nd ed.). John Wiley & Sons.
  13. Gazzaniga, M. S., Ivry, R. B., & Mangun, G. R. (2014). Cognitive Neuroscience: The Biology of the Mind (4th ed.). W. W. Norton & Company.
  14. Fields, R. D. (2020). Electric Brain: How the New Science of Brainwaves Reads Minds, Tells Us How We Learn, and Helps Us Change for the Better. BenBella Books.
  15. Mulert, C., & Lemieux, L. (Eds.). (2022). EEG–fMRI: Physiological Basis, Technique, and Applications (2nd ed.). Springer Nature. https://doi.org/10.1007/978-3-031-07121-8
  16. Dickter, C. L., & Kieffaber, P. D. (2014). EEG Methods for the Psychological Sciences. SAGE Publications.
  17. Panigrahi, N., & Mohanty, S. P. (2023). Brain Computer Interface: EEG Signal Processing. CRC Press. https://doi.org/10.1201/9781003241386
  18. Habicht, J., McFadyen, J., Harper, R., Hauser, T. U., & Rollwage, M. (2024). AI-Enabled Conversational Agent Improves Treatment Outcomes and Patient Engagement in 1:1 Cognitive-Behavioral Therapy (CBT): A Real-World Observational Study. PsyArXiv Preprints. https://doi.org/10.31234/osf.io/byz4c
  19. Dev, J., & Camp, L. J. (2020). User Engagement with Chatbots: A Discursive Psychology Approach. In Proceedings of the ACM on Conversational User Interfaces. https://dblp.uni-trier.de/db/conf/cui/cui2020.html#DevC20
  20. Available at: https://elevenlabs.io/app/talk-to?agent_id=agent_01jvsbc1ppf55vs3zpe0fcc9cv
