Psychological Trust and Human-Centric Security in Biometric Authentication: A Multi-Factor Face-Based Voice Assistant System
Authors
Assistant Professor, Rayat Bahra University, Mohali (India)
Article Information
DOI: 10.47772/IJRISS.2026.10190048
Subject Category: Humanities
Volume/Issue: 10/19 | Page No: 543-547
Publication Timeline
Submitted: 2026-01-28
Accepted: 2026-01-30
Published: 2026-02-16
Abstract
With the increasing integration of intelligent voice assistants into domestic environments, concerns regarding privacy, surveillance, and psychological comfort have become as significant as technical security itself. Conventional smart home security systems prioritize algorithmic accuracy while often overlooking how users emotionally perceive constant monitoring. This study redirects attention from purely computational performance toward the psychological dimensions of trust and user agency in biometric authentication. We propose a human-centric, multi-factor authentication framework in which facial recognition functions as an intentional visual initiation before voice assistant activation. This “Face-First” interaction model is grounded in psychological theories of autonomy and agency, emphasizing that security systems are most effective when they align with natural human interaction patterns. By requiring visual acknowledgment before auditory access, the system transforms passive surveillance into an active, user-controlled process. Through qualitative investigation, this research demonstrates that multisensory authentication reduces subconscious anxiety, enhances perceived control, and strengthens relational trust between users and intelligent systems. The findings suggest that biometric security, when designed around human intentionality rather than constant monitoring, can evolve from a mechanical safeguard into a psychologically supportive and trustworthy domestic companion.
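The abstract describes a "Face-First" interaction model in which facial recognition acts as a deliberate visual gate that precedes any voice-assistant listening. As a minimal sketch of how such a gate might be wired, assuming OpenCV's Haar-cascade face detection and treating the recognition and voice-activation steps as hypothetical placeholders (the article does not publish its implementation), the following Python outline keeps the microphone inactive until an enrolled user intentionally faces the device:

```python
# Illustrative sketch of a "Face-First" activation gate; not the authors' code.
# Assumes OpenCV (opencv-python) for face detection; is_enrolled_user() and
# activate_voice_assistant() are hypothetical placeholders for whatever
# biometric matcher and speech stack a real deployment would use.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def user_faces_device(frame) -> bool:
    """Return True only when a face is deliberately presented to the camera."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

def is_enrolled_user(frame) -> bool:
    """Hypothetical placeholder for face *recognition* against enrolled users."""
    raise NotImplementedError("plug in an embedding/matching model here")

def activate_voice_assistant() -> None:
    """Hypothetical placeholder: only now does the microphone start listening."""
    print("Microphone enabled; voice assistant session started.")

def face_first_loop(camera_index: int = 0) -> None:
    """Poll the camera; the microphone stays off until visual initiation succeeds."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Visual acknowledgment before auditory access: both detection and
            # recognition must pass before the assistant is allowed to listen.
            if user_faces_device(frame) and is_enrolled_user(frame):
                activate_voice_assistant()
                break
    finally:
        cap.release()
```

In a real system the placeholder `is_enrolled_user` would be replaced by the deployment's actual face-matching model, and additional safeguards such as liveness detection would likely be required before the microphone is enabled.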
Keywords
Psychological Trust; Human-Centric Security