Cultivating Media Critical Thinking with a Socratic AI Coach: A Methodology
Authors
Ioannis Elissaios Paparigopoulos
Institute of Informatics and Telecommunications, National Centre for Scientific Research (N.C.S.R.) “Demokritos”, GR-153 10, P.O. Box 60228, Aghia Paraskevi, Athens, Greece
Article Information
DOI: 10.47772/IJRISS.2025.910000678
Subject Category: Telecommunications
Volume/Issue: 9/10 | Page No: 8302-8321
Publication Timeline
Submitted: 2025-10-27
Accepted: 2025-11-01
Published: 2025-11-20
Abstract
The increasing sophistication of disinformation necessitates a fundamental shift from reactive fact-checking to proactive cognitive and affective resilience. In today's hybrid digital ecosystem, manipulative content exploits cognitive shortcuts such as motivated reasoning, and emotional vulnerabilities, to bypass analytical scrutiny. Existing interventions often fail due to limited reach, slow response, and the inert skill problem: individuals possess critical thinking skills but fail to apply them in emotionally charged, real-world contexts. This concept paper introduces Media Critical Thinking (MCT), a unified pedagogical framework that integrates Media and Information Literacy (MIL), critical thinking and manipulation-discernment skills, and critical-thinking dispositions into a single practice. MCT is operationalized through the Socratic AI Coach, a chatbot developed in the EU TITAN project and designed not as a truth arbiter but as a ‘media thinking coach’ for users. The coach trains users to recognize the mechanics of manipulation tactics, such as conspiracism, polarization, and discrediting, rather than focusing solely on verifying facts. It uses a Retrieval-Augmented Generation (RAG) architecture and a ‘Tactic Profiling’ methodology to implement dialogical inoculation. Through structured Socratic questioning, the system guides users to evaluate media sources ("WHO") and recognize specific manipulative tactics ("WHAT"). A formative mixed-method pilot study (N = 12) evaluated usability, engagement, and educational impact through focus groups, questionnaires, and reflective discussion. Findings provided initial empirical validation of the Socratic approach, confirming its high relevance for stimulating critical analysis while revealing a key design challenge: balancing user expectations for definitive answers against the tool’s role as a facilitator of reflection.
Overall, the study outlines a scalable model for media education that cultivates durable meta-literacy: the reflective habits essential for autonomous reasoning in complex information environments.
Keywords
Critical Thinking, Media Literacy, Disinformation, AI Chatbot, Socratic Method, Psychological Inoculation