ILEIID 2025 | International Journal of Research and Innovation in Social Science (IJRISS)
ISSN: 2454-6186 | DOI: 10.47772/IJRISS
Special Issue | Volume IX Issue XXV October 2025
A Computerized Dynamic Assessment Platform for EFL Listening
Comprehension
1,3Song Jiao, 1Shaidatul Akma Adi Kasuma, 2Amir Panahi Noor, 3Wang Liheng
1School of Languages, Literacies and Translation, Universiti Sains Malaysia
2Razi University
3North China University of Science and Technology
DOI: https://dx.doi.org/10.47772/IJRISS.2025.925ILEIID000035
Received: 23 September 2025; Accepted: 30 September 2025; Published: 05 November 2025
ABSTRACT
Traditional assessments in EFL listening primarily report scores without revealing learners’ specific
difficulties, leaving students uncertain about how to improve and teachers unable to adjust instruction.
Dynamic Assessment (DA) addresses this gap by combining assessment with mediation, yet its traditional one-on-one format is impractical for large classrooms.
This project introduces a Computerized Dynamic Assessment (CDA) platform designed for individualized instruction in large-scale EFL contexts. The platform delivers graduated, pre-scripted manual hints in multiple modalities (text, audio, image, and video) and generates detailed learning logs that track attempts, hint use, and completion time. These features provide individualized scaffolding for learners and equip teachers with diagnostic insights for targeted instruction.
A quasi-experimental study was conducted with low-proficiency Chinese undergraduates across four intervention sessions. Listening comprehension was measured with pre- and post-tests using CET-4 listening
tests, and platform logs were analyzed for hint trajectories and efficiency. Results showed significant gains in
listening performance, alongside reductions in hint usage and task completion time, indicating both enhanced
comprehension and efficiency.
The CDA platform demonstrates that individualized mediation can be scaled to large classrooms without
overburdening teachers. By combining DA principles with multimodal support and learning analytics, the
platform offers a practical and innovative solution for EFL listening education. Future research may explore its
long-term effects and applicability across different learner groups and contexts.
Keywords: Computerized Dynamic Assessment (CDA), EFL listening, Graduated Prompts, Learning
Analytics
INTRODUCTION
Traditional EFL listening assessments typically consist of test papers and scores, offering little diagnostic information on where learners struggle (Kao & Kuo, 2023). As a result, students’ progress is often limited because they are unaware of their problems. In large, test-driven classrooms, such static assessment often leads to passive learning, with students unable to actively engage in listening activities (Dogani, 2023; Hidri, 2014).
Dynamic Assessment (DA) has been proposed as an alternative in second language learning (Lantolf &
Poehner, 2014). By integrating assessment with instruction, DA promotes learner development through
mediation tailored to individual abilities (Zhang, 2023). However, DA is typically carried out in one-on-one
teacher-student interactions, which are time-consuming and impractical for large classrooms (Izadi et al., 2024;
Yang & Qian, 2020).
Computerized Dynamic Assessment (CDA) addresses this challenge by enabling large-scale implementation
through computer systems (Pileh Roud & Hidri, 2021; Poehner et al., 2015). However, most CDA platforms still rely on limited question formats or provide only fixed feedback, offering little flexibility for teachers to adapt tasks (Zeng, 2020). The present platform improves upon these limitations by incorporating multimodal tasks, multimodal hint support, and a multi-function dashboard. These innovations ensure that scaffolding remains consistent with DA principles while being feasible for large classroom contexts.
Objectives
1. Provide hints when students get stuck, creating an adaptive learning environment and encouraging
active problem-solving to enhance listening comprehension.
2. Generate detailed learning logs and result reports to support teacher diagnosis, enabling timely
adjustments to instruction and materials.
Product Description
In practice, the CDA platform provides an integrated environment for both teachers and students. Teachers
create listening tasks by uploading audio or video files, writing the question stem, and entering multiple-choice,
matching, or fill-in-the-blank answers. Graduated manual hints are entered by teachers in advance. Once tasks
are deployed, students attempt questions. If they answer incorrectly, the system reveals hints progressing from
implicit to explicit (e.g., from a general reminder such as “listen again” to a specific cue), helping them move
forward without simply giving away the answer. As students work through the tasks, the system records their
attempt history, time spent, and hint usage. Afterward, teachers can access dashboards that summarize each
learner’s performance, identify persistent distractors or skills causing difficulty, and export results for
instructional follow-up.
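As a rough illustration of the graduated-hint flow described above, the Python sketch below shows one way such logic could work: a wrong answer releases the next, more explicit hint, while every attempt is logged. All class, field, and function names are illustrative assumptions made for this example, not the platform’s actual code.

from dataclasses import dataclass, field
from time import time

@dataclass
class Hint:
    level: int      # 1 = most implicit; higher levels are more explicit
    modality: str   # "text", "audio", "image", or "video"
    content: str    # hint text, or the path/URL of a media file

@dataclass
class AttemptLog:
    answers: list = field(default_factory=list)      # every answer submitted
    hints_shown: int = 0                              # graduated hints revealed so far
    started_at: float = field(default_factory=time)   # task start timestamp
    finished_at: float = 0.0                          # set when the item is solved

def submit_answer(hints, correct_answer, answer, log):
    """Record an attempt; on a wrong answer, release the next, more explicit hint."""
    log.answers.append(answer)
    if answer == correct_answer:
        log.finished_at = time()   # completion time = finished_at - started_at
        return {"correct": True, "hint": None}
    if log.hints_shown < len(hints):
        next_hint = hints[log.hints_shown]
        log.hints_shown += 1
        return {"correct": False, "hint": next_hint}
    return {"correct": False, "hint": None}   # all hints exhausted

In the real platform the hints are authored by the teacher in advance and the log is persisted for the dashboard; the sketch only captures the ordering logic of hint release and logging.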
The platform includes the following key functions:
Item creation: Teachers can design questions with audio, image, or video materials, define the number of hints, and indicate the correct answer (an illustrative item configuration is sketched after this list).
Multimodal Support: Hints can be delivered in text, audio, image, or video formats.
Adaptive delivery: Next-level hints are released when a learner answers incorrectly, scaffolding listening
comprehension while preserving learner autonomy.
Time management: Tasks can be time-limited, and hints are displayed only for a controlled duration,
encouraging active processing.
Dashboard and learning logs: Track attempts, chosen answers, time spent, and hint use; teachers can view or
edit items, analyse results, and export reports to guide future teaching.
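To make the functions above concrete, the following is a hypothetical example of how a single teacher-authored item, its graduated multimodal hints, and its time settings might be represented, shown here as a Python dictionary. Every field name and all sample content are assumptions made for illustration only, not the platform’s actual storage format.

# Illustrative item definition: one listening question with graduated,
# multimodal hints and time limits (all values are made-up examples).
example_item = {
    "media": "unit3_dialogue.mp3",             # uploaded audio or video file
    "stem": "What does the woman suggest the man do?",
    "type": "multiple_choice",
    "options": ["A. Take a taxi", "B. Wait for the next bus",
                "C. Walk to the station", "D. Call a friend"],
    "answer": "B",
    "time_limit_seconds": 90,                   # task-level time limit
    "hints": [                                  # graduated: implicit -> explicit
        {"level": 1, "modality": "text",
         "content": "Listen again to the second half of the dialogue."},
        {"level": 2, "modality": "audio",
         "content": "clip_key_sentence.mp3"},   # replays only the key sentence
        {"level": 3, "modality": "text",
         "content": "The woman says the next bus arrives in five minutes."},
    ],
    "hint_display_seconds": 20,                 # each hint is shown only briefly
}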
METHODOLOGY
To evaluate the platform’s effectiveness, a quasi-experimental study was conducted with 32 second-year, low-proficiency Chinese undergraduates whose CET-4 listening scores were below 149 (60% of the 249-point listening total). A CET-4 score below the 425 passing benchmark corresponds to the A2–B1 transition level of the CEFR, indicating low-proficiency EFL learners (Li et al., 2025). Listening comprehension was measured using CET-4 listening tests. Additional data were collected from platform logs.
The intervention consisted of four CDA sessions (40 minutes each) in which traditional listening exercises were replaced with CDA tasks featuring audio-based multiple-choice questions and graduated, pre-scripted manual hints. Pre- and post-tests were administered to assess changes in listening comprehension performance. In addition, log data for each session were analyzed to track the number of hints used, attempts made, and completion time.
Data Analysis involved paired-sample t-tests to compare pre- and post-test listening scores. Descriptive and
trend analyses of the log data examined changes in hint use and efficiency across sessions. Although the
sample size provides a focused view of CDA implementation, findings should be interpreted with caution
regarding their generalizability to broader EFL populations.
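As a minimal sketch of this analysis, assuming the pre- and post-test scores and per-session log summaries are available as simple arrays, the following Python snippet runs a paired-sample t-test (via scipy) and a basic descriptive trend over hint use. All numbers are placeholders, not the study’s data.

import numpy as np
from scipy import stats

# Placeholder arrays standing in for the learners' pre- and post-test scores;
# the study's actual data are not reproduced here.
pre_scores = np.array([112, 98, 120, 105, 131, 99, 110, 124])
post_scores = np.array([125, 110, 128, 118, 140, 108, 121, 133])

# Paired-sample t-test comparing pre- and post-test listening scores.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Descriptive trend across the four sessions: mean hints used per session,
# taken from the platform's learning logs (values are placeholders).
hints_per_session = {1: [3, 4, 2], 2: [2, 3, 2], 3: [2, 2, 1], 4: [1, 2, 1]}
for session, hints in hints_per_session.items():
    print(f"Session {session}: mean hints used = {np.mean(hints):.2f}")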
POTENTIAL FINDINGS AND COMMERCIALISATION
Preliminary findings show that the CDA platform enhances EFL learners’ listening performance. Students scored higher on the post-test, while log data indicated that fewer hints were needed and tasks were completed more
quickly, reflecting gains in both comprehension and efficiency.
From a commercial perspective, the CDA platform directly responds to the needs of large classrooms where
individual feedback is rarely feasible. It also alleviates teachers’ workload by streamlining task design and automating feedback delivery. Its modular architecture allows easy integration into existing Learning
Management Systems (LMS) or use as a stand-alone web-based tool.
Future research employing longitudinal designs is recommended to evaluate the sustainability of learning gains
over time and across diverse educational contexts.
NOVELTY AND RECOMMENDATIONS
Unlike many existing CDA systems that rely on text-only prompts and fixed feedback, the present platform
addresses several long-standing limitations. It introduces multimodal graduated hints (text, audio, image,
video), a flexible backend that enables teachers to edit and configure different hint progressions, and
automated learning logs that capture attempts, hint use, and completion time.
These innovations overcome common CDA shortcomings such as restricted prompt formats, heavy teacher
workload, and lack of actionable diagnostic information. By embedding these functions, the platform advances
CDA beyond simple computer delivery, making it both pedagogically robust and scalable for large EFL
classrooms.
To further validate its educational value, future studies should broaden participant diversity, adopt longitudinal
approaches, and compare CDA with other innovative assessment tools to highlight its relative advantages.
Looking ahead, the model can be extended to other language skills and even other subject areas requiring both
assessment and scaffolding, with future work focusing on LMS integration, richer analytics, and sustainable
deployment.
REFERENCES
1. Dogani, B. (2023). Active learning and effective teaching strategies. International Journal of Advanced Natural Sciences and Engineering Research, 7(4), 136–142.
2. Hidri, S. (2014). Developing and evaluating a dynamic assessment of listening comprehension in an EFL
context. Language Testing in Asia, 4(1), 4. https://doi.org/10.1186/2229-0443-4-4
3. Izadi, M., Izadi, M., & Heidari, F. (2024). The potential of an adaptive computerized dynamic assessment tutor in diagnosing and assessing learners’ listening comprehension. Education and Information Technologies, 29(3), 3637–3661. https://doi.org/10.1007/s10639-023-11871-w
4. Kao, Y.-T., & Kuo, H.-C. (2023). Diagnosing L2 English learners’ listening difficulties and learning needs through computerized dynamic assessment. Interactive Learning Environments, 31(4), 2219–2243. https://doi.org/10.1080/10494820.2021.1876738
5. Lantolf, J. P., & Poehner, M. E. (2014). Sociocultural Theory and the Pedagogical Imperative in L2 Education: Vygotskian Praxis and the Research/Practice Divide (1st ed.). Routledge. https://doi.org/10.4324/9780203813850
6. Li, C., Alwi, N. A. N. M., & Ali, M. (2025). A Comparative Review of the CEFR and CET4 Writing
Assessment with Insights from Task Complexity Theories. Malaysian Journal of Social Sciences and
Humanities, 10(3), e003251. https://doi.org/10.47405/mjssh.v10i3.3251
7. Pileh Roud, L. F., & Hidri, S. (2021). Toward a sociocultural approach to computerized dynamic assessment of the TOEFL iBT listening comprehension test. Education and Information Technologies, 26(4), 4943–4968. https://doi.org/10.1007/s10639-021-10498-z
8. Poehner, M. E., Zhang, J., & Lu, X. (2015). Computerized dynamic assessment (C-DA): Diagnosing L2 development according to learner responsiveness to mediation. Language Testing, 32(3), 337–357. https://doi.org/10.1177/0265532214560390
9. Yang, Y., & Qian, D. D. (2020). Promoting L2 English learners’ reading proficiency through computerized dynamic assessment. Computer Assisted Language Learning, 33(5–6), 628–652. https://doi.org/10.1080/09588221.2019.1585882
10. Zeng, S. (2020). Unlocking Individual Potential in Computerized Dynamic Assessment with an Evidence-driven Menu-based Mediation Mechanism. International Conference on Computer Science and Education, 774–777. https://doi.org/10.1109/ICCSE49874.2020.9201778
11. Zhang, Y. (2023). Promoting young EFL learners’ listening potential: A model of mediation in the framework of dynamic assessment. The Modern Language Journal, 107(S1), 113–136. https://doi.org/10.1111/modl.12824