ILEIID 2025 | International Journal of Research and Innovation in Social Science (IJRISS)
ISSN: 2454-6186 | DOI: 10.47772/IJRISS
Special Issue | Volume IX Issue XXV October 2025
The Dialogue Dashboard: Measuring Performance in Academic Talks (D Dash)
*1 Nadia bt Mohd Nawi, 2 Hafizah Ab Hamid, 3 Dr. Che Wan Ida Rahimah Che Wan Ibrahim
1, 2, 3 ELC, PPAL, Universiti Teknologi Terengganu
*Corresponding Author
DOI: https://dx.doi.org/10.47772/IJRISS.2025.925ILEIID000020
Received: 23 September 2025; Accepted: 30 September 2025; Published: 04 November 2025
ABSTRACT
This paper presents the development and implementation of the Dialogue Dashboard, an innovative
assessment tool designed for the English for Occupational Purposes course, specifically targeting the Mock
Meeting evaluation. The tool addresses challenges in consistently and accurately assessing students’ use of
language functions and argumentation skills during academic discussions. Traditional rubrics lacked specificity
and were difficult to apply in real-time. To overcome these limitations, the Dialogue Dashboard integrates
language function tracking with argumentation rubrics through two complementary tables: one monitoring
language function usage and the other capturing detailed qualitative performance comments. The methods
involve educators listing students and relevant language functions in a structured table format, marking
observed usage in real-time during discussions. Simultaneously, detailed comments on student participation
and communication skills are recorded to support transparent and fair grading. This dual-table approach
simplifies the evaluation process, reduces cognitive load on instructors, and enhances assessment accuracy.
The impact of the Dialogue Dashboard is significant. It promotes more objective, consistent, and efficient
evaluation of communicative competence in simulated professional contexts. Educators benefit from clearer
evidence of student performance, enabling improved feedback and alignment with instructional goals. Students
gain clarity on assessment criteria, leading to better preparation and more focused engagement in discussions.
Overall, the Dialogue Dashboard exemplifies a practical, adaptable solution for advancing language and
critical thinking assessment in higher education.
Keywords: Dialogue Dashboard, assessment, language function, method, academic talks
INTRODUCTION
Effective communication is fundamental to 21st-century education; nonetheless, the evaluation of students'
linguistic and argumentative abilities in academic discourse remains inconsistent. English for Occupational
Purposes is a subject specifically designed for workplace communication.
One of the assessments used to achieve this objective is the Mock Meeting. This innovation therefore develops a tool that integrates language function assessment with argumentation rubrics to provide fairer, clearer, and more effective evaluation in higher education, fostering confident communication, critical thinking, and engaging discussions. The sample comprised final-year students taking English for Occupational Purposes.
Accordingly, the Dialogue Dashboard was developed for the English for Occupational Purposes assessment known as
the Mock Meeting. The aim of this evaluation was to assess students' competency in speaking within a
specified context. Multiple rubrics were included to facilitate the evaluation process; however, it was found
that verifying the functions they used during the discussion was quite challenging. Consequently, the Dialogue
Dashboard was created to enable a clearer and easier assessment of the use of language functions in simulated
meeting environments.
Problem Statement
Assessment tools in academic discussions often exhibit a limited focus on specific language functions such as
clarifying, agreeing, disagreeing, and summarizing. This oversight results in incomplete evaluations of
students’ communicative abilities, as crucial aspects of interaction are neglected. Additionally, current
assessment methods show significant inconsistency when evaluating students' speech communication and
reasoning during academic conversations, leading to unreliable and potentially inaccurate outcomes (Furtak &
Ruiz-Primo, 2008; Bheda, 2022).
Furthermore, existing evaluation forms lack appropriately detailed descriptors, making it difficult for
instructors to observe and assess these language functions effectively while simultaneously listening to ongoing
discussions. This gap creates challenges in real-time assessment, hindering educators’ ability to provide
accurate, immediate feedback. Addressing these issues requires the development of more focused, consistent,
and user-friendly assessment frameworks that facilitate the practical evaluation of language use in dynamic
academic settings (Furtak & Ruiz-Primo, 2008; Bheda, 2022).
Objectives
Below are the objectives of the innovation:
1. To develop an innovative evaluation tool that integrates language function assessment with
argumentation rubrics.
2. To address inconsistencies in the assessment of students’ language and argumentation skills in academic
discussions.
3. To create a user-friendly evaluation form for use during assessments.
METHODOLOGY
Product Description
The Dialogue Dashboard was created during the Mock Meeting assessment for the English for Occupational Purposes course. The tool, designed to assess students' speaking skills in a structured, real-world setting, tackles issues with several initial rubrics that made it challenging and time-consuming to track language functions during discussions (Politis et al., 2023; Tauchid, Putri, & Suwandi, 2024). The Dialogue Dashboard provides a straightforward and effective way to monitor and evaluate language function use in simulated meeting settings, which helps to expedite this process (Gao, Roever, & Lau, 2024). The innovation is based on the two-stage Mock Meeting Assessment, which consists of a group discussion and an individual presentation.
The latter stage is where it is most important to evaluate students' participation and use of language functions. The Dialogue Dashboard offers specialised evaluation forms with two distinct but complementary tables: Table 1 (the Dashboard Dialogue) for systematic tracking of language function use and Table 2 (the Comment Board) to gather qualitative feedback on student performance. This is because the previous rubric was general and lacked specificity in certain aspects (Politis et al., 2023). The instructor uses these tables simultaneously during assessments for different but related reasons, which improves the evaluation process's efficacy, fairness, and clarity (Zhai & Wibowo, 2023).
Procedure
The Dialogue Dashboard was developed and implemented in the English for Occupational Purposes course to
improve the way students' speaking abilities were evaluated during the Mock Meeting evaluation, which is a
structured performance-based assessment that mimics professional communication in the real world. The
Dashboard Dialogue and the Comment Board are two separate but complementary assessment tools that were
developed as part of the methodology. The Dashboard Dialogue consisted of a systematic table that recorded the language functions, such as clarifying, agreeing, or summarizing, that students used during group discussions,
which enabled teachers to annotate these functions in real time. Simultaneously, the Comment Board was
created for qualitative documentation, allowing instructors to document in-depth observations on students'
critical thinking abilities, participation, and communication effectiveness.
Five groups comprising 86 final-year students enrolled in the English for Occupational Purposes course participated in the tool's pilot, and five instructors were interviewed after they had used the tool for their assessments. Data were collected as instructors used the Dashboard to conduct concurrent, immediate assessments during live mock meetings, reducing reliance on memory or post-session evaluation.
qualitative performance observations with quantitative language function usage, this dual-table method
allowed for transparent and consistent grading. Clear descriptors and efficient marking systems also improved
usability, allowing for real-time evaluation without interfering with conversation.
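As an illustration of how the dual-table method could pair quantitative marks with qualitative notes at grading time, the following hypothetical helper tallies each student's ticked functions alongside the instructor's comments. It is a sketch under assumed inputs (a dashboard dict mapping student to function ticks, and a comment_board dict mapping student to notes); the actual rubric weights are not specified in this paper.

    def summarize(dashboard, comment_board):
        """Pair each student's Table 1 tally with their Table 2 notes."""
        summary = {}
        for student, marks in dashboard.items():
            used = [f for f, seen in marks.items() if seen]
            summary[student] = {
                "functions_used": used,
                # Coverage shows how many targeted functions were observed.
                "coverage": f"{len(used)}/{len(marks)}",
                "comments": comment_board.get(student, []),
            }
        return summary

    # Example: summarize({"Student A": {"clarifying": True, "agreeing": False}},
    #                    {"Student A": ["Led the discussion confidently."]})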
This mixed-method approach made it possible to ensure a thorough and accurate evaluation of students' linguistic proficiency as well as their ability to argue in authentic communication situations. The improved feedback helped students become better prepared for workplace communication.
FINDINGS
In the pilot, 78% of students reported finding it easier to understand the criteria when viewing the Dialogue
Dashboard before their presentation. Additionally, 3 out of 5 instructors found the evaluation form easier to
use, while 2 preferred more specific descriptors; all agreed that the structured evaluation tables improved the
overall assessment process. The accuracy and consistency of language function assessment in scholarly
discussions were greatly improved by the introduction of the Dialogue Dashboard. Evaluations became more
trustworthy and transparent because of instructors' ability to objectively monitor students' use of critical
communication skills in real time. By offering both quantitative measurements and qualitative insights, the
dual-table system, which also included a comment board, improved the quality of the feedback. Students were
more aware of the evaluation criteria as a result, and this was linked to higher levels of confidence and
participation. The Dashboard's incorporation of argumentation rubrics further encouraged the growth of the
critical thinking abilities necessary for effective communication in the workplace. Its versatility for both
manual and digital use makes the Dashboard a practical and scalable tool in higher education settings.
However, some instructors also mentioned the need to make it more detailed for future use.
Figure 1 The Dialogue Dashboard Table
Figure 2 The Comment Board
Figure 3 Findings

Participants | Total Number | Finding | Number/Count
Students | 86 | Found it easier to prepare for assessment using the Dialogue Dashboard | 78
Instructors | 5 | Found having the evaluation form easier | 3
Instructors | 5 | Felt the evaluation form needed more specific descriptors | 2
NOVELTY AND RECOMMENDATIONS
This innovation offers a novel, integrated approach that combines language use evaluation with argumentation
skill assessment in a single user-friendly format. It stands out by:
- Bringing together linguistic and critical thinking evaluation in real-time academic discourse
- Providing both structured and flexible scoring methods
- Being adaptable for both manual and digital use (e.g., Google Forms, Socrative)
For future research, it is suggested that these tables be refined by adding specific descriptors for language forms and functions, and that they be adapted for digital use; a minimal sketch of one possible digital adaptation follows.
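The sketch below exports the dashboard to a CSV file that could seed an online form or spreadsheet. The file name, column layout, and input structure are assumptions for illustration; the paper does not publish a digital format.

    import csv

    def export_dashboard(dashboard, path="dialogue_dashboard.csv"):
        # One row per student, one column per language function; ticked
        # functions are written as "X" so the sheet mirrors the paper form.
        functions = sorted({f for marks in dashboard.values() for f in marks})
        with open(path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["Student"] + functions)
            for student, marks in dashboard.items():
                writer.writerow(
                    [student] + ["X" if marks.get(f) else "" for f in functions])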
ACKNOWLEDGEMENTS
Thank you to all members and respondents, both students and instructors.
REFERENCES
1. Bheda, D. (2022, August 4). What are we doing wrong in assessment?
ExamSoft. https://examsoft.com/resources/limitations-current-assessment-practices-solutions/
2. Furtak, E. M., & Ruiz-Primo, M. A. (2008). Comparing written responses to classroom discussion responses: Thoroughness and logic in student thinking. Foundations of Education and Instructional Assessment. https://socialsci.libretexts.org/Bookshelves/Education_and_Professional_Development/Foundations_of_Education_and_Instructional_Assessment_(Kidd_et_al.)/17:_Instructional_Assessment-/Assessment_Strategies/17.03:_How_can_classroom_discussions_be_used_for_assessment
3. Politis, Y., Clemente, I., Lim, Z., & Sung, C. (2023). The development of the conversation skills assessment tool. Autism & Developmental Language Impairments, 8, Article 23969415231196063. https://doi.org/10.1177/23969415231196063
4. Tauchid, A., Putri, N. V. W., & Suwandi, E. (2024). Remote speaking tasks: Amplifying understanding, proficiency, and confidence in English learners. ETERNAL: English Teaching Journal, 15(2), 229-250. https://doi.org/10.26877/eternal.v15i2.489
5. Gao, X., Roever, C., & Lau, J. (2024). Innovations in language function assessment: Streamlining evaluation processes. Journal of Applied Linguistics, 18(1), 45-62.
6. Zhai, C., & Wibowo, S. (2023). A systematic review on artificial intelligence dialogue systems for enhancing English as foreign language students' interactional competence in the university. Computers and Education: Artificial Intelligence, 4, 100134. https://doi.org/10.1016/j.caeai.2023.100134