ILEIID 2025 | International Journal of Research and Innovation in Social Science (IJRISS)
ISSN: 2454-6186 | DOI: 10.47772/IJRISS
Special Issue | Volume IX Issue XXV October 2025
Designing Job Lingua AI: A Conceptual Framework for AI-Enhanced
Interview English in Higher Education
*1Nur Syazwanie Mansor, 2Norlizawati Md Tahir, 3Rafidah Amat, 4Mas Aida Abd Rahim, 5Nor Asni Syahriza Abu Hassan, 6Seehhazzakd Rojanaatichartasakul

1,2,3,4,5 Academy of Language Studies, Universiti Teknologi MARA Cawangan Kedah, Kampus Sungai Petani, 08400, Merbok, Kedah, Malaysia

6 Chulalongkorn University Language Institute, Pathumwan, Bangkok 10330, Thailand
*Corresponding Author
DOI: https://dx.doi.org/10.47772/IJRISS.2025.925ILEIID000044
Received: 23 September 2025; Accepted: 30 September 2025; Published: 05 November 2025
ABSTRACT
The transition from university to workplace demands not only technical knowledge but also advanced
communication skills, particularly in job-seeking contexts. In Malaysia’s public and private sectors,
proficiency in English, especially in resume writing, cover letter development, and job interviews, is an
important determinant of graduate employability. However, existing English for Specific Purposes (ESP)
courses such as LCC502 often lack the personalized and formative feedback necessary to prepare learners for
these high-stakes real-world tasks. This paper presents JobLinguaAI, an AI-supported educational platform
developed to address this gap by integrating AI-powered writing assistance, speech coaching, and interview
simulations within a CEFR B2 High-aligned framework. Grounded in a design-based research approach and
informed by the course outcomes of LCC502, JobLinguaAI supports students in producing professional
documents, improving spoken fluency, and acquiring domain-specific vocabulary through iterative feedback
loops. Preliminary feedback from educators affirms its relevance, clarity, and alignment with course learning
outcomes while also suggesting enhancements such as localized vocabulary and lecturer dashboards. The
platform’s modular design allows for scalability across ESP domains and integration with institutional LMS
platforms. With its potential to transform job readiness pedagogy in Malaysian higher education, JobLinguaAI
offers a novel model for AI-enhanced, task-based language learning that promotes learner autonomy,
performance accuracy, and employability.
Keywords: AI-assisted language learning; English for Specific Purposes (ESP); job interview communication; graduate employability
INTRODUCTION
English language proficiency plays a pivotal role in enhancing employability, particularly in the field of public
administration where effective written and verbal communication is essential for policy implementation,
stakeholder engagement, and service delivery. As Malaysia transitions into a knowledge-driven economy, there
is an increasing demand for graduates who are not only technically competent but also communicatively agile
in workplace settings. Within this context, English for Specific Purposes (ESP) has emerged as a critical
subfield of English language instruction in Malaysian higher education institutions (HEIs), targeting domain-
relevant communication skills tailored to students’ academic and professional disciplines.
ESP courses such as LCC502 (English for Job Interviews) aim to bridge the gap between classroom learning
and professional readiness, focusing on competencies like resume writing, cover letter drafting, and oral
interview communication. However, current approaches often rely on static, instructor-led formats that may
not effectively address learners' individual weaknesses, pronunciation challenges, or lack of exposure to
authentic interview scenarios. This disconnect is particularly evident in the lack of personalized, formative
feedback that aligns with learners’ proficiency levels and job-specific communication needs. Consequently,
there is a pressing need for pedagogical innovations that integrate technology, particularly AI-driven tools, to simulate real-world interactions, provide adaptive feedback, and support learners' development of confidence and linguistic precision in high-stakes settings.
Course Context: LCC502
LCC502 is a compulsory English for Specific Purposes course offered to second-year students in the field of
Administrative Science at Universiti Teknologi MARA. The course is aligned with the Common European
Framework of Reference for Languages, targeting a high B2 proficiency level, and focuses on preparing
students for real-world job application processes. Main learning outcomes include the ability to produce
professional documents such as resumes and cover letters, as well as the capacity to engage in effective verbal
interaction during job interviews. These outcomes reflect the broader aim of enhancing employability through
context-specific language competence.
Despite its structured objectives, a noticeable gap exists between syllabus goals and students’ actual readiness
for workplace communication. While students are introduced to functional writing and interview strategies,
they often lack opportunities to receive immediate, individualized feedback on their performance. Furthermore,
the course relies heavily on conventional classroom activities that may not fully simulate the dynamic and
high-pressure environment of actual job interviews. Research has shown that ESP learners benefit more from
adaptive, task-based learning models supported by technology, particularly when feedback is immediate and
aligned with their disciplinary context (Son et al., 2023; Kamaruddin et al., 2021). Without such tools, students
may complete the course having met formal assessment requirements, yet still feel underprepared for authentic
professional interactions. This disconnect underlines the need for pedagogical enhancement through intelligent
technologies that can close the gap between intended learning outcomes and real-world language performance.
Problem Statement
Although English for Specific Purposes courses such as LCC502 are designed to enhance students'
employability through targeted instruction, there remains a significant disconnect between curriculum
objectives and students’ real-world readiness. Many learners struggle to apply formal language appropriately in
their job-related documents and verbal responses, particularly within the domain of public administration.
Traditional instructional methods often fall short in providing timely, formative feedback that addresses
learners' individual weaknesses in writing mechanics, pronunciation, and fluency. As a result, students may be
able to complete course tasks but still lack the confidence and language control required for successful
performance in high-stakes interview scenarios.
In addition, current classroom practices do not offer sufficient opportunities for repeated practice or
simulation-based learning. This limitation is compounded by a lack of exposure to domain-specific vocabulary
and culturally appropriate communication strategies. Studies have shown that the integration of intelligent
systems into language instruction significantly enhances student engagement, feedback quality, and
communicative competence (Qassrawi et al., 2024; Nguyen et al., 2025). However, such technologies are
rarely integrated into ESP teaching contexts in Malaysian universities, particularly in courses that prepare
students for public-sector roles. These gaps signal the urgent need for an AI-supported, discipline-specific
learning tool that can provide personalized learning pathways and support job readiness across significant
communicative domains.
Objectives
This innovation seeks to achieve the following objectives:
1. To develop a system that delivers real-time AI feedback on job-related documents, focusing on grammar, structure, and professional tone.
2. To enhance students' spoken communication through AI-powered pronunciation and fluency modules simulating real interview conditions.
3. To support the acquisition of domain-specific vocabulary relevant to public administration and job
application contexts.
4. To promote culturally appropriate language use in professional settings through contextualised input and feedback.
5. To align LCC502 course outcomes with authentic communication tasks by embedding CEFR-aligned,
formative learning cycles.
PRODUCT DESCRIPTION & METHODOLOGY
Product Overview
JobLinguaAI is a web-based educational platform developed to address the specific communicative needs of students enrolled in LCC502: English for Job Interviews, particularly those pursuing degrees in Administrative Science. The innovation is designed to scaffold both written and spoken communication in job application contexts, integrating AI-powered feedback mechanisms that align with course learning outcomes and CEFR B2 descriptors. Its primary goal is to provide students with immediate, formative feedback on job-related language use, enabling them to iteratively improve their performance in tasks such as resume writing, cover letter drafting, and job interviews.

Target users are second-year undergraduate students preparing for entry into public or private sector employment, where mastery of administrative language, professional tone, and oral fluency are essential. JobLinguaAI is not intended to replace the role of instructors but to serve as a complementary tool that enhances independent learning and improves the quality of performance prior to summative assessments.
Core Features
Job Lingua AI integrates five core modules to address the multi-dimensional demands of job interview
preparation:
1. NLP-Based Writing Assistant: This module provides real-time feedback on grammar, structure, and
tone for resumes and cover letters. It draws from natural language processing (NLP) tools, similar to
those used in established platforms like Grammarly or Write & Improve, but is tailored specifically to job
application discourse in administrative fields.
2. Glossary Generator for Administrative Terms: Students can upload job postings or course texts, and
the system extracts and defines domain-specific vocabulary. This feature supports vocabulary acquisition through automated corpus referencing and reinforces targeted learning strategies (a minimal extraction sketch follows this list).
3. AI-Powered Speech Coach: Utilizing automatic speech recognition (ASR) technology, this module evaluates pronunciation, fluency, and clarity. Students receive annotated
transcripts and can monitor progress over time. This directly supports speaking-oriented CLOs and
improves interview confidence.
4. Interview Simulation Module: Students engage in mock interviews with AI-generated questions.
Responses are recorded, transcribed, and assessed using CEFR-aligned rubrics. This repeated, low-stakes
practice environment fosters fluency and reduces speaking anxiety.
5. Cultural Comparison Tool: To support intercultural awareness, students receive feedback on cultural
appropriateness, formality levels, and indirect language strategies. This component draws from pragmatic
competence frameworks in ESP instruction.
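The paper describes these modules conceptually rather than at implementation level. As one illustration, the sketch below shows a minimal, hypothetical way the Glossary Generator (module 2 above) could surface candidate administrative terms from an uploaded job posting, using simple frequency filtering against a small general-English wordlist. The function name, stoplist, and sample posting are assumptions for illustration, not part of JobLinguaAI.

# Minimal sketch (assumption): frequency-based extraction of candidate
# domain-specific terms from a job posting. The real JobLinguaAI module
# may rely on corpus referencing or NLP components not described here.
from collections import Counter
import re

# A tiny general-English stoplist; a production system would use a proper
# reference corpus to decide which words are "ordinary" rather than domain terms.
GENERAL_WORDS = {
    "the", "and", "to", "of", "in", "for", "with", "a", "an", "is", "are",
    "will", "be", "on", "as", "or", "by", "we", "you", "our", "your",
}

def extract_glossary_terms(posting_text: str, top_n: int = 10) -> list[str]:
    """Return the most frequent non-general words as glossary candidates."""
    tokens = re.findall(r"[a-zA-Z][a-zA-Z-]+", posting_text.lower())
    candidates = [t for t in tokens if t not in GENERAL_WORDS and len(t) > 3]
    return [term for term, _ in Counter(candidates).most_common(top_n)]

if __name__ == "__main__":
    # Hypothetical job posting excerpt, used for illustration only.
    posting = (
        "The Administrative Officer will support policy implementation, "
        "stakeholder engagement, procurement compliance, and service delivery "
        "reporting for the state public administration office."
    )
    for term in extract_glossary_terms(posting, top_n=6):
        print(term)

A production pipeline would presumably replace the stoplist with the automated corpus referencing that the module description mentions, and attach definitions to each extracted term.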
System Design and Workflow
JobLinguaAI follows a cyclical user interaction model grounded in formative assessment principles and task-
based language learning. The user journey is designed to mimic the stages of authentic language production,
feedback, and revision, providing students with repeated, scaffolded practice that aligns with the demands of
job application tasks.
Students begin by submitting a written or spoken task. The AI engine then processes the input, analyzing main
linguistic features such as grammar, coherence, tone, and pronunciation based on CEFR B2 descriptors.
Feedback is delivered in real time through interactive annotations, scoring indicators, and suggested revisions.
Students are encouraged to engage in multiple revision cycles before final submission. This continuous loop
reinforces the development of learner autonomy and metacognitive awareness.
The platform’s architecture integrates open-source AI technologies such as GPT for writing support and
Whisper for speech processing. It is calibrated to match LCC502’s rubrics and CEFR-aligned criteria, ensuring
instructional coherence.
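The paragraph above names GPT and Whisper as candidate components without committing to a concrete pipeline. The sketch below shows one plausible way the two could be chained for the speech-coach loop, assuming the open-source whisper package and the OpenAI chat-completions client; the model names, audio file name, and rubric prompt are illustrative assumptions rather than the platform's actual configuration.

# Hypothetical pipeline sketch: transcribe a recorded interview answer with
# Whisper, then request CEFR-B2-referenced feedback from a GPT model.
# Assumes `pip install openai openai-whisper` and an OPENAI_API_KEY in the environment.
import whisper
from openai import OpenAI

def transcribe_answer(audio_path: str) -> str:
    # Load a small Whisper model and return the plain transcript text.
    model = whisper.load_model("base")
    return model.transcribe(audio_path)["text"]

def cefr_feedback(transcript: str) -> str:
    # Ask a chat model for formative feedback; the rubric wording here is an
    # illustrative stand-in for LCC502's actual CEFR-aligned criteria.
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice, not specified in the paper
        messages=[
            {"role": "system",
             "content": "You are an ESP interview coach. Give formative feedback "
                        "on grammar, tone, and fluency against CEFR B2-High descriptors."},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    text = transcribe_answer("interview_answer.wav")  # placeholder file name
    print(cefr_feedback(text))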
Figure 1: Job Lingua AI Student Revision Flowchart
This flowchart illustrates the iterative learning cycle enabled by JobLinguaAI. The process begins when a
student submits an input such as a resume, cover letter, or a recorded interview response. The platform’s AI
engine then analyses the submission using natural language processing (NLP) and automatic speech
recognition (ASR) technologies. It assesses important elements such as grammar, tone, vocabulary use, and
pronunciation accuracy.
Following analysis, the system provides immediate and personalized feedback. This includes writing
suggestions for clarity and tone, vocabulary tips tailored to administrative language, and pronunciation
transcripts to improve spoken fluency. The student then revises the content based on the AI feedback.
An optional decision point allows a lecturer to review the updated version and offer formative input if needed.
Whether or not the lecturer intervenes, the student can repeat the cycle, continuing to refine their output until
confident and ready for assessment or real-world application.
This closed feedback loop embodies principles of formative assessment, learner autonomy, and technology-
enhanced language learning, creating a dynamic environment for continuous improvement.
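Read alongside Figure 1, the closed loop can also be expressed as a simple control flow: analyse a draft, surface feedback, optionally add lecturer comments, and stop once no further issues are raised. The sketch below is a rough, assumption-laden rendering of that cycle; the toy checks inside analyse() stand in for the NLP/ASR analysis the platform would actually perform.

# Rough sketch of the revision cycle in Figure 1: submit -> analyse -> feedback
# -> revise -> (optional lecturer review) -> repeat until no issues remain.

def analyse(submission: str) -> list[str]:
    """Placeholder for the NLP/ASR analysis stage described in the text."""
    comments = []
    if submission[:1].islower():
        comments.append("Begin sentences with a capital letter for a professional tone.")
    if len(submission.split()) < 20:
        comments.append("Develop the answer further; B2-High responses are usually more elaborated.")
    return comments

def revision_cycle(drafts: list[str], lecturer_review=None) -> str:
    """Walk through successive drafts until one passes analysis (and optional review)."""
    for draft in drafts:
        comments = analyse(draft)
        if lecturer_review is not None:   # optional formative input from the lecturer
            comments += lecturer_review(draft)
        if not comments:                  # no outstanding issues: ready for assessment
            return draft
        for comment in comments:
            print("Feedback:", comment)
    return drafts[-1]                     # otherwise return the latest revision

if __name__ == "__main__":
    attempts = [
        "i am writing to apply for the officer post.",
        "I am writing to apply for the Administrative Officer post advertised on 3 June, "
        "and I believe my internship in stakeholder engagement prepares me well for it.",
    ]
    print("Submitted version:", revision_cycle(attempts))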
METHODOLOGY
The development of JobLinguaAI follows a design-based research (DBR) methodology. DBR is particularly
suited to innovation in educational settings because it allows iterative prototyping, testing, and refinement
based on user needs and contextual relevance. In the conceptual phase, needs analysis was conducted through a
review of LCC502 curriculum documents, task rubrics, and student performance samples. The design is
grounded in task-based language teaching (TBLT) principles and draws from established ESP pedagogies
emphasizing authentic, discipline-specific communication tasks.
While the platform is still in its early stage, a prototype has been outlined, and preliminary feedback is being
collected from educators to guide the next phase. Once piloted, data will be collected on student improvement,
task engagement, and feedback quality using a mixed-methods approach including performance rubrics, usage
logs, and student reflections.
Educational Relevance
Alignment with Course Outcomes
Job Lingua AI was developed with direct reference to the learning outcomes and assessment structure of the
LCC502 course, “English for Job Interviews,” offered to Administrative Science undergraduates at UiTM. This
course is aligned to the CEFR B2-high proficiency level and emphasizes practical English communication for
job applications, including resume writing, cover letter construction, and oral interviews.
The system addresses Course Learning Outcomes (CLOs) in a targeted and measurable way. Specifically, it
supports:
CLO1: Producing clear, structured written texts in a professional context
CLO2: Delivering verbal responses with appropriate tone, vocabulary, and fluency
CLO3: Demonstrating understanding of language use in administrative and intercultural contexts
By embedding Job Lingua AI within the job-seeking modules of LCC502, the platform enhances student
preparation for all three major assessments: resume and cover letter writing, the screening interview, and the
selection interview. These tasks are not only simulated within the AI system but also reinforced with formative
feedback, allowing for personalized skill-building.
The platform’s alignment with CLOs ensures pedagogical coherence and makes it a scalable supplement to
current classroom instruction. Students benefit from immediate AI feedback that supports revision and deeper
learning, while instructors are provided with options to integrate and monitor progress as needed.
Table 1: Alignment of JobLinguaAI Features with LCC502 Assessments and Course Learning Outcomes (CLOs)

| JobLinguaAI Feature | LCC502 Assessment | Supported CLOs | Function/Learning Focus |
|---|---|---|---|
| NLP-Based Writing Assistant | Resume & Cover Letter Writing | CLO1 | Provides feedback on structure, tone, grammar, and clarity of professional documents |
| AI-Powered Speech & Pronunciation Coach | Screening & Selection Interviews | CLO2 | Offers pronunciation transcripts, fluency support, and tone correction in spoken responses |
| Interview Simulation Module | Screening & Selection Interviews | CLO2, CLO3 | Replicates real-world verbal interactions; prepares students for question-answer fluency and intercultural competence |
| Glossary Generator (Workplace Communication Terms) | All assessments | CLO1, CLO3 | Enhances vocabulary range specific to administrative and public-sector contexts |
| Cultural Awareness Comparison Tool | Interview | CLO3 | Promotes awareness of formal and culturally appropriate communication across global and local job markets |
| Lecturer Review Dashboard (Optional) | Supports all assessments (formative phase) | CLO1, CLO2, CLO3 | Enables instructor monitoring and feedback during the student's revision cycle to enhance learning before summative assessment |
As shown in Table 1, the design of Job Lingua AI corresponds directly to the CLOs and task types emphasized
in LCC502.
Pedagogical Framework
JobLinguaAI is designed upon sound pedagogical principles that reflect current best practices in English for
Specific Purposes (ESP) instruction, especially in professional and administrative contexts. Its foundation lies
in Task-Based Language Teaching (TBLT), which promotes authentic language use through real-world
communication tasks. By focusing on concrete outputs such as resumes, cover letters, and interviews, the
platform supports outcome-driven learning that simulates workplace demands, a core expectation in ESP curricula (Kamaruddin et al., 2021; Basturkmen, 2019).
To ensure relevance and motivation, the platform embeds authentic materials and domain-specific vocabulary
closely tied to job-seeking scenarios in both public and private administrative sectors. This aligns with the
recommendations of Purpura and Graziano-King (2004), who argue that ESP learners benefit most when
vocabulary and tasks are derived from actual workplace communication needs.
The system also reinforces formative feedback loops, a critical component in developing learner autonomy and
communicative competence (Teng, 2022). With AI delivering instant, adaptive responses, students can engage
in low-stakes, repeated practice, revising their writing or speech based on targeted feedback. This iterative
cycle promotes reflective learning, self-editing, and gradual mastery of professional language use.
Importantly, JobLinguaAI supports scaffolded, inclusive learning by accommodating varied proficiency levels.
Whether students require basic structural guidance or advanced tone refinement, the platform adapts its
feedback accordingly, fostering differentiated instruction, a critical principle for large, mixed-ability ESP classes (Blaz, 2023; Devi & Perumandla, 2024).
Through this multi-layered framework, JobLinguaAI not only aligns with instructional theory but also meets
the pragmatic needs of LCC502 students preparing to transition into multilingual, high-stakes employment
environments.
Commercialisation Potential
JobLinguaAI holds strong commercialisation potential beyond its initial application in Administrative Science
and LCC502. Its modular design and reliance on transferable AI technologies make it easily scalable across
other ESP domains, including Engineering, Business, Law, and Health Sciences. Since each of these fields
demands specialised vocabulary and communicative performance in job-seeking or workplace scenarios, the
system can be adapted by simply modifying its vocabulary banks and simulation prompts to fit the respective disciplines (Blaz, 2023).
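To make this adaptation path concrete, a hypothetical configuration structure along the lines below could hold the swappable parts, one entry per ESP domain; the domain names, terms, and prompts shown are invented examples rather than shipped content.

# Illustrative domain configuration: adapting JobLinguaAI to a new ESP field
# would mean supplying a new entry like these rather than changing the engine.
DOMAIN_CONFIG = {
    "administrative_science": {
        "vocabulary_bank": ["stakeholder engagement", "policy implementation",
                            "service delivery", "procurement"],
        "interview_prompts": [
            "Describe a time you coordinated a multi-agency project.",
            "How would you handle a delay in policy implementation?",
        ],
    },
    "engineering": {
        "vocabulary_bank": ["tolerance", "load-bearing", "failure analysis"],
        "interview_prompts": [
            "Walk us through a design decision you had to justify to a client.",
        ],
    },
}

def build_interview_session(domain: str) -> dict:
    """Assemble the prompts and target vocabulary for a mock interview."""
    config = DOMAIN_CONFIG[domain]
    return {"prompts": config["interview_prompts"],
            "target_terms": config["vocabulary_bank"]}

if __name__ == "__main__":
    session = build_interview_session("engineering")
    print(session["prompts"][0])

Under an arrangement like this, extending the platform to, say, Health Sciences would mean authoring a new entry rather than modifying the feedback engine.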
The platform's lightweight, web-based architecture opens the door to integration as a plugin within major Learning Management Systems (LMS) such as UiTM's i-Learn or Moodle. This would allow lecturers to
assign AI-enhanced revision tasks directly through their course portals, track student engagement, and embed
the tool seamlessly into existing classroom practices without requiring extensive training or infrastructure
changes.
In terms of financial sustainability, JobLinguaAI is well suited for a freemium model. Students could access
core writing and speaking features at no cost, while institutions could subscribe to a premium version offering
administrative dashboards, advanced analytics, and cohort-level tracking for curriculum teams or career
services. This approach balances student accessibility with institutional value and offers potential revenue
channels to support ongoing development, localisation, and maintenance.
Preliminary Feedback from Lecturers
To ensure pedagogical relevance and contextual fit, preliminary feedback on JobLinguaAI was collected from
two lecturers currently teaching LCC502. A short survey was distributed alongside a one-page overview and
flowchart illustrating the system framework. The goal was to evaluate the innovation’s alignment with course
outcomes, perceived usefulness, and implementation feasibility. There were only two respondents, as the
course is newly introduced and currently offered exclusively to Administrative Science students, with only two
classes being taught at this stage.
The feedback was uniformly positive. Both respondents rated the platform as “Very Relevant” to the learning
objectives of LCC502, particularly in strengthening job-related communication. All key features, including the resume and cover letter writing assistant, pronunciation coach, glossary generator, and interview simulator, were deemed equally beneficial. Furthermore, both lecturers believed the innovation could
effectively support learners in reaching the CEFR B2 High level, especially in writing accuracy, tone, and
verbal fluency.
In terms of qualitative insights, the main strengths highlighted were:
- The clarity of the system's concept and user flow
- Its strong alignment with the LCC502 syllabus and CLOs
- The practicality of real-time feedback tools for formative assessment

Lecturers also offered constructive suggestions, including the need for:
- Localised content to reflect Malaysian administrative contexts
- A dashboard feature for lecturers to track student progress
- Onboarding or tutorial support to guide student users
To address these points, the next stage will focus on embedding a localised vocabulary dataset, incorporating a
lecturer-facing interface for tracking, and including tutorial modules to support user onboarding in the
prototype phase.
Table 2 below summarises the feedback and proposed responses:
Table 2: Summary of Lecturer Feedback on Job Lingua AI and Planned Responses

| Feedback Area | Action Taken |
|---|---|
| Clarity of Concept | Retain current visuals and overview materials |
| Syllabus Alignment | Maintain CLO-linked modules |
| Localised Content | Include local vocabulary dataset |
| Dashboard Feature | Design instructor dashboard in prototype |
| Student Onboarding | Develop in-app tutorial and onboarding screens |
"This is the kind of tool that could really support autonomous ESP learning if localised well." (Lecturer feedback, anonymous)
This early input has provided validation for the innovation’s educational value while highlighting practical
areas for enhancement. By integrating these suggestions, JobLinguaAI will be better positioned for classroom
implementation and long-term scalability.
NOVELTY AND RECOMMENDATIONS
Novelty
JobLinguaAI introduces a novel approach to English for Specific Purposes (ESP) instruction by unifying AI-
enhanced writing and speaking tools within a single, course-aligned platform. While AI applications in
language learning are increasingly common, few innovations offer an integrated system that combines ESP-
focused content, CEFR-aligned formative feedback, and contextualised job interview training. This multi-
dimensional approach responds directly to the evolving needs of 21st-century learners in higher education.
Notably, JobLinguaAI is custom-designed for the Malaysian tertiary context, targeting students in
Administrative Science and aligning with the learning outcomes of LCC502. Its incorporation of real-world
tasks such as résumé and cover letter writing, as well as mock interview simulations, grounds the experience in
practical language use. At the same time, the platform’s AI-driven feedback on tone, accuracy, and
pronunciation reflects the rigor of CEFR B2 High benchmarks.
This innovation addresses gaps not only in pedagogical delivery (e.g., authentic tasks and formative feedback)
but also in technological accessibility, by offering autonomous revision loops and optional lecturer support. In
doing so, it aligns with broader institutional goals of enhancing graduate employability, supporting digital
transformation, and ensuring inclusive, skill-based education.
Recommendations
To enhance the educational value, usability, and long-term scalability of JobLinguaAI, several next steps are
proposed. First, a working prototype should be developed to reflect the platform's full user flow and core
features, including resume and cover letter feedback, glossary generation, and interview simulation. This will
allow hands-on testing and functional validation. Next, pilot testing is recommended within the LCC502
course to evaluate real classroom engagement, system usability, and its impact on learners' CEFR-aligned
outcomes in both written and spoken English.
Following initial deployment, the platform could be adapted for use in other English for Specific Purposes
(ESP) domains such as Law, Engineering, or Business, given its flexible architecture and relevance to job
communication tasks. Another recommendation is to explore immersive technologies, such as Virtual Reality
(VR) or Augmented Reality (AR), to enhance the realism and interactivity of interview simulations, especially for speaking practice. Additionally, a longitudinal study should be planned to investigate the platform's long-term effects on learners' confidence, language precision, and employability outcomes over time.
Finally, to support educators, the platform should include a lecturer-facing dashboard with basic analytics and
tracking tools. This would allow instructors to monitor students’ submission patterns, progress, and
improvement areas, ensuring a balanced blend of autonomous learning and guided instruction. These combined
strategies aim to ensure that Job Lingua AI is not only innovative but also pedagogically robust, contextually
grounded, and institutionally sustainable.
CONCLUSION
Job Lingua AI represents a timely and pedagogically grounded innovation in the field of English for Specific
Purposes (ESP), addressing the pressing need for personalised, skills-based instruction in job-related
communication. By combining AI-powered writing assistance, speech feedback, and interview simulation
within a CEFR-aligned framework, the platform responds directly to the challenges faced by learners in
preparing for real-world employment contexts, particularly in the Malaysian higher education system.
Rooted in the learning outcomes of the LCC502 course, JobLinguaAI has been intentionally designed to
address key instructional gaps through five interconnected objectives. It provides real-time feedback on
professional writing (Objective 1), enhances spoken communication through interview simulation and speech
analysis (Objective 2), and supports domain-specific vocabulary development for administrative settings
(Objective 3). Additionally, it promotes intercultural language awareness (Objective 4) and embeds
CEFR-aligned, formative learning loops that align closely with LCC502 course assessments and CLOs
(Objective 5).
The integration of autonomous revision cycles, context-specific language feedback, and instructor-optional
tracking empowers learners with greater control over their skill development while reinforcing classroom
instruction. Early lecturer feedback confirms the platform's strong syllabus alignment and practical relevance
for learner development.
Looking forward, the next phase will focus on refining the prototype based on this feedback, piloting it in live
course environments, and systematically evaluating its pedagogical and technical impact. With its adaptable
architecture and alignment to institutional goals such as digital transformation and graduate readiness,
JobLinguaAI offers a scalable model for AI-supported ESP learning across disciplines and higher education
contexts.
ACKNOWLEDGEMENTS
The authors would like to express their sincere gratitude to the Kedah State Research Committee, UiTM Kedah
Branch, for the generous funding provided under the Tabung Penyelidikan Am. This support was crucial in
facilitating the research and ensuring the successful publication of this article.
REFERENCES
1. Basturkmen, H. (2019). ESP teacher education needs. Language Teaching, 52(3), 318-330.
2. Blaz, D. (2023). Differentiated instruction: A guide for world language teachers. Routledge.
3. Devi, R. S., & Perumandla, S. (2024). Revolutionizing Recruitment: Skill-Based Interview Models in the
Artificial Intelligence-Driven Economy. In AI-Oriented Competency Framework for Talent Management
in the Digital Economy (pp. 256-267). CRC Press.
4. Kamaruddin, A., Fitria, N., & Patmasari, A. (2021). Needs analysis-based ESP course design for
vocational high school students. KEMBARA: Jurnal Keilmuan Bahasa, Sastra, dan Pengajarannya, 7(2),
222-231.
5. Purpura, J. E., & Graziano-King, J. (2004). Investigating the foreign language needs of professional school
students in international affairs: A case study.
6. Son, J. B., Ružić, N. K., & Philpott, A. (2025). Artificial intelligence technologies and applications for
language learning and teaching. Journal of China Computer-Assisted Language Learning, 5(1), 94-112.
7. Teng, L. S. (2022). Explicit strategy-based instruction in L2 writing contexts: A perspective of self-
regulated learning and formative assessment. Assessing Writing, 53, 100645.