
Wielding Words in L2 Writing: Correlating Word Knowledge and Writing Ability in a Malaysian Classroom

Alice Su Chu Wong*, Octavia Willibrord

MARA University of Technology, Sabah, Malaysia

*Corresponding author

DOI: https://dx.doi.org/10.47772/IJRISS.2025.909000575

Received: 11 September 2025; Accepted: 17 September 2025; Published: 18 October 2025

ABSTRACT

This study examines the connection between second language (L2) learners’ vocabulary knowledge and their performance in academic writing. In many higher education settings, it is assumed that undergraduates possess a sufficient range of academic vocabulary. However, this assumption often overlooks actual learner proficiency. Nation (2006) notes that students need to understand around 98% of the words in a text to read and write academic texts with confidence and minimal assistance. Against this backdrop, the present study investigates how receptive vocabulary knowledge influences core features of written output, including text length and writing quality. The study involved 53 undergraduates enrolled in an English proficiency course at a Malaysian public university. Data were collected from vocabulary tests and two writing tasks. The analysis showed that a considerable number of students were still below the receptive vocabulary threshold expected at tertiary level, with the majority of their written texts corresponding to CEFR B1 to B2 levels. Additionally, vocabulary knowledge was found to have a significant correlation with overall writing performance. These findings deepen our understanding of how lexical development supports written expression and offer practical guidance for improving vocabulary instruction in higher education contexts.

Keywords: receptive vocabulary knowledge, academic writing performance, lexical threshold, Malaysian ESL learners, higher education language proficiency

INTRODUCTION

Among the many proposed indicators of L2 competence, vocabulary knowledge has consistently emerged as one of the strongest predictors. L2 competence can be operationalised through vocabulary abilities measured in terms of breadth and depth. As Laufer and Nation (1999) argue, vocabulary size is a reliable measure of overall L2 proficiency, especially in reading and writing. In vocabulary studies, researchers have acknowledged the critical role that vocabulary knowledge plays in L2 writing and the importance of a threshold level for producing good texts (Alsahafi, 2023; Barcroft, 2004; Lessard‐Clouston, 2012). Nevertheless, several researchers have asserted that L2 learners do not possess the threshold level of vocabulary knowledge that allows them to effectively incorporate academic vocabulary into their writing (Abdul Aziz, Supian, Abdul Sukor, & Nikman, 2021). This inadequacy can hinder students from performing well in academic writing tasks. Although vocabulary is widely recognised as an important component of writing, research exploring the link between second language (L2) vocabulary knowledge and writing skills, particularly within the tertiary classroom, remains limited. A clearer understanding of how L2 vocabulary contributes to writing ability can offer fresh perspectives on how vocabulary shapes the development of tertiary learners’ writing. Pedagogically, the findings of the current work can inform more integrated approaches to vocabulary and writing instruction, especially in Malaysian university contexts where students often struggle with both. By identifying the complexities of the interplay between vocabulary knowledge and writing ability, this study can inform teaching and curriculum design tailored to the academic needs of L2 learners.

LITERATURE REVIEW

Second Language Vocabulary

Vocabulary knowledge has long been recognised as a central component of second language (L2) competence, often cited as one of the most reliable predictors across linguistic domains. As conceptualised by Meara (1996) and Nation (1999), vocabulary knowledge is broadly characterised by two dimensions: breadth and depth. Vocabulary breadth refers to the range of words a learner can recognise and recall at a basic level. This level is typically measured through standardised recall or matching tests. In contrast, depth of vocabulary includes more intricate knowledge such as meaning distinctions, word associations, collocations, grammatical behaviour, and register use (Qian, 1999). These dimensions offer valuable insight into both the quantity and quality of learners’ lexical repertoires.

Within these distinctions, vocabulary breadth or receptive vocabulary refers to the vocabulary that learners can recognise and understand, but not necessarily use in productive tasks. Although vocabulary breadth has traditionally been linked to reading comprehension and listening proficiency (Qian, 2002; Schmitt, Jiang, & Grabe, 2011), recent scholarship suggests that its influence extends into L2 writing. Schmitt (2014) argues that even at the recognition level, vocabulary functions as a critical resource for organising ideas, selecting appropriate expressions, and maintaining coherence in written discourse.

In applied linguistics, vocabulary knowledge is increasingly acknowledged not just as a language learning component but as a performance predictor, particularly in writing. Laufer and Nation (1999) note that vocabulary size is a strong indicator of overall L2 proficiency, while Nation (2001) asserts that robust lexical knowledge supports learners in generating content and maintaining precision in expression. Engber (1995) similarly observed that lexical diversity and correctness influenced how compositions were evaluated, particularly in constrained settings such as timed essays.

Receptive vocabulary plays a practical role in shaping how students respond to writing tasks. It helps them understand the prompt, recall the ideas they want to express, and decide how to organise those ideas. Choi et al. (2017) noted that vocabulary affects writing in more than one way. Not only does it have a direct impact, but it also works through reading. Students with better vocabulary often read more effectively, and that in turn supports stronger writing. Without a sufficient vocabulary base, students may struggle to plan, articulate, or refine their ideas. This link between vocabulary and writing has also been explored across different genres. Kaur, Ganapathy, and Yunus (2020) found that learners with stronger receptive vocabulary tended to write essays that were more varied in word choice, better structured, and more convincing overall. In contrast, students with limited lexical resources relied on simple sentence forms and often repeated the same phrases. In a related study, Said et al. (2021) highlighted how expanding vocabulary can improve expressive writing, particularly when learners are exposed to a wider range of tasks and writing contexts.

Vocabulary Proficiency and Malaysian Learners

In Malaysia, vocabulary continues to be a significant issue in ESL education, even after more than a decade of formal instruction. Lateh et al. (2018) found that many undergraduates had not mastered even the 2,000‐word level of the Vocabulary Levels Test (VLT), the level considered essential for basic reading and writing tasks. Chu et al. (2019) reported similar trends among secondary learners, whose vocabulary size often remained below the Academic Word List (AWL) level, which is widely used to measure readiness for academic study. These patterns show that a substantial number of students enter university without sufficient vocabulary knowledge to meet academic demands. Baek et al. (2023) reported that most tertiary learners possess only high‐frequency vocabulary, typically around the 2,000–3,000‐word level, while academic tasks require at least 5,000 words or more. As a result, learners often rely on overly simplistic expressions and generic sentence patterns, hindering both lexical sophistication and rhetorical depth. Indeed, limited exposure to lexical variety impairs learners’ ability to engage with higher‐level academic writing.

Part of the problem of poor vocabulary mastery can be attributed to vocabulary teaching methods. In many classrooms, direct translation and rote memorisation remain the dominant practices (Azman, 2016; Kho, 2023), even though studies have repeatedly shown that these methods suppress learner motivation and limit long‐term retention. Yaacob and Yunus (2019) argue that such approaches prevent meaningful engagement with vocabulary and often result in shallow acquisition that does not transfer into productive skills. Palpanadan et al. (2019) observed that students who were exposed primarily to product‐based writing tasks, including copying templates and memorising phrases, were less successful in writing than those who learned through process‐oriented methods.

In light of these vocabulary‐related challenges, recent studies have advocated stronger integration between vocabulary instruction and writing. Lee (2023) found that students with greater control over mid‐frequency and academic lexis produced better‐structured essays and demonstrated more coherent arguments. Likewise, Mukundan and Rezvani Kalajahi (2013) observed notable improvements in writing performance, including lexical usage and content development, when direct vocabulary instruction was introduced in classroom settings. Amir and Sulaiman (2024) also confirmed that students who performed well on AWL‐based assessments tended to score higher in academic writing tasks, indicating that specialised vocabulary matters in university contexts.

Research Objectives

The present study aims to investigate participants’ academic writing ability and the relationship between students’ vocabulary size and their writing ability.

  • To determine the writing performance level of the participants in this study.
  • To identify the receptive vocabulary size of the participants in this study.
  • To examine the correlation between receptive vocabulary size and L2 writing performance.

Research Questions

  • What is the writing performance level of the participants in this study?
  • What is the receptive vocabulary size of the participants in this study?
  • Are receptive vocabulary size and L2 writing performance significantly correlated?

METHODOLOGY

Participants

The participants were 55 undergraduate students, aged between 19 and 21, who were enrolled in an English proficiency course at a Malaysian university. Based on institutional placement tests and course performance, the majority of the students were at the B1 level of the Common European Framework of Reference for Languages (CEFR). All participants had received formal English instruction for approximately 11 years, beginning in primary school and continuing through secondary education. Participants were selected using convenience sampling from two intact class groups, and informed consent was obtained before data collection.

Instruments

Writing Tests

The writing component of this study was carried out across two sessions under regular classroom conditions. Each session gave students 40 minutes to complete the task. To ensure uniformity across the cohort, the same set of instructions and prompts was provided to all participants. In Task 1, students were asked to write an argumentative essay of about 250 words in response to the statement: “In this world, there are some people who may not succeed in school but end up being successful in life.” This prompt was adapted from the official IELTS website, chosen specifically for its clarity, relevance, and connection to real‐world writing expectations. Since IELTS is a widely recognised benchmark for English proficiency, using one of its writing prompts lends credibility to the assessment and aligns this study with established standards in language testing (Green & Hawkey, 2005; O’Loughlin, 2011). Students’ responses for Task 1 were analysed using an online Text Analyser, which provided automated feedback on each essay. This included estimates of the CEFR level, vocabulary usage, sentence length, lexical complexity, and total word count. However, only CEFR level and word count were used for this study’s analysis, as these were most relevant to the research focus.

For Task 2, students composed an expository essay of approximately 250 to 300 words on the topic, “Ways Malaysians Can Prevent Online Scams.” Essays were manually evaluated using a CEFR‐informed rubric that measured three areas: content (7 marks), language (10 marks), and organisation (3 marks). The total of 20 marks contributed 20% towards each student’s course grade. Word count was not included in the evaluation, as it was not part of the rubric. The rubric design drew on CEFR descriptors and was guided by writing assessment practices used in Cambridge English exams such as PET and FCE (Cambridge English, 2011). In Malaysian higher education, Yasin et al. (2024) introduced a CEFR‐based rubric, which was adjusted to a 20‐point scale for evaluating undergraduate ESL writing. Li et al. (2025) applied a similar approach by linking CET4 essay scores to CEFR levels using a 20‐mark framework. These references helped shape the rubric used in this study and support its relevance for assessing student writing within the Malaysian ESL context.
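
As a concrete illustration of the rubric arithmetic described above, the short Python sketch below combines the three component marks (content out of 7, language out of 10, organisation out of 3) into the 20‐mark total and shows its contribution to the course grade. The function and variable names are hypothetical illustrations, not part of the study’s materials.

# Hypothetical illustration of the Task 2 marking scheme described above;
# the component names and function are assumptions, not the study's code.
COMPONENT_MAX = {"content": 7, "language": 10, "organisation": 3}  # totals 20

def task2_total(content: float, language: float, organisation: float) -> float:
    """Sum the three component marks into the 20-mark total, range-checking each."""
    marks = {"content": content, "language": language, "organisation": organisation}
    for name, mark in marks.items():
        if not 0 <= mark <= COMPONENT_MAX[name]:
            raise ValueError(f"{name} must be between 0 and {COMPONENT_MAX[name]}")
    return sum(marks.values())

# Because the 20-mark total contributes 20% towards the course grade, each
# rubric mark corresponds to one percentage point of the final grade.
total = task2_total(content=5, language=7, organisation=2)  # 14 out of 20
print(total)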

The inclusion of two different writing genres and the use of both automated and human scoring allowed for a broader view of students’ writing abilities. It also helped minimise the shortcomings that often arise when relying on just one genre or assessment method.

Vocabulary Levels Test (VLT)

To measure students’ vocabulary size, the study utilised the Vocabulary Levels Test (VLT) developed by Schmitt, Schmitt, and Clapham (2001). This is a well‐established test in L2 research and offers a clear benchmark for assessing learners’ vocabulary knowledge across different frequency bands. It includes five levels: the 2,000‐, 3,000‐, and 5,000‐word families, the Academic Word List (AWL), and the 10,000‐word level. Each level contains 30 items in which students match target words with their meanings. The format is designed to reduce the likelihood of random guessing and improve assessment accuracy. These levels represent progressively more complex layers of English vocabulary. The lower bands (2,000 and 3,000 words) consist mostly of high‐frequency items used in everyday communication, while the 5,000‐word level reflects vocabulary found in general reading materials like newspapers. The AWL contains terms commonly seen in academic writing across disciplines, and the 10,000‐word level covers low‐frequency, often sophisticated vocabulary typically found in advanced texts. Scores for each level were calculated independently, and learners who scored at least 27 out of 30 (90% accuracy) were considered to have reached mastery, as recommended by Nation (2001). The results from the VLT gave a clear snapshot of each student’s receptive vocabulary knowledge and served as a foundation for examining how lexical ability relates to writing performance.
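
For readers who wish to replicate the VLT scoring procedure, the sketch below illustrates how the per‐level scores and the 27‐out‐of‐30 (90%) mastery criterion described above could be computed. It is a minimal sketch in Python; the data layout and names are assumptions for illustration, not the study’s own code.

# Minimal sketch of VLT scoring, assuming 30 matching items per level.
# The dictionary layout and function name are hypothetical illustrations.
MASTERY_THRESHOLD = 27  # Nation (2001): 27/30 (90%) indicates mastery
VLT_LEVELS = ["2000", "3000", "5000", "AWL", "10000"]

def score_vlt(correct_counts: dict) -> dict:
    """Convert raw correct counts (out of 30) to percentages and mastery flags."""
    results = {}
    for level in VLT_LEVELS:
        raw = correct_counts[level]
        results[level] = {
            "raw": raw,
            "percent": round(100 * raw / 30, 1),
            "mastered": raw >= MASTERY_THRESHOLD,
        }
    return results

# Example: one learner's raw scores per level
print(score_vlt({"2000": 28, "3000": 27, "5000": 24, "AWL": 15, "10000": 20}))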

Data Analysis

This study employed quantitative data analysis techniques to answer the three research questions. Data were first organised and screened in SPSS (Version 28), followed by descriptive and inferential statistical procedures. The analyses focused on participants’ writing performance, vocabulary size, and the relationship between vocabulary size and L2 writing ability. To address Research Question 1 regarding students’ writing performance, participants completed two writing tasks: an argumentative essay (Task 1) and an expository essay (Task 2). The argumentative essays were analysed using an online Text Analyser tool, which automatically assigned a CEFR level (e.g., B1, B2) to each text based on linguistic and lexical features. Since this tool does not provide analytic scores, each essay was assigned a single CEFR level to indicate general proficiency. On the other hand, the expository essays were assessed manually using a CEFR‐aligned rubric consisting of three criteria: content, organisation, and language. Each essay was given an overall score, which contributed 20% towards the students’ course assessment. Descriptive statistics were computed to determine the participants’ writing performance distribution for both tasks.

For Research Question 2, receptive vocabulary knowledge was measured across five frequency levels: the 2,000‐, 3,000‐, 5,000‐, and 10,000‐word families, as well as the Academic Word List (AWL). Scores from each level were analysed descriptively to determine students’ vocabulary profiles. To answer Research Question 3, normality tests (Shapiro–Wilk) were first conducted on the vocabulary and writing score variables. Since several variables were not normally distributed (p < .05), non‐parametric statistical analysis was used. Specifically, Spearman’s rho correlation was conducted to examine the relationship between students’ receptive vocabulary size and their L2 writing performance. All statistical results are presented in the following sections, with interpretation guided by the significance level of α = .05.
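
To make the analytic sequence concrete, the sketch below mirrors the reported procedure, Shapiro–Wilk checks followed by Spearman’s rho, using Python’s pandas and SciPy as a stand‐in for the SPSS routines actually used in the study. The file name and column names are assumptions for illustration only.

# Illustrative reconstruction of the reported analysis; SPSS (Version 28) was
# used in the study, so this SciPy version only mirrors the sequence of tests.
import pandas as pd
from scipy.stats import shapiro, spearmanr

ALPHA = 0.05

# Hypothetical layout: one row per student, one column per VLT level plus the
# writing measure (the CSV file and column names are assumptions).
df = pd.read_csv("vlt_and_writing_scores.csv")
vocab_levels = ["vlt_2000", "vlt_3000", "vlt_5000", "vlt_10000", "vlt_awl"]

# Step 1: Shapiro-Wilk normality test on each variable.
for col in vocab_levels + ["essay_score"]:
    stat, p = shapiro(df[col].dropna())
    print(f"{col}: W = {stat:.3f}, p = {p:.3f} "
          f"({'non-normal' if p < ALPHA else 'normal'})")

# Step 2: several variables are non-normal, so use Spearman's rho.
for col in vocab_levels:
    rho, p = spearmanr(df[col], df["essay_score"], nan_policy="omit")
    sig = "significant" if p < ALPHA else "not significant"
    print(f"{col} vs essay_score: rho = {rho:.3f}, p = {p:.3f} ({sig})")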

FINDINGS AND DISCUSSION

Research Question 1

What is the writing performance level of the participants in this study?

Table 1 Descriptive Statistics for Students’ Writing Performance for Task 1 and Task 2: CEFR Level, Word Count, and Essay Scores

                  Minimum   Maximum   Mean   SD
Task 1 (N = 53)
  CEFR Level        2.0       5.0      3.0    0.733
  No. of words      252       478      320    46.04
Task 2 (N = 52)
  CEFR Level        3.0       6.0      3.9    0.846
  Essay Scores      10        20       13.9   2.67

Table 1 presents the descriptive statistics for students’ writing performance for Task 1 and Task 2, which include the CEFR levels, word count, and essay scores. For Task 1, the CEFR levels ranged from A1 to C1, with a mean level of 3.00 (SD = 0.733), corresponding to the B1 proficiency level. This indicates that, on average, students demonstrated intermediate writing ability. Regarding word count, students produced an average of 320 words (SD = 46.04), suggesting a moderate degree of consistency in the length of essays. For Task 2, the CEFR levels ranged from B1 to C2, with a mean level of 3.9 (SD = 0.846), indicating the B2 proficiency level. Essay scores ranged from 10 to 20, with 20 being the highest possible score. The average score was 13.9 (SD = 2.67), indicating that most students performed slightly above the midpoint of the scoring scale. Overall, these findings provide a broad overview of the group’s writing proficiency, indicating that most students are functioning between the intermediate and upper‐intermediate level of proficiency.

Table 2 Distribution of Students’ Writing Proficiency for Writing Task 1 Based on CEFR Levels (N = 53)

CEFR Level   Frequency   Percent
A1           12          22.6%
B1           31          58.5%
B2           8           15.1%
C1           2           3.8%

Table 2 presents the distribution of students’ writing proficiency for Task 1 according to the CEFR levels. The results indicate that a majority of the students (n = 31, 58.5%) were placed at the B1 level, suggesting an intermediate level of writing competence. A smaller proportion of students (n = 8, 15.1%) achieved the B2 level, reflecting upper‐intermediate proficiency. Notably, only two students (3.8%) reached the C1 level, indicating advanced writing ability. Meanwhile, a sizeable group (n = 12, 22.6%) remained at the A1 level, reflecting weak writing skills. These findings suggest that most students’ essays fall within the B1 range, which is below the level of writing proficiency expected of students entering university study.

Table 3 Distribution of Students’ Writing Proficiency for Writing Task 2 Based on CEFR Levels (N = 52)

CEFR Level   Frequency   Percent
B1           18          35%
B2           24          46%
C1           7           13%
C2           3           6%

Table 3 presents the distribution of students’ writing proficiency for Task 2 according to the CEFR levels. The results indicate that the largest group of students (n = 24, 46%) was placed at the B2 level, suggesting an upper‐intermediate level of proficiency. A smaller proportion of students (n = 18, 35%) achieved the B1 level, reflecting an intermediate level of writing competence. Meanwhile, a small group (n = 7, 13%) reached the C1 level, while only 3 students (6%) reached the C2 level, indicating advanced writing ability.

Overall, the majority of student essays in Task 1 and Task 2 were classified at the B1 (58.5%) and B2 (46%) proficiency levels, respectively. This finding points to a proficiency level that remains below what is typically expected in tertiary academic writing. While B1 and B2 indicate functional language use, they often fall short in areas such as lexical precision, argument development, and cohesion. Previous studies in similar contexts (Yasin et al., 2024; Li et al., 2025) reached comparable conclusions, showing that most students operate within this intermediate range. This further supports the need for instructional scaffolding that targets the higher‐order writing skills associated with C1‐level output.

Research Question 2

What is the vocabulary size of the participants in this study?

Table 4 Descriptive Statistics for Receptive Vocabulary Scores Across Word Frequency Levels (N = 53)

              Min   Max   Max possible score   Mean   SD
2000 Level    61    100   100                  90     8.9
3000 Level    56    100   100                  91     10.0
5000 Level    50    100   100                  79     12.0
10000 Level   33    94    100                  67     13.7
AWL           0     100   100                  49     24.4

Table 4 presents the descriptive statistics for students’ receptive vocabulary size across five word levels, based on a maximum possible score of 100 for each level. At the 2,000‐word level, scores ranged from 61 to 100, with a mean score of 90 (SD = 8.9), indicating that, on average, students reached the mastery threshold. Similarly, at the 3,000‐word level, scores ranged from 56 to 100, with a mean of 91 (SD = 10.0), suggesting consistent mastery at this frequency band. For the 5,000‐word level, scores ranged from 50 to 100, with a mean of 79 (SD = 12.0), indicating performance below the mastery level. At the 10,000‐word level, scores ranged from 33 to 94, with a mean of 67 (SD = 13.7), reflecting limited vocabulary knowledge at this advanced level.

The lowest performance was observed for the Academic Word List (AWL), where scores ranged from 0 to 100, with a mean of 49 (SD = 24.4), indicating a low level of mastery. According to Nation (2001), a score of 90 or above is considered the threshold for mastery. Based on this criterion, students demonstrated mastery at the 2,000‐ and 3,000‐word levels but showed declining performance as word levels increased. Consistent with their writing performance, students also lacked the vocabulary proficiency that allows for smooth and coherent writing. This finding aligns with existing literature highlighting the lexical constraints of university‐level ELL learners and how such limitations can impede written expression (e.g., Nation, 2001).

Research Question 3

Are receptive vocabulary size and L2 writing performance significantly correlated?

Table 5 Spearman’s Rank-Order Correlations Between Vocabulary Levels and Writing Performance for Task 1 (N = 53)

Vocabulary Levels   CEFR Level (ρ)   Word Count (ρ)
2000 level          .356*            -.225
3000 level          .374**           -.120
5000 level          .234             -.072
10000 level         .409**           -.344*
AWL                 .337*            -.184

Note. All values are Spearman’s rho. * p < .05. ** p < .01 (2‐tailed).

To examine the relationship between students’ writing ability and receptive vocabulary knowledge across different word frequency bands, a Spearman’s rho correlation analysis was conducted. Spearman’s rho was selected due to the nature of the data, as CEFR scores were based on rubric‐driven levels, and vocabulary scores did not meet normality assumptions. Shapiro–Wilk tests confirmed that both sets of scores were not normally distributed (p < .05), making the non‐parametric method more appropriate for the analysis. The results revealed statistically significant positive correlations between CEFR levels and vocabulary knowledge at several levels. A moderate positive correlation was found between CEFR levels and the 2,000‐word level (ρ = .356, p < .05), as well as the 3,000‐word level (ρ = .374, p < .01). The strongest correlation emerged between CEFR levels and the 10,000‐word level (ρ = .409, p < .01), suggesting that students with higher writing proficiency tended to possess greater knowledge of lower‐frequency vocabulary. A weaker but still significant correlation was also observed between CEFR levels and academic vocabulary knowledge (ρ = .337, p < .05). However, the correlation between CEFR levels and the 5,000‐word level was not statistically significant (ρ = .234, p > .05). These findings suggest that receptive vocabulary size, particularly at both high‐frequency and low‐frequency levels, is associated with writing proficiency as represented by CEFR levels. Consistent with the literature on predictors of L2 writing, the current work found that vocabulary size predicts writing ability and that limited vocabulary can impede writing development.

To explore the association between students’ vocabulary knowledge and the amount of text they produced, a Spearman’s rho correlation analysis was conducted between receptive vocabulary scores and the total number of words. Overall, the results revealed weak negative correlations across all vocabulary levels. The only statistically significant finding came from the 10,000‐word level (ρ = –.344, p < .05), where students with stronger knowledge of low‐frequency words tended to write slightly shorter essays. At the 2,000‐word level, the correlation was also negative (ρ = –.225), but not statistically significant. Similar patterns were observed at the 3,000‐word (ρ = –.120), 5,000‐word (ρ = –.072), and academic vocabulary levels (ρ = –.184), though none of these reached significance. Taken together, these results suggest that having a broader vocabulary does not necessarily mean students will write more. This is consistent with the findings of Linuwih (2013), who found no significant correlation between receptive vocabulary size and the length of students’ written texts, suggesting that learners with broader vocabulary knowledge do not always produce lengthier essays. A possible reason is that students with a broader vocabulary may prioritise conciseness and precision over quantity. As Quines (2023) argues, vocabulary level correlates with writing performance, but not necessarily with output volume.

Table 6 Spearman’s Rank-Order Correlations Between Vocabulary Levels and Writing Performance for Task 2 (N = 53)

Receptive Vocabulary Levels   CEFR Levels (ρ)   Essay Scores (ρ)
2000 level                    .406**            .507**
3000 level                    .436**            .664**
5000 level                    .426**            .487**
10000 level                   .411**            .497**
AWL                           .353**            .469**

Note. All values are Spearman’s rho. ** p < .01 (2‐tailed).

To better understand how students’ vocabulary knowledge contributes to their writing performance, a Spearman’s rho correlation analysis was conducted between writing scores and receptive vocabulary size across five word‐frequency levels. The analysis revealed significant positive correlations across all vocabulary levels. Notably, the 3,000‐word level demonstrated the strongest relationship with essay scores (ρ = .664, p < .01), suggesting that familiarity with mid‐frequency vocabulary may play a particularly important role in shaping students’ written expression. Moderate correlations were also observed with the 2,000‐word level (ρ = .507, p < .01), the 5,000‐word level (ρ = .487, p < .01), and the 10,000‐word level (ρ = .497, p < .01), indicating that a broad vocabulary base contributes positively to writing performance. In addition, academic vocabulary knowledge was significantly correlated with essay scores (ρ = .469, p < .01), pointing to the relevance of more specialised lexical knowledge in academic writing tasks. The consistently significant correlations between vocabulary levels and essay scores imply that essay scores tend to rise with vocabulary level. Together, these results suggest that students who possess stronger vocabulary knowledge tend to produce higher‐quality essays.

LIMITATIONS AND SUGGESTIONS FOR FUTURE RESEARCH

While this study provides useful insights, it is not without limitations. Firstly, the sample was small (53 students) and limited to a single group of university students within the Malaysian context, which may affect the generalisability of the results. Another limitation is the exclusive focus on receptive vocabulary, which neglects vocabulary depth, a dimension equally important for capturing academic nuance. Writing performance was assessed through a limited set of tasks, raising questions about applicability across genres and disciplines. Methodological transparency could also be strengthened by reporting scoring rubrics and establishing inter-rater reliability.

Future research should broaden writing tasks to include argumentative, analytical, and research-based genres. Additionally, researchers might consider conducting longitudinal studies to see how vocabulary knowledge and writing ability develop over time, especially when interventions are introduced. Experimental studies that incorporate vocabulary instruction as part of writing courses could also shed more light on which teaching strategies work in the classroom. Exploring affective factors such as motivation or writing anxiety might also provide a more holistic framework of second language writing predictors. Lastly, pedagogical implications would be more actionable if they included integrated vocabulary-writing interventions tailored to academic needs.

Overall, it is fair to suggest that vocabulary size matters and that inquiry into the relationship between vocabulary and writing is far from complete. As the current work has demonstrated, there is a gap in ELL classrooms that deserves more attention.

CONCLUSION

The current study investigated the relationship between vocabulary knowledge and writing ability among Malaysian ELL undergraduates. The analyses yielded several key findings. Firstly, most students operated at the B1 CEFR level, which is below the proficiency expected at the tertiary level. Only a small number of students reached the B2 and C1 levels, while a significant portion remained at the A1 level. Although students had completed at least 11 years of English education, these results suggest that many lack the academic writing competence essential for tertiary education. Their moderate essay scores and CEFR levels further support this conclusion, revealing gaps in both the quality and fluency of written output. In this regard, vocabulary instruction at the tertiary level should go beyond rote learning of high‐frequency words. Teachers need to place greater emphasis on helping students develop depth in their vocabulary, especially for mid‐ and low‐frequency words, as well as academic vocabulary. These are the lexical resources that can truly empower students to express complex ideas and write more cohesively.

The second key finding concerned vocabulary knowledge: students demonstrated mastery of high‐frequency vocabulary at the 2,000‐ and 3,000‐word levels. However, their performance declined significantly at the 5,000‐ and 10,000‐word bands, and this was especially evident for the Academic Word List (AWL). This trend suggests that while learners are familiar with everyday vocabulary, they lack the lexical depth required for academic and advanced writing tasks. This points to an important message: while many university students may appear proficient in English due to years of schooling, their actual vocabulary knowledge still falls short of what is needed for strong writing performance. Curriculum designers and language instructors may want to re‐evaluate current proficiency assumptions. It is easy to assume that students at the university level are already equipped with sufficient vocabulary for academic writing, but this study has shown that this is not always the case. A more diagnostic approach at the start of university courses could help identify gaps early and guide intervention efforts more effectively.

The third key finding was the positive association between receptive vocabulary and both CEFR‐based writing proficiency and students’ essay scores. The strongest correlation was observed at the 3,000‐word level, indicating that mid‐frequency vocabulary plays a vital role in producing better‐quality essays. The findings underscore that vocabulary knowledge, especially beyond high‐frequency words, is a strong predictor of writing ability. However, vocabulary size did not correlate positively with essay length; a weak negative correlation was found, suggesting that students with stronger vocabularies may prioritise precision over verbosity. Based on this finding, it is fair to suggest that writing lessons be closely tied to vocabulary development. This means designing writing lessons that not only focus on grammar or structure but also integrate new vocabulary into writing tasks. For instance, students can be encouraged to use target vocabulary in paragraph writing, essay drafts, or peer feedback sessions. This would give them more opportunities to activate their vocabulary in meaningful contexts.

Theoretically, these findings strengthen Nation’s (2001) vocabulary framework, which emphasises the importance of word frequency levels in language proficiency. The results also reflect Vygotsky’s sociocultural theory, where the lack of vocabulary acts as a constraint within the learner’s Zone of Proximal Development (ZPD), thereby limiting their ability to express more complex ideas in writing. Most importantly, the data validate the view that vocabulary is not only a component of linguistic competence but a facilitator of written communicative ability. Pedagogically, this study emphasises the importance of explicit vocabulary instruction, particularly for low‐frequency and academic words. Teachers should move beyond general vocabulary and incorporate tasks that promote deeper engagement with less frequent, discipline‐specific lexis. Furthermore, writing instruction should integrate lexical awareness activities, encouraging students to use and experiment with more advanced vocabulary in meaningful writing contexts. This lexical development is essential to help ELL undergraduates meet the academic demands of university writing and to transition from basic communicators to proficient academic writers.

In conclusion, the study affirms that receptive vocabulary size is a crucial factor in writing performance. As presented in the findings of this study, the assumption that university students possess adequate vocabulary and writing skills must be re‐evaluated. Future research should consider intervention studies that assess the impact of focused vocabulary instruction on writing development over time, particularly in Malaysia’s tertiary education contexts.

REFERENCES

  1. Abdul Aziz, R., Supian, S. H., Abdul Sukor, F. S., & Nikman, K. (2021). Measuring lexical richness in the writings of ESL learners at a tertiary institution in Malaysia. Gading Journal for Social Sciences, 24(4), 101–108. https://doi.org/10.24191/gading.v24i04.268
  2. Allagui, B., & Al Naqbi, S. (2024). The contribution of vocabulary knowledge to summary writing quality: Vocabulary size and lexical richness. TESL-EJ, 28(1), 1–15. https://doi.org/10.55593/ej.28109a5
  3. Alsahafi, M. (2023). The relationship between depth of academic English vocabulary knowledge and academic success of second language university students. Sage Open, 13(1). https://doi.org/10.1177/21582440231153342
  4. Amir, M. M., & Sulaiman, N. A. (2024). Academic vocabulary knowledge among Malaysian ESL undergraduates. International Journal of Academic Research in Business and Social Sciences, 14(9), 181–193. https://doi.org/10.46886/IJAREG/v14-i9/11388
  5. Baek, H., Lee, Y., & Choi, (2023). Proficiency versus lexical processing efficiency as a measure of L2 lexical quality: Individual differences in word-frequency effects in L2 visual word recognition. Memory & Cognition, 51(8), 1858–1869. https://doi.org/10.3758/s13421-023-01436-0
  6. Barcroft, J. (2004). Second language vocabulary acquisition: A lexical input processing approach. Foreign Language Annals, 37(2), 200–208. https://doi.org/10.1111/j.1944-9720.2004.tb02193.x
  7. Caltabellotta, E., Van Steendam, E., & Noreillie, A. S. (2025). Vocabulary knowledge and vocabulary use in writing: A cross-sectional comparison of L2 English and L2 French. Applied Linguistics. https://doi.org/10.1093/applin/amaf009
  8. Chen, M., & Liu, Y. (2022). A comparative study of lexical richness in English writing by Chinese senior high school students. Asia Pacific Journal of Education, 45(1), 227–242. https://doi.org/10.1080/02188791.2022.2100739
  9. Choi, Y. H. (2017). Roles of receptive and productive vocabulary knowledge in L2 writing through the mediation of L2 reading. English Teaching, 72(1), 3–24. https://doi.org/10.15858/ENGTEA.72.1.201703.3
  10. Chu, M. L., Ganapathy, M., & Mamat, A. (2019). Vocabulary knowledge among Malaysian secondary school students: A study of vocabulary size and vocabulary levels. 3L: Language, Linguistics, Literature, 25(4), 42–54. https://doi.org/10.17576/3L-2019-2504-04
  11. Engber, C. A. (1995). The relationship of lexical proficiency to the quality of ESL compositions. Journal of Second Language Writing, 4(2), 139–155. https://doi.org/10.1016/1060-3743(95)90004-7
  12. Green, A., & Hawkey, R. (2005). Watching for washback: Observing the influence of the IELTS Academic Writing test in the classroom. IELTS Research Reports, 11. Cambridge ESOL. https://doi.org/10.1080/15434300701333152
  13. Hao, Y., Jin, Z., Yang, Q., Wang, X., & Liu, H. (2023). To predict L2 writing quality using lexical richness indices: An investigation of learners of Chinese as a foreign language. System, 112, 102978. https://doi.org/10.1016/j.system.2023.103123
  14. Harji, M. B., Balakrishnan, K., Bhar, S. K., & Letchumanan, K. (2015). Vocabulary levels and size of Malaysian undergraduates. English Language Teaching, 8(9), 119–130. https://doi.org/10.5539/elt.v8n9p119
  15. Kamariah, Y., Noor Hasniza, H., Faizah, M. N., & Muthusamy, P. (2016). Academic vocabulary knowledge among Malaysian ESL undergraduates. Advances in Language and Literary Studies, 7(4), 1–8. https://doi.org/10.7575/aiac.alls.v.7n.4p.1
  16. Karafkan, M. A., & Ansarin, A. A. (2022). Depth and breadth of vocabulary knowledge as predictors of narrative, descriptive, and argumentative writing. Journal of Modern Research in English Language Studies. https://doi.org/10.30479/jmrels.2020.14268.1757
  17. Kaur, M., Ganapathy, M., & Yunus, M. M. (2020). The influence of vocabulary knowledge on writing performance: ESL learners in a Malaysian public university. Arab World English Journal, 11(3), 489–502. https://doi.org/10.24093/awej/vol11no3.31
  18. Lateh, N., Shamsudin, S., & Raof-Abdul Halim, A. (2018). Receptive vocabulary levels of Malaysian university students. Arab World English Journal, 9(3), 338–351. https://www.researchgate.net/publication/328884894
  19. Lateh, N., Ramli, S., Hassan, R. A., & Salleh, N. M. (2024). Receptive vocabulary size and English language proficiency of Malaysian undergraduates. International Journal of Academic Research in Business and Social Sciences, 14(3), 1594–1610. https://doi.org/10.6007/IJARBSS/v14-i3/17771
  20. Laufer, B., & Nation, P. (1999). A vocabulary-size test of controlled productive ability. Language Testing, 16(1), 33–51. https://doi.org/10.1177/026553229901600103
  21. Laufer, B., & Goldstein, Z. (2004). Testing vocabulary knowledge: Size, strength, and computer adaptiveness. Language Learning, 54(3), 399–436. https://doi.org/10.1111/j.0023-8333.2004.00260.x
  22. Lervåg, A., & Aukrust, V. G. (2010). Vocabulary knowledge is a critical determinant of the difference in reading comprehension growth between first and second language learners. Journal of Child Psychology and Psychiatry, 51(5), 612–620. https://doi.org/10.1111/j.1469-7610.2009.02185.x
  23. Leki, I., & Carson, J. G. (1994). Students’ perceptions of EAP writing instruction and writing needs across the disciplines. TESOL Quarterly, 28(1), 81–101. https://doi.org/10.2307/3587199
  24. Lessard-Clouston, M. (2012). Technical vocabulary use in English-medium disciplinary writing: A native/non-native case study. Linguistics Journal, 6(1).
  25. Linuwih, A. A. (2013). The correlation between receptive and productive vocabulary size and the quality of students’ writing of English Department students of Widya Mandala Catholic University Surabaya [Undergraduate thesis, Widya Mandala Catholic University Surabaya]. Widya Mandala Repository. https://repository.ukwms.ac.id/id/eprint/1882
  26. Liu, D., & Chen, X. (2020). Visual search and reading comprehension in Chinese children: The mediation of word detection skill. Reading and Writing, 33(5), 1163–1182. https://doi.org/10.1007/s11145-019-09996-x
  27. Llach, M. P. A., & Gallego, M. T. (2009). Examining the relationship between receptive vocabulary size and written skills of primary school learners. Atlantis, 129–. http://www.jstor.org/stable/41055350
  28. Meara, P. (1996). The vocabulary knowledge framework. Vocabulary Acquisition Research Group Virtual Library, 5(2), 1–11.
  29. Milton, J., & Fitzpatrick, T. (2017). Dimensions of vocabulary knowledge. Bloomsbury Publishing.
  30. Min, C., & Sukying, A. (2024). Investigating the role of word knowledge components in Chinese L2 writing ability. 3L: Southeast Asian Journal of English Language Studies, 30(2), 15–30. https://doi.org/10.17576/3L-2024-3004-19
  31. Mohd. Said, M. N., Thambirajah, R., Md Yunus, M., Tan, K. H., & Sultan, N. (2021). Creative vocabulary learning strategies in improving ESL descriptive writing among secondary school learners in Malaysia. 3L: Language, Linguistics, Literature, 27(3), 104–117. https://doi.org/10.17576/3L-2021-2703-08
  32. Mukundan, J., & Rezvani Kalajahi, S. A. (2013). Evaluation of Malaysian English language teaching textbooks. International Journal of Education and Literacy Studies, 1(1), 38–46. https://journals.aiac.org.au/index.php/IJELS/article/view/154/1453
  33. Nation, I. S. P. (1990). Teaching and learning vocabulary. Newbury House Publishers.
  34. Nation, I. S. P. (2001). Learning vocabulary in another language. Cambridge University Press. https://doi.org/10.1017/9781009093873
  35. Nation, I. S. P. (2006). How large a vocabulary is needed for reading and listening? Canadian Modern Language Review, 63(1), 59–82. https://doi.org/10.3138/cmlr.63.1.59
  36. Nation, P. (2017). How vocabulary is learned. Indonesian JELT: Indonesian Journal of English Language Teaching, 12(1), 1–14. https://doi.org/10.25170/ijelt.v12i1.1458
  37. O’Loughlin, K. J. (2011). The interpretation and use of proficiency test scores in university selection: How valid and ethical are they? Language Assessment Quarterly, 8(2), 146–160. https://doi.org/10.1080/15434303.2011.564698
  38. Palpanadan, S. T., Anthony, E. M., Ngadiran, N. M., & Zainal, A. (2019). Comparative analysis of writing approaches practised in Malaysian ESL classrooms. Journal of Education and Social Policy, 6(3), 138–142. https://doi.org/10.30845/jesp.v6n3p17
  39. Qian, D. D. (1999). Assessing the roles of depth and breadth of vocabulary knowledge in reading comprehension. Canadian Modern Language Review, 56(2), 282–307. https://doi.org/10.3138/cmlr.56.2.282
  40. Qian, D. D. (2002). Investigating the relationship between vocabulary knowledge and academic reading performance: An assessment perspective. Language Learning, 52(3), 513–536. https://doi.org/10.1111/1467-9922.00193
  41. Quines, Z. M. (2023). Impact of students’ vocabulary level to their reading and writing performance. International Journal of English Language and Linguistics Research, 11(2), 18–32. https://doi.org/10.37745/ijellr.13/vol11n21832
  42. Schmitt, N., Schmitt, D., & Clapham, C. (2001). Developing and exploring the behaviour of two new versions of the Vocabulary Levels Test. Language Testing, 18(1), 55–88. https://doi.org/10.1177/026553220101800103
  43. Schmitt, N., Jiang, X., & Grabe, W. (2011). The percentage of words known in a text and reading comprehension. The Modern Language Journal, 95(1), 26–43. https://doi.org/10.1111/j.1540-4781.2011.01146.x
  44. Schmitt, N. (2014). Size and depth of vocabulary knowledge: What the research shows. Language Learning, 64(4), 913–951.
  45. Stæhr, L. S. (2008). Vocabulary size and the skills of listening, reading and writing. Language Learning Journal, 36(2), 139–152. https://doi.org/10.1080/09571730802389975
  46. Sukying, A. (2023). The role of vocabulary size and depth in predicting postgraduate students’ second language writing performance. LEARN Journal: Language Education and Acquisition Research Network, 16(1), 575–603. https://so04.tci-thaijo.org/index.php/LEARN/article/view/263457
  47. Tan, A. W. L., & Goh, L. H. (2017). Relationship between vocabulary size and reading comprehension levels of Malaysian tertiary students. International Journal of English Language & Translation Studies, 5(4), 149–155. https://www.eltsjournal.org/archive/value5%20issue4/19-5-4-17.pdf
  48. Tong, Y., Hasim, Z., & Abdul Halim, H. (2022). The relationship between L2 vocabulary knowledge and listening proficiency: The mediating effect of vocabulary fluency. Journal of Language and Linguistic Studies, 18(1), 427–. https://files.eric.ed.gov/fulltext/EJ1326052.pdf
  49. Unsworth, S. (2008). Comparing child L2 development with adult L2 development: How to measure L2 proficiency. In E. Gavruseva & B. Haznedar (Eds.), Current trends in child second language acquisition (pp. 301–336). John Benjamins. https://doi.org/10.1075/lald.46.15uns
  50. Webb, S., & Nation, P. (2017). How vocabulary is learned. Oxford University Press.
  51. Yang, Y., Sun, Y., Chang, P., & Li, Y. (2019). Exploring the relationship between language aptitude, vocabulary size, and EFL graduate students’ L2 writing performance. TESOL Quarterly, 53(3), 845–856. http://www.jstor.org/stable/45214958
  52. Zhang, D. (2012). Vocabulary and grammar knowledge in second language reading comprehension: A structural equation modelling study. The Modern Language Journal, 96(4), 558–575. https://doi.org/10.1111/j.1540-4781.2012.01398.x
  53. Zhang, T., Yi, W., & Hu, L. (2022). A study on the correlation between vocabulary breadth and depth and English majors’ writing proficiency. International Journal of Frontiers in Sociology, 4(12), 66–71. https://doi.org/10.25236/IJFS.2022.041212
