English Placement Test: Baseline Results of New Intake Students
Faridah Abdul Malik, Lily Azlina Ahmad, Nellia Lizrina Salleh
Centre for Languages and Pre-University Academic Development, International Islamic University Malaysia
DOI: https://dx.doi.org/10.47772/IJRISS.2025.905000267
Received: 29 April 2025; Accepted: 06 May 2025; Published: 10 June 2025
ABSTRACT
An English placement test (EPT) is an essential component of any foundation programme. It places students at a suitable language proficiency level so that they do not spend time on materials below or above their level, and it helps curriculum developers and teachers prepare teaching materials for students of similar levels. This study examined differences in performance between students from the various course language groups of the International Islamic University Malaysia (IIUM) based on English Placement Test (EPT) results from 2019 to 2023. The Writing and Reading components of the EPT were analysed for 5729 new undergraduate students. The correlation between the two components was then investigated, adjusting for the sex of the students involved. A two-way ANOVA was conducted to determine the mean difference in writing and reading bands according to group, and an ANCOVA was then conducted to determine the correlation between the two components. Overall, the English Major group showed a 0.687 mean difference in band compared to the English Medium group, while the Non-English Major group showed a -0.3 mean difference in comparison. A positive correlation was found, with an increase of 0.625 in reading band observed for every one-band increase in writing band. Furthermore, males generally performed slightly lower than females, but their performance improved as their scores increased. In conclusion, this study serves as a foundational baseline for further studies on EPT results in IIUM.
Keywords: Baseline study, English language proficiency, Reading, Writing, EPT
INTRODUCTION
English plays an important role in academic success, particularly in higher education, where it is often the medium of instruction (Bo et al., 2023; Rose et al., 2019). Proficiency in English enables students to understand academic texts, engage in discussions, and complete assignments that meet international standards. In recent years, the use of English as a medium of instruction has rapidly increased worldwide, especially in Asia (Fenton-Smith et al., 2017). Countries like Japan, Korea, and Taiwan have seen this growth in both private and public universities. Similarly, in Malaysia, while Malay is the official medium of instruction, many universities now offer courses in English.
However, students often face challenges in mastering English due to factors like first-language interference, lack of interest, and the perception that learning English is unnecessary, especially in rural areas (Kumaran & Krish, 2021; Renganathan, 2021). These issues can limit opportunities for higher education and careers, as poor English proficiency is often linked to graduate unemployment (Monogaran & Subramaniam, 2023; Sundram, 2024). A review of Malaysian universities revealed that 23 public and private institutions offer some courses in English. This growing reliance on English underscores the importance of standardised proficiency tests for ensuring quality. These tests serve multiple purposes, such as facilitating university admissions and identifying students’ language support needs, helping to better prepare them for academic and professional success.
The Centre for Languages and Pre-University Academic Development (CELPAD) has, since the establishment of IIUM, administered the English Placement Test (EPT) to all new students intending to pursue their studies at IIUM. Before embarking on their studies, students must fulfil the language requirement set by their respective Kulliyyahs (faculties). Courses can be grouped according to their required passing band, where both the overall band and the writing band are taken into consideration. The lowest passing band applies to Non-English Major courses, which are taught in Arabic or Malay. This is followed by English Medium courses, which make up the majority of offerings and are taught in English. The highest passing band is reserved for English Major courses, taken by students pursuing degree programmes in English Language and Literature or Teaching of English as a Second Language (TESL).
Research Problem
The English Placement Test (EPT) at IIUM serves as a diagnostic tool for assessing the English language proficiency of newly enrolled undergraduate students. These students are categorized into three distinct academic groups based on their educational programmes: English Medium, English Major, and Non-English Major. Each group reflects unique language-learning backgrounds and academic demands. However, despite this categorization, little is known about how the students’ overall, writing, and reading band scores differ across these groups. Such knowledge is crucial for understanding whether the current categorization aligns with the actual language needs and abilities of the students.
Furthermore, writing and reading skills are interconnected in language learning, and the strength of their correlation can provide insights into the instructional focus required for each group. For example, a strong correlation might indicate a need for integrated skills training, while a weaker correlation could suggest differentiated approaches to teaching these skills.
This study, therefore, seeks to address two critical gaps:
- Understanding the differences in overall, writing, and reading band scores among the three groups with respect to gender.
- Exploring the correlation between writing and reading scores within each group.
The findings from this study have the potential to inform curriculum design, placement criteria, and targeted instructional strategies for students across different academic programmes. By addressing these gaps, the study can contribute to optimizing English language instruction at IIUM and ensuring that the students’ diverse linguistic and academic needs are met effectively.
LITERATURE REVIEW
Standardised Test
Standardised proficiency tests are an integral and widely recognised way to assess a student’s proficiency (Kadwa & Alshenqeeti, 2020; Krokhmal, 2023; Sheerah & Yadav, 2022). Common tests include the International English Language Testing System (IELTS), the Test of English as a Foreign Language (TOEFL), and the Malaysian University English Test (MUET). Standardised tests also serve as baseline assessments, which are crucial for measuring skills and tracking changes in educational contexts. However, despite administering language proficiency tests to various cohorts over the years, no comprehensive baseline studies have been conducted at IIUM to monitor language learning progression. This lack of baseline data makes it challenging to identify trends, adjust instructional strategies, or accurately assess the effectiveness of interventions. As Kyriakides and Campbell (1999) noted, baseline assessments serve multiple purposes: identifying learning needs, providing summative evaluations, pinpointing students requiring additional support, and recording initial progress. This is particularly important in language education, where proficiency can evolve due to changes in educational practices, student demographics and external factors. By establishing foundational data through regular baseline assessments, educators at IIUM could more effectively tailor their language programmes to both group and individual student needs, measure the impact of their interventions, and ensure that language programmes remain effective and responsive to current needs. This approach would enable data-driven decision-making to enhance educational outcomes in language learning at the institution. However, critics have raised concerns about the predictive validity of placement tests (Johnson & Tweedie, 2021; Liao, 2022), stakeholders’ knowledge and views regarding these tests (Ali & Hamid, 2020; Rezaeian et al., 2020), and the use of standardised test scores for placement into language programmes (Hille & Cho, 2020; Dang & Dang, 2023). Given the significant role of assessments in guiding decisions for both organisations and individuals, it is crucial to address these concerns to establish a robust and valid assessment system.
The EPT focuses on assessing students’ reading and writing skills, given their essential role in achieving academic success. In a study by Kerstjens and Nevy (2000), which involved 113 international students at an Australian university taking the IELTS, a significant correlation was found between reading and writing scores and initial GPA results. In contrast, listening and speaking scores showed no predictive value for academic performance. Similarly, Baharum et al. (2021) investigated the predictive validity of the MUET examination among second-year teacher trainees in a B.Ed. TESL program at a Malaysian public university. They found that the ‘Reading’ and ‘Writing’ components of the MUET closely align with typical higher education tasks, such as expository reading and analytical writing. This alignment underscores the effectiveness of MUET scores in ‘Reading’ and ‘Writing’ as predictors of academic success in both undergraduate and postgraduate studies. These findings highlight the importance of reading and writing proficiency assessments, such as the IELTS and MUET, in predicting academic success across diverse educational contexts.
As such, EPT is administered to determine the language proficiency level of the students. The EPT specifically assesses two skills: writing and reading. The writing component consists of two questions: report writing, which focuses on analysing graphs, and essay writing, focusing on opinion essays. The reading component includes 40 multiple-choice questions based on four reading passages, assessing different reading skills. Those who do not meet the passing standards are placed in an English course that is relevant to their level of proficiency.
Correlation Between Reading and Writing
Reading and writing are important skills that students need to employ at the tertiary level. These two language skills can help them become independent learners, contributing to their success in academic and professional contexts (Jerrim & Moss, 2019). Reading skills refer to the ability to understand, interpret, and critically evaluate written texts, involving a range of cognitive processes and abilities. Many studies have demonstrated the association between frequency of reading and higher attainment in academic contexts (Benwari & Nemine, 2014; Hernandez, 2011; Sikora et al., 2018).
Likewise, writing is another important skill, referring to the ability to communicate ideas, information, and arguments effectively through the written word. It also includes other important components such as syntax, conciseness and coherence (Randaccio, 2013). Without a doubt, writing is an important skill for tertiary students, for whom the demands of completing assignments and projects can be challenging. Inadequate writing skills can therefore prevent students from performing well academically (Bram & Angelina, 2022). Even though reading and writing are each considered individual skills, research has shown that they are more effective when taught in an integrated manner (Allen et al., 2014). Writing, in particular, has been found to be enhanced by reading skills (Grabe & Zhang, 2013). A number of studies have shown a correlation between reading and writing performance, where students with good reading proficiency performed well in their writing assessments (Hsieh, 2024; Lee & Schallert, 2016; Moon et al., 2019).
Choi et al. (2018) examined the relationship between reading comprehension and writing performance among 146 advanced Korean EFL college seniors preparing for graduate studies abroad. Using cloze tests, multiple-choice reading assessments, and vocabulary tests, they found significant correlations between reading comprehension, writing ability, and vocabulary knowledge. Structural equation modelling (SEM) analysis revealed that reading comprehension positively influenced writing competence, with vocabulary knowledge primarily affecting reading comprehension indirectly rather than directly influencing writing performance. Guo and Yan (2017) analysed 865 essays from the HSK Dynamic Composition Corpus to explore how Chinese essay achievements correlate with reading performance among Chinese as a Second Language (CSL) learners. Using Pearson correlation analysis, they categorized participants into four proficiency groups (Groups A-D). The study revealed a statistically significant moderate positive correlation (Pearson’s r = 0.432, p < 0.05) between writing and reading scores. However, this correlation’s strength and significance varied across proficiency levels, underscoring the complex relationship between reading and writing skills in second language acquisition.
Hsieh (2024) found a correlation between reading and writing skills among young EFL learners. The results showed that students with stronger reading proficiency also demonstrated better writing performance across various measures, such as lexical sophistication, syntactic complexity, writing fluency, and idea development. This indicates that enhancing reading skills can support the development of writing skills in EFL learners. El-Koumy examined the relationship between reading and writing skills in native English speakers (NES) and English as a foreign language (EFL) learners. The subjects were 150 NES English majors from U.S. universities and 150 EFL students from four Egyptian universities. Standardized reading and writing tests were used. Results showed a significant positive correlation between reading and writing scores for NES students, but not for EFL learners. The difference may be due to teaching methods, language proficiency, or language use outside class.
Gender Difference in English Proficiency
Walczak and Geranpayeh (2015) examined gender differences in English language proficiency among adolescents preparing for higher education. Their findings revealed that females generally performed better than males across all language skills, with more pronounced differences in writing and speaking. However, the performance gap was relatively small, typically less than half a band. Results varied between countries, with females outperforming males in some regions and the reverse observed in others. The authors attributed these variations to socio-cultural influences and differences in learning behaviours. Similarly, Reilly et al. (2019) analysed three decades of U.S. student achievement data and found consistent gender differences in literacy skills, with females outperforming males in both reading and writing. The gap widened with age, and by Grade 12, effect sizes were moderate (d = -0.32 for reading; d = -0.55 for writing). These results suggest that gender disparities in literacy skills are persistent and significant. Lasekan (2018) also explored gender differences in English proficiency and found that females’ stronger performance could be partly explained by their more positive attitudes toward learning English and higher levels of self-confidence. This highlights the importance of individual attitudes and motivation in shaping language outcomes.
Building on these findings, this study reinforces the idea that gender differences in English proficiency are shaped by both individual and contextual factors. While females consistently outperform males in reading and writing, males demonstrate notable improvements at higher proficiency levels. This suggests that targeted interventions tailored to gender-specific learning needs could be beneficial in addressing these disparities. Future efforts to reduce gender gaps should account for socio-cultural influences and individual learning trajectories.
METHODOLOGY
A retrospective cohort study was conducted on the new intake of undergraduate students at IIUM who underwent the English Placement Test (EPT) as part of their matriculation process. This study focused on new undergraduate students from 2019 to 2023 who were matriculated through direct entry. The assessments were conducted during the first week of orientation in an exam hall setting. A two-and-a-half-hour examination covering both the reading and writing components was conducted, with the exception of the 2020/2021 session, when an online assessment was administered as a precaution during the COVID-19 pandemic. Students were then separated into three language groups according to their course requirements: ‘English Medium’, ‘English Major’ and ‘Non-English Major’. The language requirements vary across the groups, with the highest standard set for the English Major group, followed by the English Medium group, and the Non-English Major group having the lowest requirement.
Exclusion Criteria
Students were excluded from the study if they had incomplete data for any individual component of the EPT or if they were enrolled as postgraduate or pre-university students.
Baseline Analysis
To analyse baseline results, a two-way ANOVA was performed to examine potential statistical differences for reading, writing, and overall band scores. The analysis included groupings based on language course groups, with sex included as a covariate. Following the ANOVA, a Tukey HSD test was conducted to identify significant differences between pairs of group means.
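For illustration, the baseline analysis described above could be reproduced in R (the software used for all analyses in this study) along the following lines. This is only a minimal sketch: the data frame `ept` and its column names (`writing_band`, `course_group`, `sex`) are hypothetical placeholders, not the authors’ actual variable names.

```r
# Minimal sketch of the baseline analysis (hypothetical data frame and column names).
ept$course_group <- factor(ept$course_group)  # English Medium / English Major / Non-English Major
ept$sex <- factor(ept$sex)

# Two-way ANOVA: course group, sex, and their interaction, fitted separately per band
fit_writing <- aov(writing_band ~ course_group * sex, data = ept)
summary(fit_writing)

# Tukey HSD post-hoc test for pairwise differences between course-group means
TukeyHSD(fit_writing, which = "course_group")
```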
Correlation Between Reading and Writing Bands
To investigate the relationship between writing and reading bands, an ANCOVA was conducted. The analysis assessed the correlation between the two scores while accounting for language course groups, with sex added as a covariate.
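As a sketch, the ANCOVA specification described above (consistent with the effect estimates later reported in Table II) could be fitted in R as follows; the data frame `ept` and its column names are again hypothetical placeholders.

```r
# Minimal ANCOVA sketch: reading band modelled on writing band (continuous covariate),
# course group and sex, including the writing-band interactions.
fit_ancova <- lm(reading_band ~ writing_band * course_group + writing_band * sex,
                 data = ept)
summary(fit_ancova)  # effect estimates, standard errors, t- and p-values
anova(fit_ancova)    # ANCOVA table
```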
Statistical Assumptions
All statistical analyses were conducted after verifying the necessary assumptions for the tests. For the two-way ANOVA and ANCOVA, the assumptions of normality, homogeneity of variance, and independence were assessed. Normality of residuals was evaluated using Q-Q plots, which indicated a positive skew in the residuals, suggesting a slight deviation from normality. Homogeneity of variance was assessed using Levene’s test, which revealed a significant violation (p < 0.05), indicating unequal variances across the groups.
Although these assumption violations were observed, the spread of residuals in the Residuals vs. Fitted plot did not show any major violations, with no clear funnelling or non-linear patterns. Outliers were present, but with a large sample size (n = 5729), these are unlikely to have a substantial impact on the results. Therefore, despite the assumption violations, the analyses remain robust due to the large sample size and the general pattern of residuals, which align with the expectations of the Central Limit Theorem. However, further investigation into the outliers and the use of robust methods could be considered in future analyses. All analyses were done using R.
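A minimal sketch of these assumption checks in R, assuming the hypothetical `ept` data frame and the fitted model from the earlier sketch, might look like the following (Levene’s test requires the `car` package):

```r
library(car)  # provides leveneTest()

# Normality of residuals: Q-Q plot
qqnorm(residuals(fit_writing))
qqline(residuals(fit_writing))

# Homogeneity of variance across group-by-sex cells: Levene's test
leveneTest(writing_band ~ course_group * sex, data = ept)

# Residuals vs fitted values: look for funnelling or non-linear patterns
plot(fitted(fit_writing), residuals(fit_writing),
     xlab = "Fitted values", ylab = "Residuals")
abline(h = 0, lty = 2)
```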
RESULTS
A total of 5729 new undergraduate students at IIUM were analysed from 2019 to 2023. Students came from 92 courses across 15 different faculties in IIUM and were separated into three language groups: ‘English Medium’, ‘English Major’ and ‘Non-English Major’. The Non-English Major group consisted of students undertaking courses that require either Arabic or Malay, such as Arabic and Malay Communication. English Major courses were predominantly English courses such as English Communication and TESL, apart from Law, which was grouped as an English Major course due to its stringent requirements for English subjects prior to admission and its better-than-average results, as observed in Fig. 1. All other courses were grouped as ‘English Medium’ courses due to their lower requirements for passing the EPT compared to English Major courses. The cohort comprised 3706 females and 2023 males, with the breakdown by language group shown in Supplementary Table I.
Mean Difference Between Group Bands
Fig. 1 Median and mean of overall EPT band (A), writing band (B) and reading band (C) according to each faculty where faculties with multiple course language groups are separated. Mean is shown by the red diamond. 0 = English Medium , 1 = English Major , 2 = Non-English Major.
Faculties in the English Major group showed higher median and mean bands than other faculties, apart from KOD, KOM and KAHS. No clear distinction could be observed between English Medium and Non-English Major courses. The CFL faculty shows the widest range across all three panels.
Students were then analysed according to their language group so that the Writing, Reading and Overall results could be compared between groups. This was done using a two-way ANOVA with sex added as a covariate. Supplementary Fig. 1 illustrates the results for each group, and the statistical analyses are summarised in Table I.
Table I Mean Difference and Pairwise Comparisons between Each Language Group for Writing, Reading and Overall Bands
Category | Degrees of Freedom | Sum of Squares | Mean Square | F-value | p-value |
Writing | |||||
Course Group | 2 | 420 | 208.82 | 175.588 | <0.001*** |
Males | 1 | 49 | 49.27 | 41.235 | <0.001*** |
Group: Males | 2 | 6 | 2.82 | 2.356 | 0.0948 |
Residuals | 5723 | 6839 | 1.19 | ||
Comparison | Mean Difference | 95% CI | p-value | ||
English Major – English Medium | 0.694 | [0.590,0.799] | <0.001*** | ||
Non-English Major – English Medium | -0.241 | [-0.323,-0.160] | <0.001*** | ||
Non-English Major – English Major | -0.936 | [-1.054,-0.818] | <0.001*** | ||
Male – Female | -0.193 | [-0.252,-0.134] | <0.001*** | ||
Reading | |||||
Course Group | 2 | 578 | 289.24 | 173.609 | <0.001*** |
Males | 1 | 8 | 8.12 | 4.875 | 0.027* |
Group: Males | 2 | 16 | 7.84 | 4.705 | 0.009** |
Residuals | 5723 | 9535 | 1.67 | ||
Comparison | Mean Difference | 95% CI | p-value | ||
English Major – English Medium | 0.69 | [0.56,0.81] | <0.001*** | ||
Non-English Major – English Medium | -0.42 | [-0.52,-0.33] | <0.001*** | ||
Non-English Major – English Major | -1.11 | [-1.25,-0.97] | <0.001*** | ||
Male – Female | -0.08 | [-0.15,-0.01] | 0.028 | ||
Overall | |||||
Course Group | 2 | 491 | 245.50 | 208.106 | <0.001*** |
Males | 1 | 28 | 27.63 | 23.419 | <0.001*** |
Group:Males | 2 | 10 | 5.05 | 4.282 | 0.014* |
Residuals | 5723 | 6751 | 1.18 | ||
Comparison | Mean Difference | 95% CI | p-value | ||
English Major – English Medium | 0.687 | [0.584,0.791] | <0.001*** | ||
Non-English Major – English Medium | -0.336 | [-0.417,-0.255] | <0.001*** | ||
Non-English Major – English Major | -1.023 | [-1.141,-0.906] | <0.001*** | ||
Male – Female | -0.145 | [-0.203,-0.086] | <0.001*** |
For all components, course group and sex were significant (p < 0.001), while the interaction between course group and sex was only significant for the Reading and Overall bands. For Writing, the English Major group showed a 0.694 mean difference in band compared to the English Medium group, which widened to 0.936 when compared to the Non-English Major group. Males also scored lower than females, with a mean band decrease of 0.193.
For Reading, it was observed that the English Major group had a 0.69 mean difference in band compared to the English Medium group. In contrast, the Non-English Major group showed a -0.42 mean difference when compared to the English Medium group, and this difference was further amplified to -1.11 when compared to the English Major group. Males were shown to have a band decrease of 0.08 compared to females, which was statistically significant (p = 0.028). The interaction between course group and males was also significant (p = 0.009), indicating that the effect of course group on reading bands varied by sex.
For Overall, the English Major group had a 0.687 mean difference compared to the English Medium group, while the Non-English Major group showed a -0.336 mean difference relative to the English Medium group. When compared to the English Major group, the Non-English Major group exhibited a further decrease, with a -1.023 mean difference. Males demonstrated a band decrease of 0.145 compared to females (p < 0.001). Similar to Reading, the interaction between course group and sex was significant (p = 0.014), suggesting that overall band scores were influenced differently by course group depending on sex.
Correlation Between Reading and Writing
We then analysed the correlation between the Writing and Reading components using ANCOVA, which allowed sex and course group to be added as covariates. Fig. 2 illustrates the ANCOVA analysis with a density plot showing the most common bands for students, broken down by course group. The most common result is band 6 for Writing and band 8 for Reading, followed by band 5 for Writing and band 8 for Reading, and band 6 for Writing and band 7 for Reading (Supplementary Figure 2). The English Major group is concentrated in the upper right corner of the plot.
Fig. 2 Density plot of writing bands by reading bands according to course group
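A plot in the spirit of Fig. 2 and Supplementary Fig. 2 could be produced in R with ggplot2. The sketch below counts students per writing-by-reading band combination and facets by course group, using the hypothetical `ept` data frame from the Methodology sketches.

```r
library(ggplot2)

ggplot(ept, aes(x = writing_band, y = reading_band)) +
  geom_bin2d(binwidth = c(0.5, 0.5)) +   # counts per band combination (bands move in 0.5 steps)
  facet_wrap(~ course_group) +
  labs(x = "Writing band", y = "Reading band", fill = "Count")
```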
Overall, the ANCOVA model (Table II) was significant (p<0.001) and explained 38.1% of the variance in reading bands. Main effects of writing band, course group and sex were all significant. Writing band had a significant positive effect on the reading band (β=0.625, p<0.001), showing that higher writing bands are associated with higher reading bands. Course Group: English Major had a significant positive effect (β=1.464, p<0.001), suggesting that the English Major group had higher reading band scores than the English Medium group. Course Group: Non-English Major showed a negative effect (β=−0.311, p=0.035), indicating that this group had lower reading band scores than the English Medium group. Sex (Male) had a significant negative effect (β=−0.654, p<0.001), with males having lower reading band scores than females. The interaction between Writing Band and English Major was significant (β=−0.221, p<0.001), indicating that the relationship between writing and reading bands was weaker for the English Major group than for the English Medium group. The interaction between Writing Band and Sex was also significant (β=0.146, p<0.001), suggesting that the effect of writing band on reading band varied by sex. However, the interaction between Writing Band and Non-English Major was not significant (p=0.737).
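To make the model concrete, the effect estimates in Table II can be combined into a predicted reading band. The arithmetic below is only an illustration and assumes that ‘English Medium’ and ‘female’ are the reference levels, as implied by the reported contrasts.

```r
# Predicted reading band for a female English Medium student with writing band 6
3.71057 + 0.62507 * 6
# ≈ 7.46

# Predicted reading band for a male English Major student with writing band 6:
# add the group and sex offsets, and adjust the slope by the two interactions
(3.71057 + 1.46384 - 0.65439) + (0.62507 - 0.22145 + 0.14623) * 6
# ≈ 7.82
```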
DISCUSSION
This study sought to examine the differences in language proficiency scores, specifically writing and reading, among three groups of new undergraduate students at IIUM: English Medium, English Major, and Non-English Major. Additionally, it explored the correlation between students’ writing and reading proficiency. Through a robust statistical analysis, including a two-way ANOVA and ANCOVA, several key findings emerged that offer significant insights into the effectiveness of the English Placement Test (EPT) and its role in language program placement and academic success.
When the courses were first broken down by Kulliyyah, the median and mean bands of Kulliyyahs within the English Major group, such as AIKOL (Law), showed a higher level of proficiency compared to their peers in other groups for the Reading, Writing and Overall bands. Students from the Kulliyyah of Medicine (KOM), Kulliyyah of Dentistry (KOD) and Kulliyyah of Allied Health Sciences (KAHS) displayed scores on par with the English Major group. This may be due to the stringent requirements traditionally set for the courses offered by these Kulliyyahs at both SPM and pre-university level, where a high score in English is expected as a screening tool. However, due to their smaller sample sizes and because these courses, unlike Law, do not require higher proficiency in English, they were not included in the English Major group.
Differences in Language Proficiency Across Groups
The analysis revealed distinct differences in proficiency levels across the three language groups. The English Major group consistently exhibited higher mean scores in the writing, reading and overall bands compared to the other two groups: it showed a 0.687 mean difference in overall band compared to the English Medium group, and this gap increased to roughly one full band when compared to the Non-English Major group.
This aligns with the expectation that students in this group, whose courses require higher proficiency in English, would demonstrate superior language skills. The Non-English Major group displayed significantly lower scores than the English Medium group, with an overall band difference of -0.336. However, although this difference is statistically significant, it is practically minor, as EPT bands are graded in increments of 0.5. A similarly minor difference was observed between males and females, with a -0.145 difference in overall band. Thus, although the Non-English Major group and male students scored significantly lower, the differences are minimal.
TABLE II Effect Estimate of Correlation between Reading and Writing bands by Course Group and Sex
Variable | Effect Estimate (β) | Standard Error | t-Value | p-Value |
Intercept | 3.71057 | 0.09996 | 37.120 | <0.001*** |
Writing Band | 0.62507 | 0.01977 | 31.622 | <0.001*** |
Course Group: English Major | 1.46384 | 0.29504 | 4.962 | <0.001*** |
Course Group: Non-English Major | -0.31077 | 0.14696 | -2.115 | 0.034* |
Sex: Male | -0.65439 | 0.12473 | -5.246 | <0.001*** |
Writing Band: English Major | -0.22145 | 0.05295 | -4.182 | <0.001*** |
Writing Band: Non-English Major | 0.01022 | 0.03047 | 0.336 | 0.737 |
Writing Band: Males | 0.14623 | 0.02495 | 5.861 | <0.001*** |
The minor difference between the Non-English Major and English Medium groups is notable, as it was smaller than the 0.5 difference between the passing bands of the two course groups. This could reflect the nature of the course requirements, where English is used as a medium of communication rather than as a critical component of the course, as it is for the English Major group. It could also reflect the English proficiency of the student population, as Non-English Major students achieve results almost on par with English Medium students.
Correlation Between Reading and Writing Proficiency
The study also aimed to examine the relationship between writing and reading proficiency, as these two skills are highly interconnected in language learning (Grabe, 2003). The density plot (Supplementary Fig. 2) showed that more than one fifth of the students analysed achieved band 6 for Writing and band 8 for Reading. Our findings indicate a positive correlation between writing and reading scores in all three groups, though the strength of the correlation varied. Overall, a 0.625 increase in reading band was observed for every one-band increase in writing band. This holds for both the Non-English Major and English Medium groups. However, a 0.221 decrease in the effect estimate was observed for the English Major group. This is because the majority of English Major students had higher scores for both reading and writing, reducing the effect estimate compared to the Non-English Major and English Medium groups. Sex was also a significant covariate, with males scoring significantly lower than females. However, an increase in the effect estimate was observed for males, indicating that reading scores rose more steeply with writing scores for males than for females.
The higher scores of the English Major group were expected, as the group generally outperformed its peers in the previous analysis. This is also visible in the density plot, where most individuals in the group sit in the upper right corner, reflecting better results and leading to a weaker effect estimate compared to the other two groups. The English Medium and Non-English Major groups displayed a similar effect of writing band on reading results. This may be the result of an almost similar distribution of students, with a slight decrease in results compared to the English Medium group, as observed in the previous analysis and in the -0.311 difference in intercept relative to the English Medium group. Overall, this is consistent with other research that highlights the mutual reinforcement between reading and writing (Choi et al., 2018; Hsieh, 2024). Students who performed better in reading tended to have better writing abilities, which reinforces the argument for an integrated approach to teaching these skills.
Implications for Curriculum Design and Instructional Strategies
The findings of this study provide a foundational baseline for understanding English Placement Test (EPT) results at IIUM, representing the first analysis of its kind at the institution. By systematically evaluating language proficiency scores in writing and reading across English Major, English Medium, and Non-English Major groups, this study establishes a reference framework for future assessments and comparisons. The insights derived from this study not only enhance the understanding of students’ language proficiency levels but also establish a robust foundation for assessing the efficacy of existing placement mechanisms and curriculum design strategies at IIUM.
A key takeaway from the findings is the validation of EPT in fostering the interconnectedness of language skills. The observed strong positive correlation between writing and reading proficiency aligns with findings from other studies (e.g., Choi et al., 2018; Hsieh, 2024), suggesting that IIUM’s integrated approach to language instruction is effective. The English Major group, in particular, exhibited consistently higher scores, reinforcing the idea that curricula emphasizing English proficiency contribute to tangible skill development. Similarly, the performance of English Medium and Non-English Major groups, while varying in degree, underscores that the EPT accurately reflects the language proficiency needed for academic success.
The overlapping performance of the English Medium and Non-English Major groups suggests that the classification of these groups and their respective passing bands may need to be re-evaluated, as minimal differences were observed between them. This overlap indicates that these two groups could potentially be combined into a single category, simplifying the placement process and ensuring more consistent language support. Treating these groups as a unified category could streamline instructional strategies and better allocate resources, while still addressing the shared language needs of students in these groups. Such an approach could also provide a more cohesive framework for curriculum design and language proficiency assessment at IIUM.
The findings of this study align with previous research demonstrating a gender gap in English proficiency, with females generally outperforming males, particularly in writing and reading (Walczak & Geranpayeh, 2015). Similarly, an analysis of three decades of U.S. student data found consistent patterns of females outperforming males in reading and writing, with the gap widening over time, especially in writing (Reilly et al., 2019). These findings suggest that gender disparities in literacy are both persistent and significant. Lasekan (2018) further highlighted that females’ stronger performance in English could be linked to their positive attitudes toward learning and greater self-confidence. In line with these studies, this research observed that while males achieved lower overall scores, their performance improved as their scores increased, supporting the idea that gender-specific interventions, tailored to both socio-cultural contexts and individual learning approaches, might be effective in bridging proficiency gaps.
Furthermore, these findings highlight the curriculum’s role in driving academic improvement. The correlation between writing and reading proficiency shows how these skills support each other. For groups like Non-English Majors, focusing more on writing through workshops or tailored courses could help improve their performance, bringing them closer to their peers. By presenting these findings, this study establishes a foundation for ongoing curriculum evaluation and enhancement. It emphasizes the importance of using baseline data to refine teaching strategies and placement mechanisms, ensuring that IIUM students are equipped with the language proficiency needed for academic and professional success. Additionally, the validation of the curriculum through this study provides evidence that the EPT and related language instruction methodologies are not only relevant but effective in meeting the diverse needs of IIUM’s student body.
Limitations and Future Research
This study has several limitations. First, it is retrospective in nature, relying on existing data, which limits the ability to control for confounding variables such as students’ prior exposure to English, the quality of instruction, and external factors that may have affected their performance on the EPT. Additionally, while the study covers a large sample size, future research could explore more granular data, such as how specific courses within each language group influence student performance. The limited demographic data available also restricts the scope of the investigation. We were also unable to analyse student performance by Kulliyyah due to the uneven spread and sample sizes.
Strengths of this study include its large sample size, which enhances the generalisability of the findings. Additionally, the retrospective design allows for a broad view of trends and patterns, offering valuable insights into how each component could impact the other.
Further studies could also investigate the effectiveness of post-EPT interventions and whether the tailored English support offered to students is successful in improving their academic performance. Longitudinal studies would be particularly useful in tracking language proficiency progression over time and assessing the long-term impact of initial language placements.
CONCLUSION
Overall, the English Major group demonstrated higher proficiency across the overall, writing, and reading bands compared to the other two groups. Although the Non-English Major group exhibited significantly lower scores than the English Medium group, the difference was minimal. The observed positive correlation between writing and reading proficiency underscores the interdependence of these skills, suggesting that an integrated instructional approach may enhance outcomes, particularly for students in the Non-English Major and English Medium groups. Furthermore, while males generally performed slightly lower than females, their performance improved as their scores increased.
This study also validates the effectiveness of the EPT in assessing English proficiency, as its results are consistent with findings from other studies in the literature, which highlight the interconnectedness of writing and reading skills. The observed correlations between proficiency scores across the groups align with previous research, reinforcing the EPT’s relevance and accuracy in measuring language skills. Furthermore, the findings reflect current practices at IIUM and can serve as a reference for improving language support systems and program designs. This research also establishes a foundation for future studies examining the long-term impact of language proficiency on academic success.
REFERENCES
- Allen, L. K., Crossley, S. A., Snow, E. L. & McNamara, D. S. (2014). L2 Writing Practice: Game Enjoyment as a Key to Engagement. Language Learning & Technology, 18(2), 124–150. http://dx.doi.org/10125/44373
- Ali, M. M., & Hamid, M. O. (2020). Teaching English to the test: why does negative washback exist within secondary education in Bangladesh? Language Assessment Quarterly, 17(2), 129-146. https://doi.org/10.1080/15434303.2020.1717495
- Benwari, N. N., & Nemine, E. B. B. (2014). Intensive Reading as a Study Habit and Students’ Academic Achievement in Economics in Selected Secondary Schools in Bayelsa State, Nigeria. Journal of Curriculum and Teaching, 3(2), 94-99. https://doi.org/10.5430/jct.v3n2p94
- Bo, W. V., Fu, M., & Lim, W. Y. (2023). Revisiting English language proficiency and its impact on the academic performance of domestic university students in Singapore. Language Testing, 40(1), 133-152. https://doi.org/10.1177/02655322211064629
- Bram, B., & Angelina, P. (2022). Indonesian Tertiary Education Students’ Academic Writing Setbacks and Solutions. International Journal of Language Education, 6(3), 267-280. https://doi.org/10.26858/ijole.v6i3.22043
- Choi, J., Moon, Y., Paek, J. K., & Kang, Y. (2018). Examining the relationship between reading and writing of advanced Korean EFL Learners. Korean Journal of Applied Linguistics, 34(1), 91-116. https://doi.org/10.17154/kjal.2018.3.34.1.91
- Dang, C. N., & Dang, T. N. Y. (2023). The predictive validity of the IELTS test and contribution of IELTS preparation courses to international students’ subsequent academic study: Insights from Vietnamese international students in the UK. RELC Journal, 54(1), 84-98. https://doi.org/10.1177/0033688220985533
- Grabe, W., & Zhang, C. (2013). Second language reading-writing relations. Reconnecting reading and writing, 108-133. https://wac.colostate.edu/books/reconnecting/chapter6.pdf/
- Hernandez, D. J. (2011). Double Jeopardy: How Third-Grade Reading Skills and Poverty Influence High School Graduation. The Annie E. Casey Foundation: New York, NY. https://www.readingrockets.org/resources/resource-library/double-jeopardy-how-third-grade-reading-skills-and-poverty-influence
- Hille, K., & Cho, Y. (2020). Placement testing: One test, two tests, three tests? How many tests are sufficient?. Language Testing, 37(3), 453-471. https://doi.org/10.1177/0265532220912412
- Hsieh, C. N. (2024). The Role of Task Types and Reading Proficiency on Young English as a Foreign Language Learners’ Writing Performances. TESOL Quarterly, 58(2), 978-990. https://doi.org/10.1002/tesq.3286
- Jerrim, J., & Moss, G. (2019). The link between fiction and teenagers’ reading skills: International evidence from the OECD PISA study. British Educational Research Journal, 45(1), 181-200. https://doi.org/10.1002/berj.3498
- Johnson, R. C., & Tweedie, M. G. (2021). “IELTS-out/TOEFL-out”: Is the end of general English for academic purposes near? Tertiary student achievement across standardized tests and general EAP. Interchange, 52(1), 101-113. https://doi.org/10.1007/s10780-021-09416-6
- Kadwa, M. S., & Alshenqeeti, H. (2020). The impact of students’ proficiency in english on science courses in a foundation year program. International Journal of Linguistics, Literature and Translation, 3(11), 55-67. https://doi.org/10.32996/ijllt.2020.3.11.5
- Kyriakides, L., & Campbell, R. J. (1999). Primary teachers’ perceptions of baseline assessment in mathematics. Studies in Educational Evaluation, 25(2), 109-130. https://doi.org/10.1016/S0191-491X(04)90002-8
- Krokhmal, A. (2023). Validity of the English Placement Test at University of Illinois Urbana-Champaign (Doctoral dissertation, University of Illinois at Urbana-Champaign). https://hdl.handle.net/2142/120271
- Kumaran, P. N., & Krish, P. (2021). Mother tongue interference in English writing among Tamil school students. Gema Online Journal of Language Studies, 21(1), 110-123. http://doi.org/10.17576/gema-2021-2101-07
- Lee, J., & Schallert, D. L. (2016). Exploring the reading–writing connection: A yearlong classroom‐based experimental study of middle school students developing literacy in a new language. Reading Research Quarterly, 51(2), 143-164. https://doi.org/10.1002/rrq.132
- Liao, Y. F. (2022). Using the English GSAT for placement into EFL classes: accuracy and validity concerns. Language Testing in Asia, 12(1), 31. https://doi.org/10.1186/s40468-022-00181-6
- Monogaran, M., & Subramaniam, T. (2023). Skills acquisition and employability among arts and social sciences interns in a Malaysian public university. Institutions and Economies, 15(23), 59-86. https://doi.org/10.22452/IJIE.vol15no2.3
- Moon, Y., Choi, J., & Kang, Y. (2019). Does reading and vocabulary knowledge of advanced Korean EFL learners facilitate their writing performance?. Journal of Asia TEFL, 16(1), 1-447. https://doi.org/10.18823/asiatefl.2019.16.1.10.149
- Randaccio, M. (2013). Writing skills: theory and practice. http://hdl.handle.net/10077/10093
- Renganathan, S. (2021). English language education in rural schools in Malaysia: a systematic review of research. Educational Review, 75(4), 787–804. https://doi.org/10.1080/00131911.2021.1931041
- Reilly, D., Neumann, D. L., & Andrews, G. (2019). Gender differences in reading and writing achievement: Evidence from the National Assessment of Educational Progress (NAEP). The American psychologist, 74(4), 445–458. https://doi.org/10.1037/amp0000356
- Rezaeian, M., Seyyedrezaei, S. H., Barani, G., & Seyyedrezaei, Z. S. (2020). Construction and Validation of Educational, Social and Psychological Consequences Questionnaires of EPT as a High-Stakes Test. International Journal of Language Testing, 10(2), 33-70. http://www.ijlt.ir/issue_15652_16074.html
- Rose, H., Curle, S., Aizawa, I., & Thompson, G. (2019). What drives success in English medium taught courses? The interplay between language proficiency, academic skills, and motivation. Studies in Higher Education, 45, 1-13. https://doi.org/10.1080/03075079.2019.1590690.
- Sheerah, H., & Yadav, M. (2022). The Use of English Placement Test (EPT) in Assessing the EFL Students’ Language Proficiency Level at a Saudi University. Rupkatha Journal on Interdisciplinary Studies in Humanities. 14(3), 1-8. https://doi.org/10.21659/rupkatha.v14n3.24.
- Sikora, J., Evans, M. D. R., & Kelley, J. (2018). Scholarly culture: How books in adolescence enhance adult literacy, numeracy and technology skills in 31 societies. Social Science Research, 77, 1-15. https://doi.org/10.1016/j.ssresearch.2018.10.003
- Sundram, P. G. P. H., Rangan, P., Sio Ching, H., Baskaran, S., Mohd. Jaya, S. S., & Krishnan, I. A. (2024). Analysing Part-of-Speech Errors in Job Interviews Among Malaysian Fresh Graduates in Malaysia: Implications for Communication Competence. Malaysian Journal of Social Sciences and Humanities (MJSSH), 9(3), e002776. https://doi.org/10.47405/mjssh.v9i3.2776.
- Van Hek, M., Buchmann, C., & Kraaykamp, G. (2019). Educational systems and gender differences in reading: A comparative multilevel analysis. European Sociological Review, 35(2), 169-186. https://doi.org/10.1093/esr/jcy054
- Walczak, A., & Geranpayeh, A. (2015). The Gender Gap in English Language Proficiency. Insights from a Test of Academic English. https://www.cambridgeassessment.org.uk/Images/gender-differences-cambridge-english.pdf
Appendix
Supplementary Table I
Breakdown of Course Language Group According to Sex and Year

Sex | English Medium | English Major | Non-English Major | Total |
Male | 1336 | 165 | 522 | 2023 |
Female | 2327 | 553 | 826 | 3706 |
Year | English Medium | English Major | Non-English Major | Total |
2019 | 683 | 162 | 361 | 1206 |
2020 | 551 | 194 | 482 | 1227 |
2021 | 745 | 155 | 256 | 1156 |
2022 | 779 | 133 | 112 | 1024 |
2023 | 905 | 74 | 137 | 1116 |
Supplementary Figure 1
Supplementary Fig. 1 shows boxplots of the Overall band (A), Writing band (B) and Reading band (C) by language group, further broken down by sex. The red diamond represents the mean.
Supplementary Figure 2
Supplementary Fig. 2 shows a heat density plot of Writing band against Reading band with counts.