Assessing Efficiency in Student Mathematics Performance Using Slack-Based DEA: Integrating Prior Performance, Engagement, and Chapter-Level Results
Noor Hafizah Zainal Aznam*, Shahida Farhan Zakaria, Siti Nur Alwani Salleh, Nor Athirah Mohd Zin
Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA Cawangan Kedah, 08400 Merbok, Kedah, Malaysia
*Corresponding Author
DOI: https://dx.doi.org/10.47772/IJRISS.2025.909000587
Received: 12 September 2025; Accepted: 18 September 2025; Published: 21 October 2025
ABSTRACT
Assessing student performance in mathematics is often limited to raw scores or averages, which overlook differences in how effectively students use their learning opportunities and fail to identify hidden weaknesses across topics. To address this issue, this study applies the Slack-Based Measure (SBM) model of Data Envelopment Analysis (DEA) to evaluate the efficiency of students’ mathematics performance in a higher education setting. Academic efficiency was assessed based on test scores across three chapters of a standardized assessment. Each student was treated as a Decision-Making Unit (DMU), with pre-assessment, class attendance and pre-university mathematics results as inputs, and chapter-specific scores as outputs. An output-oriented SBM approach was employed to examine how effectively students converted instructional inputs into academic outcomes while also identifying chapter-level slacks that reveal hidden inefficiencies. The results highlight that while some students achieve high efficiency, many demonstrate inefficiencies driven by underperformance in specific chapters despite relatively strong input profiles. The SBM model’s ability to capture slack allows for the precise identification of these gaps, making it possible to diagnose individual weaknesses that traditional measures may overlook. Chapter-level analysis further revealed variability in content mastery, emphasizing the importance of topic-specific interventions. Additionally, efficiency patterns varied across student groups, with attendance and GPA emerging as critical factors influencing learning efficiency. These findings provide actionable insights for educators, including the development of personalized tutoring, targeted revisions, and more effective feedback mechanisms tailored to students’ unique needs. By diagnosing individual strengths and weaknesses, the SBM framework supports evidence-based teaching strategies that enhance overall academic achievement. This study contributes to the growing field of educational analytics by demonstrating how operations research methods can be integrated into classroom evaluation, offering a practical framework for improving instructional quality, supporting student success, and guiding institutional benchmarking.
Keywords: Slack-Based Measure (SBM), Student Performance Evaluation, Learning Efficiency, Educational Assessment, Educational Analytics
INTRODUCTION
The evaluation of student learning efficiency has become a foundational concern in contemporary education research, as institutions strive not just to measure performance, but to understand how effectively students convert their own resources and capabilities into academic outcomes (Petra & Aziz, 2020; Rana et al., 2021). Traditional assessments focus largely on absolute performance, such as test scores or grade averages, and therefore miss this efficiency dimension: they overlook the nuanced relationship between student inputs, such as attendance, prior achievement, and preparatory assessments, and academic outputs (He, 2024; Khan et al., 2019; Zhou et al., 2024). This limitation highlights the need for analytical frameworks that connect student inputs to outputs while revealing hidden inefficiencies. Without such approaches, educators may overlook weaknesses in specific topics or the influence of factors like attendance and prior achievement.
Data Envelopment Analysis (DEA) offers one such robust approach, as it allows for multi-dimensional input–output efficiency assessment without reliance on predefined functional forms (Mudawali et al., 2025; Shero et al., 2022). Its Slack-Based Measure (SBM) variant enhances this capability by explicitly identifying input redundancies and output deficiencies, making it particularly compelling for educational efficiency studies (Mahla et al., 2021). Recent applications of DEA include analyses of classroom-level performance (Nocera Alves Junior et al., 2024), regional educational systems (Aikins et al., 2025), and higher education institutions (Gori et al., 2025).
Despite these advances, critical gaps remain. First, most research examines efficiency at macro levels such as institutions or national systems rather than at the granularity of individual student chapters or course units, where inefficiencies may be more instructive (Aparicio et al., 2022; Ucar & Karsak, 2023). Second, very few studies integrate the three critical dimensions needed to capture learning efficiency fully: (1) alignment of student performance with their input capabilities, (2) the maximization of performance based on those inputs, and (3) identification of which students extract the most benefit from their inherent attributes.
To address these shortcomings, this study applies a Slack-Based DEA model, augmented to conduct a chapter-wise efficiency analysis of student assessment outcomes. By focusing on efficiency rather than raw scores, the study aims to uncover latent inefficiencies, identify students who excel despite limited inputs, and spotlight those who underperform despite strong initial conditions. These insights are vital for developing targeted teaching strategies, optimizing resource deployment, and informing policy reforms (Eren & Aydın, 2025). To the best of our knowledge, this is the first study to apply SBM-DEA systematically at the chapter level, enhancing both its methodological precision and practical relevance.
METHODOLOGY
DEA is a powerful tool for evaluating the efficiency of decision-making units (DMUs) by comparing multiple inputs and outputs. SBM models in DEA are particularly useful for directly assessing efficiency in the presence of slack values, which indicate inefficiencies in the form of input surpluses and output shortfalls (Abdullah et al., 2018; Azizi et al., 2015; Hamdi et al., 2014). When applied to educational settings, SBM can help assess the efficiency of student learning, identify areas for improvement, and provide detailed insights into specific areas where resources are not being used optimally.
Motivated by these advantages, this study applies the SBM model within an output-oriented framework, as proposed by Tone (2001). The output-oriented SBM model in DEA serves as an effective approach for efficiency assessment, particularly in contexts where a detailed examination of slacks and non-proportional variations in inputs and outputs is required (Paramanik et al., 2023). The output-oriented framework is chosen because it focuses on maximizing outputs while keeping inputs constant, which is useful for scenarios where the goal is to enhance output efficiency. In the context of education, for instance, this approach focuses on improving students’ academic achievements and performance without requiring additional resources such as teaching hours, instructional materials, or faculty.
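For completeness, the model can be stated explicitly. The following is the standard output-oriented SBM formulation under constant returns to scale, following Tone (2001), for an evaluated DMU $o$ among $n$ DMUs, with $m$ inputs $x_{io}$ and $s$ outputs $y_{ro}$:

$$
\frac{1}{\theta_o^{*}} \;=\; \max_{\lambda,\, s^{+}} \; 1 + \frac{1}{s}\sum_{r=1}^{s}\frac{s_r^{+}}{y_{ro}}
$$

subject to

$$
\sum_{j=1}^{n}\lambda_j x_{ij} \le x_{io} \quad (i=1,\dots,m), \qquad
\sum_{j=1}^{n}\lambda_j y_{rj} - s_r^{+} = y_{ro} \quad (r=1,\dots,s), \qquad
\lambda_j \ge 0, \; s_r^{+} \ge 0,
$$

where $\lambda$ weights the peer DMUs forming the reference point and $s_r^{+}$ is the shortfall (slack) in output $r$. A student is efficient ($\theta_o^{*}=1$) only when all output slacks vanish; otherwise $\theta_o^{*}<1$, discounted by the average relative shortfall.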
This study considers a sample of 129 students across four groups, with each student treated as a DMU. In DEA, it is essential to ensure that the number of DMUs is sufficiently large relative to the number of input and output variables in order to obtain reliable efficiency estimates. Note that the sample size must satisfy the minimum DEA requirement, whereby the number of DMUs is at least three times the total number of inputs and outputs (Bowlin, 1998).
In this study, three input variables and three output variables are considered. Within the DEA framework, the input variables represent the resources utilized by DMUs, while the output variables reflect the results or achievements generated from these resources. This balanced specification of inputs and outputs provides a solid foundation for assessing the relative efficiency of DMUs.
Hence, to find the number of DMUs:
3 × (inputs + outputs) = 3 × (3 + 3) = 18.
With 129 students serving as DMUs, the sample size substantially exceeds this threshold, thereby fulfilling the minimum DEA requirement and enhancing the reliability of the efficiency analysis.
The inputs chosen were attendance rate, pre-assessment scores, and pre-university mathematics grades, while the outputs were the marks for Chapters 5, 6, and 7. Although the Pearson correlation between some inputs and outputs was below 0.5, indicating a weak linear relationship, these variables were retained in the SBM-DEA model due to their conceptual relevance. As highlighted by Afsharian et al. (2016), careful selection of input and output factors is critical in DEA to avoid common pitfalls, including issues with factor selection, dual-role variables, and undesirable factors. The inclusion of these variables aligns with the approach of Dobos and Vörösmarty (2024), who highlight that the selection of input and output items is crucial for the successful application of DEA, as they should reflect the decision maker’s preferences and perceptions of factors influencing the efficiency of a DMU.
Before the analysis, the raw data were pre-processed to ensure consistency and validity. For instance, student grades were first converted to a GPA system, with a special adjustment where an A+ was assigned a value of 4.33 to distinguish it from a standard A grade.
Next, the data were normalized using the mean to allow fair comparisons across different scales and units. Data completeness was then validated, particularly because the output-oriented SBM model uses ratios relative to outputs. In this context, any output value of zero is invalid, as it would cause undefined ratios in the SBM calculations (Lee & Zhu, 2012). To address this, zero outputs were replaced with a small positive value (1.0 × 10⁻⁶), ensuring computational stability and preventing distortions in the efficiency scores (Shao et al., 2021; Wang et al., 2019). This pre-processing step is crucial for the reliability and accuracy of SBM results, as it guarantees that all DMUs are properly evaluated and that the calculated efficiency scores truly reflect relative performance.
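As an illustration, this pre-processing pipeline can be sketched in a few lines of Python. The column names and the full letter-grade scale below are our own assumptions; the paper specifies only the A+ = 4.33 adjustment, the mean normalization, and the 1.0 × 10⁻⁶ replacement.

```python
import pandas as pd

# Assumed grade scale: the paper specifies only A+ = 4.33; the
# remaining values follow a conventional 4.00 scale.
GRADE_POINTS = {"A+": 4.33, "A": 4.00, "A-": 3.67, "B+": 3.33, "B": 3.00,
                "B-": 2.67, "C+": 2.33, "C": 2.00, "C-": 1.67, "F": 0.00}

EPS = 1.0e-6  # small positive value replacing zero outputs

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Grade conversion, mean normalization, and zero-output repair.
    Column names (attendance, pre_assessment, grade, ch5, ch6, ch7)
    are hypothetical."""
    out = df.copy()
    # 1. Convert letter grades to GPA values.
    out["grade"] = out["grade"].map(GRADE_POINTS)
    # 2. Mean-normalize each column for comparability across scales.
    cols = ["attendance", "pre_assessment", "grade", "ch5", "ch6", "ch7"]
    out[cols] = out[cols] / out[cols].mean()
    # 3. Replace zero outputs so the SBM output ratios stay defined
    #    (zeros survive mean scaling, so the order of 2 and 3 is moot).
    out[["ch5", "ch6", "ch7"]] = out[["ch5", "ch6", "ch7"]].replace(0.0, EPS)
    return out
```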
The output-oriented SBM model of Tone (2001) was then applied to evaluate the relative efficiency of students as DMUs. The analysis was carried out in the LINGO optimization software, where each DMU was evaluated individually by specifying its inputs and outputs. The SBM efficiency score (0 < θ ≤ 1) was obtained by solving a fractional program that maximizes the average normalized output shortfall with inputs held at their observed levels; the score is the reciprocal of one plus this average shortfall. Unlike the radial CCR model, which measures only proportional changes, SBM explicitly evaluates slack inefficiencies, yielding a score of 1 for efficient units and less than 1 when slacks are present.
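The authors solved this program in LINGO; as a cross-check, the same linear program can be assembled with standard scientific Python. The sketch below is our illustrative re-implementation of the output-oriented SBM stated above, not the paper’s code:

```python
import numpy as np
from scipy.optimize import linprog

def sbm_output_oriented(X, Y, k):
    """Output-oriented SBM score and output slacks for DMU k (CRS).
    X: (n, m) array of inputs; Y: (n, s) array of strictly positive
    outputs; rows are DMUs (students)."""
    n, m = X.shape
    _, s = Y.shape
    x0, y0 = X[k], Y[k]
    # Variables: lambda_1..lambda_n, then output slacks s+_1..s+_s.
    # Maximize 1 + (1/s) * sum(s+_r / y_r0)  =>  minimize the negative.
    c = np.concatenate([np.zeros(n), -1.0 / (s * y0)])
    A_ub = np.hstack([X.T, np.zeros((m, s))])  # sum_j lam_j x_ij <= x_i0
    A_eq = np.hstack([Y.T, -np.eye(s)])        # sum_j lam_j y_rj - s+_r = y_r0
    res = linprog(c, A_ub=A_ub, b_ub=x0, A_eq=A_eq, b_eq=y0,
                  bounds=[(0, None)] * (n + s))
    phi = 1.0 - res.fun          # optimal 1 + average normalized shortfall
    return 1.0 / phi, res.x[n:]  # theta in (0, 1], and the slacks s+
```

Looping this function over all 129 rows reproduces the per-DMU evaluation performed in LINGO.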
For efficient DMUs, a further distinction was made using the super-efficiency SBM approach. In this procedure, the DMU under evaluation was temporarily excluded from the reference set, and the SBM model was re-estimated. The efficient DMUs were then ranked by their super-efficiency scores, which distinguishes among units that share the baseline score of 1. This ranking offered a more detailed assessment of student performance, making it possible to identify the strongest performers beyond the binary efficient/inefficient classification.
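A minimal sketch of this leave-one-out step, reusing the imports and conventions of the code above: DMU k is dropped from the reference set, and the output slacks are left sign-unrestricted so that frontier students obtain scores above 1. This is a simplification for illustration; Tone’s full super-efficiency SBM formulation treats output reductions somewhat differently, and infeasibility can arise in some variants (Lee & Zhu, 2012).

```python
def sbm_super_efficiency(X, Y, k):
    """Leave-one-out super-efficiency sketch: evaluate DMU k against
    the frontier formed by all other DMUs. Free (sign-unrestricted)
    slacks let efficient DMUs score above 1, while inefficient DMUs
    keep their ordinary SBM score."""
    n, m = X.shape
    _, s = Y.shape
    keep = np.arange(n) != k
    Xr, Yr = X[keep], Y[keep]    # reference set without DMU k
    x0, y0 = X[k], Y[k]
    c = np.concatenate([np.zeros(n - 1), -1.0 / (s * y0)])
    A_ub = np.hstack([Xr.T, np.zeros((m, s))])
    A_eq = np.hstack([Yr.T, -np.eye(s)])
    bounds = [(0, None)] * (n - 1) + [(None, None)] * s  # free slacks
    res = linprog(c, A_ub=A_ub, b_ub=x0, A_eq=A_eq, b_eq=y0, bounds=bounds)
    return 1.0 / (1.0 - res.fun)  # > 1 for super-efficient DMUs
```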
Finally, inefficient DMUs were examined through their slack values to identify specific output shortfalls, which highlighted potential areas for improvement. The flow process for identifying efficient DMUs is illustrated in Figure 1.
Figure 1. Flow process for identifying efficient DMUs
RESULTS AND DISCUSSION
This section presents the findings of the efficiency analysis conducted using the SBM and super-efficiency models. The results are organized to highlight overall efficiency levels among the students, the distribution of efficiency scores across groups, and the identification of output slacks that indicate areas requiring improvement. Special attention is given to students with very low efficiency scores, which are often attributable to zero outputs in specific chapters, as well as to super-efficient students whose performance surpasses the efficiency frontier. By examining both inefficient and super-efficient cases, the analysis not only captures the variability of student performance but also provides a basis for understanding individual and group-level learning gaps, while identifying potential benchmark students for peer learning and mentoring.
Table 1 summarizes the descriptive statistics for the inputs and outputs. Students demonstrated consistently high attendance, with a mean of 88.85 percent and limited variation. Pre-assessment scores, however, were more widely dispersed, with a mean of 56.18 and a standard deviation of 23.64, indicating differing levels of prior knowledge. Pre-university mathematics grades were relatively uniform, averaging 3.21 on a GPA scale. In terms of outputs, performance was strongest in Chapter 5 with a mean of 74.18, weakest in Chapter 6 with a mean of 59.08, and moderately high in Chapter 7 with a mean of 69.46. Chapter 7 also showed the greatest variability, with a standard deviation of 32.54. These results suggest that while attendance and prior academic background were relatively stable, students’ preparedness and learning outcomes varied considerably across chapters.
Table 1. Descriptive statistics for inputs and outputs
| | Attendance Rate (%) | Pre-Assessment | Pre-University Mathematics Grade (GPA) | Marks of Chapter 5 | Marks of Chapter 6 | Marks of Chapter 7 |
|---|---|---|---|---|---|---|
| Minimum | 33.33 | 6.7 | 1.67 | 18.75 | 0 | 0 |
| Maximum | 100 | 100 | 4.33 | 100 | 100 | 100 |
| Mean | 88.85 | 56.18 | 3.21 | 74.18 | 59.08 | 69.46 |
| Standard Deviation | 14.10 | 23.64 | 0.82 | 21.34 | 25.33 | 32.54 |
Figure 2. Radar charts of (a) group mean scores and (b) standard deviations across inputs and outputs
Figure 2(a) shows the mean performance of four groups across three inputs (x₁, x₂, x₃) and three outputs (y₁, y₂, y₃). Group 1 achieved consistently strong results, while Group 2 performed the weakest, particularly in inputs x₂ and x₃ and outputs y₁ and y₃. Groups 3 and 4 displayed selective strengths but lacked overall balance, highlighting Group 1 as the most efficient and Group 2 as the most in need of improvement. From a pedagogical perspective, targeted interventions such as remedial support for Group 2 and reinforcement of weaker inputs for Groups 3 and 4 could help reduce performance disparities and promote more balanced learning outcomes across cohorts.
Figure 2(b) presents the variability of group performance measured by standard deviation. Most dimensions, including x₁, x₃, y₁, and y₂, showed low variability, while input x₂ displayed moderate variation. Output y₃ exhibited the highest variability, with Groups 2 and 4 deviating substantially more than Groups 1 and 3, making it the most distinguishing factor among the groups. From an instructional perspective, this suggests the need to focus on reducing variability in output y₃, potentially through targeted practice, feedback, or differentiated learning activities to ensure more equitable performance across student groups.
Figure 3. Scatter plot of efficiency scores for 129 students across four groups
The scatter plot in Figure 3 displays the efficiency distribution of 129 students across four groups. Most students cluster between 0.6 and 1.0, with several achieving the maximum efficiency score of 1.0, signifying alignment with the efficiency frontier. A smaller subset records very low efficiency, in some cases approaching zero, largely due to poor performance in one or more outputs, such as obtaining a zero score in a chapter. Color-coding by group confirms that inefficiency is dispersed across all categories rather than concentrated in a single group, indicating that variation stems from individual performance differences.
Table 2 further clarifies this pattern by showing the distribution of efficient and inefficient students across groups. Eighteen students reached full efficiency, spread relatively evenly across the four groups, demonstrating balanced use of inputs and strong academic outcomes, while 111 were inefficient. This confirms that efficiency is not determined by group membership but reflects individual-level disparities, with efficient students serving as benchmarks for their peers.
Table 2. Distribution of efficient and inefficient students across four groups based on the DEA results
| | Group 1 | Group 2 | Group 3 | Group 4 | Total |
|---|---|---|---|---|---|
| Number of efficient DMUs | 3 | 4 | 4 | 7 | 18 |
| Number of inefficient DMUs | 30 | 27 | 27 | 27 | 111 |
The results in Table 3 identify the top-performing students (DMUs) based on their super-efficiency scores, which exceed 1.0 and indicate performance beyond the efficiency frontier. The leading student, DMU 124, is followed by DMU 65 and DMU 127, all of whom serve as academic benchmarks by consistently outperforming their peers. Group-level patterns reveal that Group 4 dominates the rankings, contributing 7 of the 18 super-efficient students, while Groups 2 and 3 also show strong representation. Group 1, though less prominent, includes notable cases such as DMU 7 and DMU 17, demonstrating that high achievement is not confined to a single group. Collectively, these super-efficient students exemplify best practices in learning and can serve as peer models to guide knowledge sharing and mentoring across groups.
Table 3. Ranking of super-efficient students (DMUs) and their group distribution based on SBM super-efficiency scores
Inefficient students were further examined to identify output shortfalls, represented by the slack values s₁⁺, s₂⁺, and s₃⁺, corresponding to the marks of Chapters 5, 6, and 7, respectively. Table 4 presents a sample of such students along with their respective slacks, while the most efficient student is also included for comparison.
Table 4. Efficiency scores and output slacks for selected students
| DMU | Efficiency Score | s₁⁺ | s₂⁺ | s₃⁺ |
|---|---|---|---|---|
| 4 | 0.524 | 45 | 25 | 85 |
| 38 | 0.802 | 5 | 34 | 0 |
| 87 | 0.334 | 40 | 47 | 43 |
| 124 | 1.000 | 0 | 0 | 0 |
The slack analysis highlights the specific output improvements required for inefficient students to reach the efficiency frontier. DMU 124 is efficient with no shortfalls, establishing the benchmark. DMU 38, though inefficient, requires only minor adjustments, primarily in outputs 1 and 2, to achieve efficiency. In contrast, DMU 4 and DMU 87 display substantial inefficiencies. DMU 4 requires marked improvements in all outputs, especially output 3, while DMU 87 is the least efficient, with large shortfalls across all three outputs. These results demonstrate how slack values provide actionable insights beyond the efficiency scores, by pinpointing the precise areas of underperformance.
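In SBM terms, the frontier target for each output is simply the observed value plus its optimal slack, which is why the slacks read directly as required improvements:

$$
\hat{y}_{ro} \;=\; y_{ro} + s_r^{+*}, \qquad r = 1, 2, 3,
$$

so, for example, DMU 4’s largest required gain is in the third output, matching its slack of 85 in Table 4 (whether these slacks are expressed in raw or mean-normalized mark units depends on the pre-processing stage).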
The evaluation highlights both inefficiencies and exemplary performances within the student cohort. The SBM identifies output shortfalls, particularly in Chapter 6 (the second output), where the largest performance gaps were observed. These results confirm that subject-specific weaknesses can significantly reduce overall effectiveness, even when students demonstrate adequate inputs. At the same time, the detection of super-efficient students provides a valuable benchmark, as these individuals surpass the efficiency frontier and exemplify best practices in learning.
The identification of super-efficient students has practical implications for peer learning and mentoring. In DEA terms, these students are not necessarily the highest achievers in absolute scores, but those who most effectively transform their available inputs such as attendance, prior knowledge, and formative assessments into strong academic outputs. High performers such as DMU 124 and DMU 65 can therefore serve as efficiency benchmarks, sharing study strategies, problem-solving approaches, and time management practices that optimize learning resources. Their involvement in structured peer-learning activities, such as study circles, mentoring schemes, or cross-group tutorials, can foster knowledge sharing and provide relatable guidance for lower-performing peers. This approach not only reduces performance disparities but also enhances motivation, as peer explanations are often perceived as more relatable than instructor-led feedback (Araya & Gormaz, 2021; Tullis & Goldstone, 2020).
From a broader perspective, efficiency analysis demonstrates a dual function: it diagnoses inefficiencies by pinpointing specific chapters where improvements are required, while also identifying benchmark students who can act as catalysts for collective improvement. For educators, these insights suggest a two-pronged strategy: integrating targeted remedial programs focused on Chapter 6 with structured peer-learning initiatives anchored by super-efficient students. This balanced approach addresses individual weaknesses while leveraging cohort strengths, thereby promoting sustainable improvements in learning efficiency.
In sum, the results extend DEA’s application in education by moving beyond performance measurement to actionable recommendations. By linking inefficiency analysis with peer-mentoring strategies, the study establishes a framework where efficiency evaluation serves both diagnostic and developmental purposes.
CONCLUSION
This study employed the output-oriented SBM and super-efficiency DEA models to evaluate the mathematics performance of 129 students. The results revealed substantial variation in efficiency, with some students achieving or surpassing the efficiency frontier while others recorded very low scores due to output deficiencies in specific chapters. These findings underscore that consistent performance across all assessment components is essential, as weaknesses in a single chapter can significantly reduce overall efficiency. Beyond measurement, the analysis offers practical value by identifying chapter-level gaps for targeted reinforcement and highlighting super-efficient students who can serve as benchmarks for peer learning. The study demonstrates the applicability of DEA as a diagnostic and developmental tool in education, providing a basis for evidence-based interventions. Future research may extend this approach through longitudinal analysis, cross-cohort comparisons, or hybrid models integrating DEA with complementary statistical techniques.
REFERENCES
- Abdullah, D., T., Suwilo, S., Efendi, S., H., & Ita Erliana, C. (2018). A Slack-Based Measures for Improving the Efficiency Performance of Departments in Universitas Malikussaleh. International Journal of Engineering & Technology, 7(2.14), 491. https://doi.org/10.14419/ijet.v7i2.11253
- Afsharian, M., Ahn, H., & Neumann, L. (2016). Generalized DEA: an approach for supporting input/output factor determination in DEA. Benchmarking: An International Journal, 23(7), 1892–1909. https://doi.org/10.1108/BIJ-07-2015-0074
- Aikins, E., dos Santos, S. P., & Amado, C. A. F. (2025). Exploring the Use of Data Envelopment Analysis for Formative Evaluation of Senior High Schools in Ghana. Journal of the Knowledge Economy. https://doi.org/10.1007/s13132-025-02747-0
- Aparicio, J., Cordero, J. M., & Ortiz, L. (2022). Plausible values and their use in efficiency analyses with educational data. Applied Economics, 54(29), 3340–3352. https://doi.org/10.1080/00036846.2021.2006136
- Araya, R., & Gormaz, R. (2021). Revealed Preferences of Fourth Graders When Requesting Face-to-Face Help While Doing Math Exercises Online. Education Sciences, 11(8), 429. https://doi.org/10.3390/educsci11080429
- Azizi, H., Kordrostami, S., & Amirteimoori, A. (2015). Slacks-based measures of efficiency in imprecise data envelopment analysis: An approach based on data envelopment analysis with double frontiers. Computers & Industrial Engineering, 79, 42–51. https://doi.org/10.1016/j.cie.2014.10.019
- Bowlin, W. F. (1998). Measuring Performance: An Introduction to Data Envelopment Analysis (DEA). The Journal of Cost Analysis, 15(2), 3–27. https://doi.org/10.1080/08823871.1998.10462318
- Dobos, I., & Vörösmarty, G. (2024). Input and output reconsidered in supplier selection DEA model. Central European Journal of Operations Research, 32(1), 67–81. https://doi.org/10.1007/s10100-023-00845-5
- Eren, F. C., & Aydın, S. (2025). Bootstrapping Efficiency in Education: A Multi-Stage DEA Analysis With TIMSS Data. European Journal of Education, 60(3). https://doi.org/10.1111/ejed.70159
- Gori, R. S. L., Lacerda, D. P., Piran, F. S., & Silva, N. A. (2025). Efficiency in higher education institutions: an analysis of data envelopment analysis applications. International Journal of Management in Education, 19(1), 32–59. https://doi.org/10.1504/IJMIE.2025.142875
- Hamdi, K., Lotfi, F. H., & Moghaddas, Z. (2014). An application of DEA in efficiency evaluation of universities. International Journal of Mathematics in Operational Research, 6(5), 550. https://doi.org/10.1504/IJMOR.2014.064841
- He, C. (2024). Formative Student Assessment Intelligent Management System Based on B/S. 2024 International Conference on Informatics Education and Computer Technology Applications (IECA), 199–203. https://doi.org/10.1109/IECA62822.2024.00044
- Khan, S., Hassan, M., Husain, M., & Jetley, S. (2019). Video projected practical examination as an introduction to formative assessment tool for undergraduate examination in pathology. Indian Journal of Pathology and Microbiology, 62(1), 79. https://doi.org/10.4103/IJPM.IJPM_30_18
- Lee, H.-S., & Zhu, J. (2012). Super-efficiency infeasibility and zero data in DEA. European Journal of Operational Research, 216(2), 429–433. https://doi.org/10.1016/j.ejor.2011.07.050
- Mahla, D., Agarwal, S., & Mathur, T. (2021). A novel fuzzy non-radial data envelopment analysis: An application in transportation. RAIRO – Operations Research, 55(4), 2189–2202. https://doi.org/10.1051/ro/2021097
- Mudawali, M., Abdullah, D., & Sahputra, I. (2025). Analysis Of The Performance Of Junior High Schools In The Nisam Sub-District Using The Data Envelopment Analysis Method. International Journal of Engineering, Science and Information Technology, 5(2), 256–266. https://doi.org/10.52088/ijesty.v5i2.823
- Nocera Alves Junior, P., Leger, P., & Costa Melo, I. (2024). Efficiency analysis of engineering classes: A DEA approach encompassing active learning and expositive classes towards quality education. Environmental Science & Policy, 160, 103856. https://doi.org/10.1016/j.envsci.2024.103856
- Paramanik, A. R., Sarkar, S., & Sarkar, B. (2023). A two-stage improved Base Point Slacks-Based Measure of super-efficiency for negative data handling. Computers & Operations Research, 150, 106057. https://doi.org/10.1016/j.cor.2022.106057
- Petra, T. Z. H. T., & Aziz, M. J. A. (2020). Investigating reliability and validity of student performance assessment in Higher Education using Rasch Model. Journal of Physics: Conference Series, 1529(4), 042088. https://doi.org/10.1088/1742-6596/1529/4/042088
- Rana, P., Raj Gupta, L., Dubey, M. K., & Kumar, G. (2021). Review on evaluation techniques for better student learning outcomes using machine learning. 2021 2nd International Conference on Intelligent Engineering and Management (ICIEM), 86–90. https://doi.org/10.1109/ICIEM51511.2021.9445294
- Shao, Q., Yuan, J., Lin, J., Huang, W., Ma, J., & Ding, H. (2021). A SBM-DEA based performance evaluation and optimization for social organizations participating in community and home-based elderly care services. PLOS ONE, 16(3), e0248474. https://doi.org/10.1371/journal.pone.0248474
- Shero, J. A., Al Otaiba, S., Schatschneider, C., & Hart, S. A. (2022). Data envelopment analysis (DEA) in the educational sciences. The Journal of Experimental Education, 90(4), 1021–1040. https://doi.org/10.1080/00220973.2021.1906198
- Tone, K. (2001). A slacks-based measure of efficiency in data envelopment analysis. European Journal of Operational Research, 130(3), 498–509. https://doi.org/10.1016/S0377-2217(99)00407-5
- Tone, K. (2003). Dealing with undesirable outputs in DEA: a Slacks-Based Measure (SBM) approach. https://www.researchgate.net/publication/284047010
- Tullis, J. G., & Goldstone, R. L. (2020). Why does peer instruction benefit student learning? Cognitive Research: Principles and Implications, 5(1). https://doi.org/10.1186/s41235-020-00218-5
- Ucar, E., & Karsak, E. E. (2023). Evaluating Educational Performance of OECD Countries with Common-Weight DEA-Based Models. Journal of the Knowledge Economy, 15(3), 13673–13700. https://doi.org/10.1007/s13132-023-01619-9
- Wang, C.-N., Luu, Q.-C., Nguyen, T.-K.-L., & Day, J.-D. (2019). Assessing Bank Performance Using Dynamic SBM Model. Mathematics, 7(1), 73. https://doi.org/10.3390/math7010073
- Zhou, N., Wang, Y., Hu, P., & Chen, X. (2024). Construction and Application of a Formative Assessment System for Online and Offline Course Learning. Proceeding of the 2024 International Conference on Artificial Intelligence and Future Education, 46–51. https://doi.org/10.1145/3708394.3708403