Identifying Learning Gaps: A Diagnostic Chemistry Test for Secondary Students

Authors

Raihana M. Mangilala

Mindanao State University-Iligan Institute of Technology (Philippines)

Edna B. Nabua

Mindanao State University-Iligan Institute of Technology (Philippines)

Article Information

DOI: 10.47772/IJRISS.2026.1026EDU0027

Subject Category: Education

Volume/Issue: 10/26 | Page No: 365-374

Publication Timeline

Submitted: 2025-12-24

Accepted: 2025-12-30

Published: 2026-01-14

Abstract

This study sought to develop and psychometrically validate a Grade 10 Chemistry achievement instrument intended for diagnostic assessment and to determine students’ least mastered Chemistry competencies. A researcher-constructed 50-item multiple-choice instrument was developed based on selected Grade 10 Chemistry learning competencies prescribed in the curriculum. The instrument underwent readability analysis and content validation by three experts in Chemistry education to establish content relevance, clarity, and alignment with intended learning outcomes. Following expert review, the instrument was pilot tested with 150 Grade 10 students to evaluate reliability and perform item analysis. Reliability estimation using Cronbach’s alpha yielded a coefficient of 0.846, indicating high internal consistency. Item difficulty and discrimination indices were examined, resulting in the retention of 30 items that satisfied acceptable psychometric standards. The validated 30-item achievement instrument was subsequently administered to 88 Grade 10 students from two purposively selected sections at MSU–University Training Center, Marawi City. Student performance was analyzed using a criterion-referenced framework to determine mastery levels across Chemistry learning competencies. Findings revealed differential levels of mastery, with Chemical Reactions emerging as the least mastered competency, while topics related to Solutions, Acids and Bases, and Gases and Gas Laws were moderately mastered. Overall, the results provide empirical evidence that the developed instrument demonstrates satisfactory validity and reliability and functions effectively as a diagnostic assessment tool. The study highlights the critical role of validated diagnostic assessments in identifying learning gaps and informing targeted instructional interventions to enhance the teaching and learning of Chemistry at the secondary level.
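The psychometric steps described above (internal consistency via Cronbach's alpha, plus item difficulty and upper–lower discrimination indices used to retain items) can be sketched in code. This is a minimal illustration, not the authors' actual analysis pipeline; the 27% upper/lower grouping and the function names are conventional assumptions.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a students x items matrix of 0/1 scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                      # number of items
    item_vars = scores.var(axis=0, ddof=1)   # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_difficulty(scores):
    """Difficulty index: proportion of students answering each item correctly."""
    return np.asarray(scores, dtype=float).mean(axis=0)

def item_discrimination(scores, group_frac=0.27):
    """Upper-lower discrimination index: p(upper group) - p(lower group)."""
    scores = np.asarray(scores, dtype=float)
    totals = scores.sum(axis=1)
    order = np.argsort(totals)               # rank students by total score
    n = max(1, int(round(group_frac * len(totals))))
    lower, upper = scores[order[:n]], scores[order[-n:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

# Toy example: 4 students x 3 items (1 = correct, 0 = incorrect)
scores = [[1, 1, 1],
          [1, 1, 0],
          [1, 0, 0],
          [0, 0, 0]]
print(cronbach_alpha(scores))      # -> 0.75
print(item_difficulty(scores))     # -> [0.75 0.5  0.25]
print(item_discrimination(scores)) # -> [1. 1. 1.]
```

Under the commonly cited Ebel and Frisbie guidelines (referenced below), items with difficulty roughly between 0.25 and 0.75 and discrimination above about 0.30 would be retained, which is the kind of screening that reduced the pilot instrument from 50 to 30 items.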

Keywords

diagnostic assessment, Grade 10

References

1. Bain, K., Rodriguez, J. M. G., Moon, A., & Towns, M. H. (2019). The characterization of college students’ reasoning about chemical reactions using a mechanistic framework. Chemistry Education Research and Practice, 20(4), 741–755. https://doi.org/10.1039/C9RP00025A

2. Boateng, G. O., Neilands, T. B., Frongillo, E. A., Melgar-Quiñonez, H. R., & Young, S. L. (2018). Best practices for developing and validating scales for health, social, and behavioral research: A primer. Frontiers in Public Health, 6, 149. https://doi.org/10.3389/fpubh.2018.00149

3. Cooper, M. M., Underwood, S. M., & Hilley, C. Z. (2012). Development and validation of the implicit information from Lewis structures instrument (IILSI): Do students connect structures with properties? Chemistry Education Research and Practice, 13(3), 195–200. https://doi.org/10.1039/C2RP00010E

4. DeVellis, R. F. (2017). Scale development: Theory and applications (4th ed.). Sage Publications.

5. Dori, Y. J., & Hameiri, M. (2003). Multidimensional analysis system for quantitative chemistry problems: Symbol, macro, micro, and process aspects. Journal of Research in Science Teaching, 40(3), 278–302. https://doi.org/10.1002/tea.10090

6. Ebel, R. L., & Frisbie, D. A. (1991). Essentials of educational measurement (5th ed.). Prentice Hall.

7. Furr, R. M., & Bacharach, V. R. (2014). Psychometrics: An introduction (2nd ed.). Sage Publications. https://doi.org/10.4135/9781483390869

8. Gulacar, O., Overton, T., & Bowen, C. W. (2014). Development and application of a diagnostic assessment tool to evaluate students’ understanding of thermodynamics concepts. Chemistry Education Research and Practice, 15(3), 447–458. https://doi.org/10.1039/C4RP00017H

9. Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2019). Multivariate data analysis (8th ed.). Cengage Learning.

10. Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. Routledge. https://doi.org/10.4324/9780203850381

11. Johnstone, A. H. (2006). Chemical education research in Glasgow in perspective. Chemistry Education Research and Practice, 7(2), 49–63. https://doi.org/10.1039/B5RP90021B

12. Kline, R. B. (2021). Principles and practice of structural equation modeling (4th ed.). Guilford Press.

13. Kalkbrenner, M. T. (2021). A practical guide to instrument development and score validation in the social sciences: The MEASURE approach. Practical Assessment, Research, and Evaluation, 26(1), Article 1. https://doi.org/10.7275/svg4-e671

14. Kozma, R. B., & Russell, J. (2005). Students becoming chemists: Developing representational competence. Visualization in Science Education, 1, 121–145. https://doi.org/10.1007/1-4020-3613-2_8

15. Kyriazos, T. A., & Stalikas, A. (2018). Applied psychometrics: The steps of scale development and standardization process. Psychology, 9(11), 2531–2560. https://doi.org/10.4236/psych.2018.911145

16. Magno, C. (2017). Developing and validating achievement tests in educational settings. International Journal of Educational and Psychological Assessment, 18(2), 1–15. https://doi.org/10.2139/ssrn.2966335

17. McMillan, J. H. (2018). Classroom assessment: Principles and practice for effective standards-based instruction (7th ed.). Pearson Education.

18. Morgado, F. F. R., Meireles, J. F. F., Neves, C. M., Amaral, A. C. S., & Ferreira, M. E. C. (2017). Scale development: Ten main limitations and recommendations to improve future research practices. Psicologia: Reflexão e Crítica, 30(1), 3. https://doi.org/10.1186/s41155-016-0057-1

19. Nitko, A. J., & Brookhart, S. M. (2014). Educational assessment of students (7th ed.). Pearson Education.

20. Popham, W. J. (2017). Classroom assessment: What teachers need to know (8th ed.). Pearson Education.

21. Sevian, H., & Talanquer, V. (2021). Rethinking chemistry education: From teaching concepts to supporting sensemaking. Chemistry Education Research and Practice, 22(1), 8–15. https://doi.org/10.1039/D0RP00231F

22. Soeharto, S. (2021). Development of a diagnostic assessment test to evaluate science misconceptions in terms of school grades: A Rasch measurement approach. Journal of Turkish Science Education, 18(3), 351–370. https://doi.org/10.36681/tused.2021.78

23. Stowe, R. L., & Cooper, M. M. (2019). Practicing what we preach: Assessing “critical thinking” in chemistry. Journal of Chemical Education, 96(5), 837–846. https://doi.org/10.1021/acs.jchemed.8b00736

24. Taber, K. S. (2018). The nature of the chemical concept: Reconsidering chemical ideas. Royal Society of Chemistry. https://doi.org/10.1039/9781788012627

25. Taber, K. S. (2018). The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education, 48(6), 1273–1296. https://doi.org/10.1007/s11165-016-9602-2

26. Talanquer, V., & Pollard, J. (2010). Let’s teach how we think instead of what we know. Chemistry Education Research and Practice, 11(2), 74–83. https://doi.org/10.1039/C005349J

27. Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2, 53–55. https://doi.org/10.5116/ijme.4dfb.8dfd

28. Zamanzadeh, V., Ghahramanian, A., Rassouli, M., Abbaszadeh, A., Alavi-Majd, H., & Nikanfar, A. R. (2015). Design and implementation content validity study: Development of an instrument for measuring patient-centered communication. Journal of Caring Sciences, 4(2), 165–178. https://doi.org/10.15171/jcs.2015.017
