International Journal of Research and Innovation in Social Science


Tracking Growth and Gaps: A Socio-Constructivist Intervention of Student Performance Using GRACE PASS

  • Ethel D. Nabor
  • Hermae Joyce L. Toraja
  • Ace P. Uy
  • 6345-6377
  • Sep 6, 2025
  • Social Science


Ethel D. Nabor, Hermae Joyce L. Toraja, Ace P. Uy

Holy Name University, Philippines

DOI: https://dx.doi.org/10.47772/IJRISS.2025.903SEDU0462

Received: 02 August 2025; Accepted: 08 August 2025; Published: 06 September 2025

The Problem and Its Scope

Introduction

The fundamental subject of mathematics equips students with the analytical, logical, and critical problem-solving abilities needed for academic and practical applications. Nonetheless, many educational institutions struggle to raise their students’ mathematical proficiency (Organization for Economic Co-operation and Development [OECD], 2019). In response to these difficulties, teachers and policymakers continually seek evidence-based methods to assess and improve students’ mathematical comprehension and learning outcomes.

One such strategy is the use of formative and summative examinations that accurately reflect students’ competency and mastery of the material. Grounded in the principles of standards-based evaluation, the Global Resources for Assessment Curriculum and Evaluation–Performance Assessment of Standards and Skills (GRACE PASS) assessment tool offers a methodical way to monitor students’ learning progress using pretest and posttest measures. With this tool, teachers can pinpoint students’ areas of strength and weakness, guiding focused interventions to raise performance in the critical areas of mathematics: Patterns and Algebra, Geometry, and Statistics and Probability.

Using the GRACE PASS assessment tool’s pretest and posttest data, the current study examines the mathematical proficiency of eighth-grade students. In particular, it looks at distributions of skill levels, evaluates performance across specific learning competencies, and assesses topic mastery. The study sheds light on how well teaching methods work and how well students learn by assessing student performance concerning the skills and standards established by the Department of Education (DepEd).

Two theoretical pillars serve as the study’s foundation. According to Bourdieu’s Social Reproduction Theory, students’ varied amounts of cultural capital affect their academic performance, and educational systems may unintentionally perpetuate current socioeconomic disparities through examinations. Constructivist theory, as advanced by Piaget and Vygotsky, emphasizes the value of scaffolded, active learning in which students acquire knowledge through directed instruction and meaningful involvement (Piaget, 1952; Vygotsky, 1978). Together, these viewpoints support the study’s twin goals: understanding how student performance reflects students’ social surroundings, and how it reflects cognitive development as shaped by instruction and assessment.

Although the advantages of formative and standards-based evaluations are widely known, little empirical research has examined how the GRACE PASS framework might monitor student performance in Philippine basic education, specifically in Grade 8 mathematics. There is a lack of research on how teachers might use GRACE PASS pretest and posttest data to identify particular learning gains, mastery levels, and content-area performance. This lack of localized, data-driven evaluation limits GRACE PASS’s evidence base for guiding instruction and intervention in math classrooms. To close this gap and advance a more sophisticated understanding of assessment-driven instruction in secondary mathematics education, the current study analyzes competency-based performance data from students.

Numerous recent studies demonstrate the profound effects of data-driven instruction and evaluation on students’ academic performance. As Andrade and Brookhart (2020) noted, formative assessment has remained a driving force behind instruction because it enables teachers to adjust their teaching based on students’ present understanding. According to Hattie and Zierer (2021), timely, appropriately targeted feedback and the cumulative use of assessment data continued to be among the key influences on student learning outcomes.

In light of this research, it is becoming clear how teachers can use assessment tools to evaluate student performance and make instructional decisions that promote academic success and deep learning. This work is pertinent and contemporary, using empirical data to contribute to the ongoing discussion about enhancing mathematics education. The study supports the promotion of high-quality, equitable education aligned with 21st-century learning requirements.

Theoretical Background of the Study

This study is anchored on two foundational theories: Bourdieu’s Social Reproduction Theory and the Constructivist Theory as advanced by Piaget and Vygotsky. Social Reproduction Theory is a theoretical perspective in sociology and education that focuses on how social inequalities are reproduced or perpetuated through education and other societal institutions (Bourdieu, 1986). The theory declares that educational systems often mirror and perpetuate existing social structures. According to this perspective, students’ academic performance is shaped by individual ability and access to cultural capital such as language, learning resources, and prior exposure to academic practices, which varies across socioeconomic backgrounds. In this sense, standardized assessments can unintentionally reinforce inequalities if they fail to account for these disparities, making it important to interpret student performance through academic and sociocultural lenses.

Social Reproduction Theory has been widely used to understand how socioeconomic status influences long-term educational and career outcomes. Luo (2025) emphasized that this theory provides a valuable lens for analyzing the impact of class on student opportunity while recognizing that inequality is multifaceted, intersecting with factors such as gender, race, and ethnicity. Similarly, Ikpuri (2025) applied Social Reproduction Theory in examining persistent disparities in the American education system, underscoring how students’ social class and cultural capital shape access to academic opportunities. The study concluded that schools often reinforce class-based hierarchies by favoring learners from more privileged backgrounds. In the Philippine context, Suhaili (2025) affirmed the relevance of Bourdieu’s theory through a study conducted in public secondary schools in Jolo, revealing that while schools implement inclusive practices such as free education and curriculum reforms, structural challenges in addressing class-based disparities remain. Collectively, these studies validate the Social Reproduction Theory as a vital framework for interpreting the reproduction of inequality in educational systems across different contexts.

Complementing this is the Constructivist Theory, which views learning as an active, developmental process in which students construct knowledge through experience, social interaction, and guided instruction. Piaget emphasized the developmental stages through which learners progress (Piaget, 1952). At the same time, Vygotsky introduced the Zone of Proximal Development (ZPD) concept—the gap between what a learner can do independently and what they can achieve with appropriate support (Vygotsky, 1978). These theories suggest that learning occurs best when instruction is scaffolded and responsive to students’ current understanding, which can be assessed through diagnostic tools such as pretests and posttests.

Active construction of knowledge has been shown to increase student engagement, appreciation, and comprehension in real-world classroom settings (Santos et al., 2020; Villanueva & Cruz, 2021). Middle school students who engaged in constructive activities and regular assessments of their algebraic knowledge showed a deeper content understanding of the topic, according to research by Becerra and Diaz (2023). Similarly, Garcia (2022) found that students who received constructivist instruction scored better on problems involving statistics and probability than students who received traditional instruction.

Grounded in these theoretical perspectives, the Global Resources for Assessment Curriculum and Evaluation – Performance Assessment of Standards and Skills (GRACE PASS) is a structured framework for evaluating students’ mastery of specific competencies. Designed around the principles of standards-based assessment, GRACE PASS enables educators to monitor progress using pretest and posttest data aligned with key content areas and learning outcomes. Through this framework, teachers can identify students’ strengths and learning gaps, tailor instructional strategies accordingly, and assess the effectiveness of interventions over time. Using pretest and posttest assessments within GRACE PASS reflects a constructivist and sociological understanding of learning. From a constructivist standpoint, these assessments capture the growth in students’ conceptual understanding resulting from instruction. From a sociological lens, they provide data that, when disaggregated and analyzed, can reveal how different groups of learners may experience and benefit from the same curriculum in varied ways. Together, these theories support the study’s aim to explore whether students improved and how and why their performance varied across content areas and proficiency levels.

Mathematics education establishes the groundwork for students’ growth in analytical, logical, and problem-solving abilities (OECD, 2019). Given this crucial role, research indicates that significant obstacles remain to raising pupils’ performance in mathematics, especially at the junior high school level. The existing literature therefore justifies novel, evidence-based strategies, such as formative assessment, that are likely to enhance learning outcomes (Wiliam, 2020; Garcia & Molina, 2021). Formative assessments are likely to improve students’ learning outcomes when they are successfully incorporated into teaching practices, as demonstrated by Black and Wiliam’s (2019) comprehensive meta-analysis. Similarly, Hattie’s (2020) research has shown that feedback and assessments, particularly those aligned with learning objectives, have a highly significant effect.

Structured pretesting and post-testing enable tracking learners’ progress and identifying misconceptions that call for tailored instruction, as Lee and Lee (2021) stated. Additionally, Alonzo (2020) confirmed that structured formative evaluations increased student achievement by 15% in a single semester. Pretest-posttest findings were also found to be a strong predictor of end-of-year performance in the mathematics areas of number sense, measurement, and geometry in Johnson’s (2022) study. According to Ramos and Mendoza (2021), teachers can evaluate students’ mastery of particular mathematical competencies using assessments that align with standards, such as GRACE PASS.

Nguyen et al. (2020) discovered that student fluency rose considerably when targeted interventions specifically addressed number sense. In a long-term study, Tan and Chua (2022) found that regular concept assessment led to longer-lasting understanding. Martinez and Ramos (2021) demonstrated that, when evaluating competence levels, pretest-posttest frameworks assisted in identifying hidden learning gaps among Grade 7 students, especially in geometry and algebra. This is supported by Chang and Zhao’s (2021) research, which demonstrates that intervention programs based on pre-assessment data improved learners’ overall proficiency distributions.

The analysis of competency performance is another crucial element that has been emphasized in recent studies. Delgado and Rivera (2023) investigated the effects of competency-based assessments and discovered that abilities that were particularly addressed outperformed generic content regarding posttest results. Dominguez and Salazar (2022), who argued that students’ knowledge of competencies is directly linked to their capacity to employ mathematical skills in unexpected circumstances, also emphasized the need for competence alignment in assessments. A thorough evaluation by Ortega et al. (2021) found that competency-driven education supported by pretest-posttest cycles considerably increased academic attainment, particularly in statistics and mathematics.

The significance of pretest and posttest designs is growing. According to Park and Kim (2020), these designs offered unique diagnostic insights that helped teachers make early modifications to their educational practices. Moreover, Espinoza and Cruz (2023) found that schools that regularly employed pretest-posttest data in mathematics outperformed other schools by 18% on standardized examinations. A comparable study by Abad and Francisco (2021) indicated that posttest gains were greatest when instruction was modified based on pretest deficits rather than following a rigid syllabus.

These theoretical and empirical viewpoints work together to give this study a thorough foundation. Constructivist theory emphasizes the value of scaffolding and guided instruction in promoting conceptual understanding, while Bourdieu’s Social Reproduction Theory sheds light on how students’ diverse backgrounds may affect their success on mathematics assessments. With its focus on standards-based evaluation and pretest and posttest measures, the GRACE PASS assessment tool, grounded in these frameworks, helps monitor student development and guide instructional decisions. Placing this study within these theoretical frameworks enables a more thorough examination of student performance: how much students learn, and how and why their learning differs across subject areas and skill levels.

Statement of the Problem

Using the GRACE PASS assessment tool’s pretest and posttest data, this study attempts to measure the mathematical proficiency of eighth-grade students. The study evaluates student performance in content mastery, competency performance, and proficiency distribution. The researchers specifically aimed to respond to the following questions:

1. What is the level of student performance in the various Mathematics content areas, as reflected in the GRACE PASS pretest and posttest results? Specifically, in the following domains:

a. Patterns and Algebra

b. Geometry

c. Statistics and Probability

2. What is the students’ performance in each targeted learning competency before and after instruction?

3. What is the distribution of students across proficiency levels in Patterns and Algebra, Geometry, and Statistics and Probability in the pretest and posttest?

4. Is there a significant difference between the pretest and posttest scores of the students in content area mastery?

RESEARCH METHODOLOGY

Research Design

This study assesses the effectiveness of instruction and students’ learning outcomes in mathematics using a quantitative, quasi-experimental, one-group pretest-posttest design. In a one-group pretest-posttest design, students take a pretest to establish their baseline knowledge, receive standard instruction, and then take a posttest to determine whether their performance has changed.

This design allows researchers to pinpoint learning improvements that may be linked to the teaching given during the grading period, and it is especially appropriate for real classroom settings where random assignment is not feasible. The study compares students’ performance before and after instruction to (1) assess students’ topic mastery in important areas of mathematics, (2) evaluate how well students perform in particular learning competencies, and (3) examine changes in the distribution of proficiency levels.

Research Participants

All Holy Name University eighth-graders enrolled for the 2024–2025 school year participated in this study. Complete enumeration was employed because the study aimed to evaluate every participant’s academic performance. The GRACE PASS pretest and posttest were mandatory for all students; however, only the data from participants who completed both exams were included in the study. This criterion ensured the legitimacy of the results and the trustworthiness of the data while offering a comprehensive representation of student achievement within the grade level.

Grade 8 was purposefully selected as the focus of this study because it marks a crucial stage in the mathematics learning progression, where students are introduced to more abstract and foundational topics such as algebra, geometry, and probability. The participants enrolled during School Year 2024–2025 were taught under the K to 12 curriculum. As such, the data collected from this group may serve as baseline evidence for future comparisons once the MATATAG curriculum is implemented at the same grade level in School Year 2025–2026. This comparative potential adds value to the study, providing a meaningful reference point for evaluating the effects of curriculum revisions on instructional practices and student learning outcomes.

Research Instrument

The PASS (Performance Assessment of Standards and Skills), a standards-based assessment tool created by GRACE (Global Resources for Assessment, Curriculum, and Evaluation), is the primary tool in this study. The purpose of the instrument is to assess students’ levels of competency in the Grade 8 Mathematics competencies and standards set out by the Department of Education (DepEd).

Through clear, grade-level explanations of what students should know and be able to do, PASS (Performance Assessment of Standards and Skills) assesses students’ learning and progress and identifies their specific strengths and weaknesses in the main areas of mathematics, including Patterns and Algebra, Geometry, and Statistics and Probability.

Every item in the PASS tool relates to competencies established by DepEd and is categorized using the proficiency level descriptors originally described in DepEd Order No. 73, s. 2012, which remains consistent with standards-based assessment instruments. With the DepEd grading scale equivalents for these proficiency levels (Beginning, Developing, Approaching Proficiency, Proficient, and Advanced), teachers can clearly and practically assess their students’ competence levels.

The PASS tool was developed using a rigorous multi-stage approach to guarantee the assessment’s quality and reliability: (1) development of a conceptual framework guided by Anderson and Krathwohl’s cognitive process dimensions; (2) item writing by a group of subject matter experts (SMEs) from different disciplines, under the direction of GRACE’s Table of Specifications (TOS); (3) external review of the items by a second panel of SMEs provided by Phoenix Publishing House; (4) pilot testing and content validation in selected private Philippine schools; and (5) item analysis to establish the psychometric qualities of construct validity and internal consistency.

The PASS tests showed acceptable to good reliability, with Cronbach’s alpha values ranging from 0.5 to 0.8 across the scales. Furthermore, construct and convergent validity were established through correlational analyses with teacher assessment scores and contrast group analyses, which revealed significant differences between the low- and high-performing groups at α = .05.
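As a rough illustration of how the internal-consistency figures above are computed, the sketch below implements Cronbach’s alpha for a student-by-item score matrix in Python. The score matrix here is invented for illustration; it is not data from the PASS validation study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical dichotomous (0/1) responses for six students on four items
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])
alpha = cronbach_alpha(scores)
```

Values near 0.8, like the upper end of the PASS scales, indicate good internal consistency, while values near 0.5 are only marginal.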

Data Collection Procedure

This study used a systematic and structured data-gathering approach, beginning with ethical considerations and progressing through four main phases: administration of the pretest, the instructional period, the administration of the posttest, and data tabulation and organization. These methods were implemented to ensure that Grade 8 students’ performance was measured accurately and that their learning gains in Mathematics were analyzed meaningfully.

Ethical Considerations

To preserve all participants’ rights, dignity, and welfare, this study closely followed accepted ethical guidelines for conducting educational research. Formal consent from the school administration was acquired before the study started, and all Grade 8 students’ parents or legal guardians were notified. All registered Grade 8 students were required to take the GRACE PASS pretest and posttest as a component of the school’s regular academic evaluation process, which aims to evaluate the effectiveness of instruction and enhance teaching methods. Even though taking the tests was an academic requirement, all personal data gathered were treated with the utmost confidentiality, and all analyses and results are presented with the students’ identities hidden. Stakeholders were informed of the study’s goals and methods to maintain ethical standards and guarantee transparency throughout the investigation.

All information gathered was handled confidentially and used only for research and academic purposes. No personal identifiers were recorded or shared during data analysis, presentation, and publication of the findings. All performance data and responses were anonymized through coding techniques to further protect participant privacy. By implementing these ethical measures, the study upholds the highest standards of integrity, accountability, and responsibility in examining students’ mathematics performance.

These are the four main phases for this study:

Administration of the Pretest – At the beginning of the grading period, all Grade 8 students enrolled for the School Year 2024–2025 took the GRACE PASS pretest. The pretest aimed to establish baseline data on students’ performance in the three fundamental areas of mathematics: Patterns and Algebra, Geometry, and Statistics and Probability. Proctors and subject teachers attended an orientation before exam administration to guarantee uniformity of practices.

The pretest was given in a familiar classroom environment to reduce anxiety and distractions. Students were encouraged not to prepare, especially for the pretest, so that their prior knowledge and abilities could be assessed objectively without outside influence. Standard testing procedures were followed to guarantee the validity, reliability, and impartiality of the results.

Instructional Period – After the pretest, students participated in a full year of instruction in which they learned mathematics through the Department of Education’s Most Essential Learning Competencies (MELCs). Constructivist Learning Theory, which prioritizes experiential engagement, active learning, and knowledge creation through meaningful activities, was the foundation of instructional practice. Several constructivist teaching strategies were used in the classroom, including contextualized problem-solving, cooperative group projects, problem-based learning, and guided discovery. Teachers were also urged to use continuous formative assessments to track student development and make well-informed instructional adjustments. Based on pretest results, instruction was designed to support mastery of the competencies needing improvement.

Administration of the Posttest – The same group of students was given the GRACE PASS posttest following the instructional period. To assess learning gains accurately, the posttest was designed to be similar to the pretest in structure and difficulty. Like the pretest, the posttest was administered by qualified staff members under standardized conditions. The posttest findings were used as a summative assessment of student progress to determine increases in topic mastery, competency performance, and proficiency level distribution across the evaluated domains.

Data Tabulation and Organization – Student scores were gathered, validated, and entered into a secure database when the pretest and posttest were finished. Following that, the data were methodically tabulated and grouped according to (1) the mathematics content domains (Patterns and Algebra, Geometry, and Statistics and Probability); (2) the DepEd MELC-aligned targeted learning competencies; and (3) the GRACE PASS assessment tool’s proficiency levels (Beginning, Developing, Approaching Proficiency, Proficient, and Advanced).

This organization made a thorough evaluation of both individual and group performance possible. In order to find significant changes between pretest and posttest scores and to perform a gap analysis that identifies areas that need instructional reinforcement, the results were prepared for statistical testing. The analysis’s findings were the foundation for creating a focused intervention program that filled the observed learning gaps.
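The gap analysis described above can be sketched in a few lines: given posttest means per competency, flag those still below a mastery cut-off. The competencies and percentages below come from Table 2.1; the 40% threshold is an assumption used for illustration, not an official GRACE PASS cut-off.

```python
# Posttest means (%) for selected Patterns and Algebra competencies (Table 2.1)
posttest = {
    "Writes ax + by = c in the form y = mx + b": 12,
    "Factors completely different types of polynomials": 31,
    "Illustrates the slope of a line": 61,
    "Illustrates a system of linear equations": 66,
}

# Assumed cut-off: competencies below 40% are flagged for reinforcement
THRESHOLD = 40
needs_reinforcement = sorted(
    name for name, pct in posttest.items() if pct < THRESHOLD
)
```

A list like `needs_reinforcement` can then feed directly into the design of a focused intervention program.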

Statistical Treatment of Data

Various statistical techniques were used to examine the data gathered to answer the research objectives and assess how instruction affected students’ mathematics performance. Analyses were conducted to ascertain student achievement in topic mastery, competency performance, and proficiency level distribution using the GRACE PASS assessment tool’s pretest and posttest data. The following methods were applied:

Descriptive Statistics – Students’ performance levels in the three mathematics content areas, Patterns and Algebra, Geometry, and Statistics and Probability, were described using descriptive measures, including means, frequency counts, and percentages. Performance in particular learning competencies was likewise analyzed using descriptive statistics.

Proficiency Level Analysis – Student scores were categorized using the GRACE PASS assessment tool’s proficiency level descriptors: Beginning, Developing, Approaching Proficiency, Proficient, and Advanced. This classification was used to assess students’ mastery of the DepEd standards and to establish their distribution across proficiency levels.
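A minimal sketch of this categorization step is shown below. The numeric cut-offs are inferred from the score-to-level pairs reported in the results tables (e.g., 38% is Developing while 41% is Approaching Proficiency) and are assumptions; the official GRACE PASS band boundaries are not stated in the text.

```python
from collections import Counter

# Assumed percentage cut-offs, inferred from the reported tables
BANDS = [
    (80, "Advanced"),
    (60, "Proficient"),
    (40, "Approaching Proficiency"),
    (20, "Developing"),
    (0,  "Beginning"),
]

def classify(pct: float) -> str:
    """Map a percentage score to its proficiency level descriptor."""
    for cutoff, label in BANDS:
        if pct >= cutoff:
            return label
    return "Beginning"

# Hypothetical posttest scores; Counter gives the proficiency distribution
sample_scores = [41, 38, 31, 12, 66, 58]
distribution = Counter(classify(p) for p in sample_scores)
```

The resulting `distribution` is exactly the kind of frequency table reported per content area in the findings.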

Paired Sample t-test – This was used to compare the students’ mean pretest and posttest scores. In particular, competency-based performance and content-area mastery were examined to determine whether there were statistically significant differences in student performance before and after instruction.
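The paired-samples t statistic can be computed with nothing beyond the Python standard library, as in this sketch; the pretest and posttest scores here are hypothetical, not the study’s data.

```python
import math
from statistics import mean, stdev

def paired_t(pre: list[float], post: list[float]) -> tuple[float, int]:
    """Paired-samples t statistic and degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    d_bar = mean(diffs)            # mean gain
    sd = stdev(diffs)              # sample SD of the gains
    t = d_bar / (sd / math.sqrt(n))
    return t, n - 1

# Hypothetical matched percentage scores for ten students
pre  = [28, 35, 30, 22, 40, 25, 33, 29, 31, 27]
post = [36, 44, 39, 30, 47, 33, 41, 35, 40, 33]
t_stat, df = paired_t(pre, post)
```

The resulting t is compared against the critical value for df = n - 1 at the chosen alpha level; in practice, `scipy.stats.ttest_rel` also returns the p-value directly.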

FINDINGS AND RESULTS

Table 1. Level of student performance in the various Mathematics content areas, as reflected in the GRACE PASS pretest and posttest results.

n=137

Content Area Pre-Test (%) Description Post Test (%) Description
Patterns and Algebra 32 D 41 AP
Geometry 30 D 38 D
Statistics and Probability 25 D 31 D
Overall 30 D 38 D

Legend: B -Beginning, D-Developing, AP-Approaching Proficiency, P-Proficient, & A– Advanced

Table 1 presents the level of student performance in various Grade 8 Mathematics content areas based on the GRACE PASS pretest and posttest scores. The overall pretest mean score was 30%, classified under the Developing (D) level, indicating that most students partially understood the core concepts before instruction. After the intervention, the overall posttest mean improved slightly to 38%, remaining within the Developing category. Although this reflects an eight percentage point gain, it suggests that while students made progress, they still require further support to achieve complete content mastery.
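The percentage-point gains discussed here follow directly from Table 1; the arithmetic can be checked in a few lines:

```python
# Pretest and posttest mean scores (%) taken from Table 1
table1 = {
    "Patterns and Algebra":       (32, 41),
    "Geometry":                   (30, 38),
    "Statistics and Probability": (25, 31),
    "Overall":                    (30, 38),
}

# Gain in percentage points per content area
gains = {area: post - pre for area, (pre, post) in table1.items()}
```

The overall gain is eight percentage points, and Patterns and Algebra shows the largest domain-level gain at nine points.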

When examined by content area, the most significant improvement occurred in Patterns and Algebra, where students moved from 32% (Developing) in the pretest to 41% (Approaching Proficiency) in the posttest. This shift indicates that instruction in this domain was more effective in facilitating conceptual understanding and application skills, potentially due to its foundational nature and more direct integration into classroom activities. In contrast, Geometry and Statistics and Probability remained within the Developing level in both assessments, with only marginal improvements of eight and six percentage points, respectively. These results may reflect challenges in delivering these topics effectively, owing to abstract content, limited instructional time, or students’ prior misconceptions.

From a constructivist perspective, the incremental learning gains suggest that while students are beginning to construct mathematical knowledge, the depth and transfer of understanding may not yet be fully established. This aligns with Vygotsky’s Zone of Proximal Development concept, highlighting the need for more targeted scaffolding to move learners toward higher proficiency levels. Meanwhile, Bourdieu’s Social Reproduction Theory provides a lens to understand why some students may continue to struggle despite instruction; the persistent “Developing” rating overall may reflect disparities in access to prior learning support, cultural capital, or home learning environments.

These findings underscore the need for focused instructional strategies and differentiated remediation, especially in areas where posttest scores remained below the Approaching Proficiency threshold. According to Ramos and Mendoza (2021), teachers can evaluate students’ mastery of particular mathematical competencies using assessments that align with standards, such as GRACE PASS, highlighting the value of using it to inform responsive teaching and identify content areas that require intensified intervention.

Table 2.1 presents students’ performance in various learning competencies under the Patterns and Algebra content area, based on GRACE PASS pretest and posttest scores. The overall performance improved from 32% (Developing) in the pretest to 41% (Approaching Proficiency) in the posttest. This shift suggests moderate improvement in student understanding of the targeted competencies after instruction. However, the results also reveal significant variability across specific competencies, with some areas showing substantial gains while others stagnated or declined.

Table 2.1 Students’ performance in Patterns and Algebra learning competency before and after instruction.

n=137

Learning Competencies Pre-test (%) Description Post-test (%) Description
1.  Illustrates the rectangular coordinate system and its uses 36 D 55 AP
2. Solves problems involving systems of linear equations in two variables 44 AP 58 AP
3. Factors completely different types of polynomials (polynomials with common monomial factor, difference of two squares, sum and difference of two cubes, perfect square trinomials, and general trinomials) 35 D 31 D
4.  Illustrates a system of linear equations in two variables 52 AP 66 P
5. Finds the equation of a line given (a) two points; (b) the slope and a point; (c) the slope and its intercepts 35 D 43 AP
6.  Illustrates rational algebraic expressions 32 D 47 AP
7.  Graphs a system of linear equations in two variables 20 D 58 AP
8.  Illustrates the slope of a line 29 D 61 P
9.  Solves problems involving linear functions 30 D 37 D
10. Graphs a linear function’s (a) domain; (b) range; (c) table of values; (d) intercepts; and (e) slope 46 AP 30 D
11. Writes the linear equation ax + by = c in the form y = mx + b and vice versa 17 B 12 B
12. Simplifies rational algebraic expressions 22 D 27 D
13. Solves a system of linear equations in two variables by (a) graphing; (b) substitution; (c) elimination 26 D 33 D
14. Finds the slope of a line given two points, equation, and graph 20 D 28 D
15. Illustrates linear equations in two variables 35 D 23 D
16. Illustrates linear inequalities in two variables 23 D 36 D
17. Categorizes when a given system of linear equations in two variables has graphs that are parallel, intersecting, and coinciding 32 D 26 D
18. Solves problems involving linear inequalities in two variables 27 D 30 D
19. Solves problems involving systems of linear inequalities in two variables 24 D 25 D
Overall 32 D 41 AP

Legend: B -Beginning, D-Developing, AP-Approaching Proficiency, P-Proficient, & A– Advanced

Notably, students achieved the highest gains in “Illustrates the slope of a line” (from 29% to 61%, D to P) and “Graphs a system of linear equations” (from 20% to 58%, D to AP), indicating that visual and applied tasks may have been taught more effectively or retained better. The competency “Illustrates a system of linear equations in two variables” also showed strong performance, rising from 52% (AP) to 66% (Proficient). These gains align with Constructivist Theory, particularly Vygotsky’s emphasis on learning through visual, scaffolded, and context-based experiences: active construction of knowledge has been shown to increase student engagement, appreciation, and comprehension in real-world classroom settings (Santos et al., 2020; Villanueva & Cruz, 2021), suggesting that graphs and real-world representations helped students construct meaningful knowledge.

Conversely, there are learning competencies where performance declined or remained stagnant. For instance, “Factors completely different types of polynomials” dropped from 35% to 31%, and “Graphs a linear function’s domain, range, etc.” declined from 46% (AP) to 30% (D). Similarly, competencies like “Writes linear equations in the form y = mx + b and vice versa” remained low (17% to 12%, both at the Beginning level). These indicate possible instructional gaps, cognitive overload, or insufficient student readiness for abstract or procedural content.

Overall, the data affirm that instruction positively affected student learning in several competencies but also exposed specific areas that need enhanced teaching strategies, remediation, or conceptual reinforcement. These insights can inform targeted interventions, especially in areas involving algebraic manipulation, abstract representation, and transformation of equations.
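The gap analysis described above can be sketched programmatically. The following is a minimal illustration, not part of the study’s methodology: the scores are a small subset transcribed from Table 2.1, and the flagging rule (any non-positive gain marks a competency for remediation) is an assumption made for demonstration.

```python
# Sketch: compute percentage-point gains per competency and flag declines.
# Scores are a subset copied from Table 2.1; the flagging rule is assumed.
scores = {
    "Illustrates the slope of a line": (29, 61),
    "Graphs a system of linear equations": (20, 58),
    "Factors completely different types of polynomials": (35, 31),
    "Writes ax + by = c in the form y = mx + b": (17, 12),
}

for competency, (pre, post) in scores.items():
    gain = post - pre
    status = f"+{gain} pts" if gain > 0 else "flag for remediation"
    print(f"{competency}: {pre}% -> {post}% ({status})")
```

Running this over all 19 competencies would surface the same pattern discussed above: large gains on graphing tasks and negative gains on factoring and equation transformation.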

Table 2.2 Students’ performance in Geometry learning competency before and after instruction.

n=137

Learning Competencies | Pre-test (%) | QD | Post-test (%) | QD
1. Determines the relationship between the hypothesis and the conclusion of conditional statements | 40 | AP | 53 | AP
2. Illustrates the SAS, ASA, and SSS congruence postulates | 19 | D | 45 | AP
3. Applies theorems on triangle inequalities | 40 | AP | 39 | D
4. Illustrates triangle congruence | 24 | D | 49 | AP
5. Illustrates theorems on triangle inequalities (exterior angle inequality theorem, triangle inequality theorem, hinge theorem) | 50 | AP | 30 | D
6. Proves statements on triangle congruence | 29 | D | 46 | AP
7. Solves corresponding parts of congruent triangles | 35 | D | 44 | AP
8. Proves properties of parallel lines cut by a transversal | 29 | D | 28 | D
9. Transforms a statement into an equivalent if-then statement | 43 | AP | 31 | D
10. Uses inductive or deductive reasoning in an argument | 23 | D | 25 | D
11. Proves inequalities in a triangle | 43 | AP | 33 | D
12. Illustrates the need for an axiomatic structure of a mathematical system in general, and in geometry in particular: (a) defined terms; (b) undefined terms; (c) postulates; and (d) theorems | 29 | D | 15 | B
13. Applies triangle congruence to construct perpendicular lines and angle bisectors | 13 | B | 25 | D
14. Illustrates the equivalences of: (a) the statement and its contrapositive; and (b) the converse and the inverse of a statement | 19 | B | 38 | D
15. Writes a proof (both direct and indirect) | 31 | D | 28 | D
16. Determines the conditions under which lines and segments are parallel or perpendicular | 27 | D | 42 | AP
17. Determines the inverse, converse, and contrapositive of an if-then statement | 14 | B | 44 | AP
Overall | 30 | D | 37 | D

Legend: B -Beginning, D-Developing, AP-Approaching Proficiency, P-Proficient, & A– Advanced

Table 2.2 presents the pretest and posttest performance of Grade 8 students in various Geometry learning competencies based on GRACE PASS assessment data. The overall mean score improved slightly from 30% (Developing) in the pretest to 37% (Developing) in the posttest. Although this indicates a seven percentage-point gain, the results suggest that students’ progress in Geometry was limited, with many competencies remaining within the Developing category after instruction.

Some competencies showed encouraging improvements. Notably, “Illustrates the SAS, ASA, and SSS congruence postulates” improved from 19% to 45% (D to AP), and “Illustrates triangle congruence” rose from 24% to 49% (D to AP). Similarly, “Determines the inverse, converse, and contrapositive of an if-then statement” jumped from a low 14% (Beginning) to 44% (Approaching Proficiency). These results suggest that students were more likely to internalize visual, structured, or rule-based concepts when they were taught with clarity and scaffolding. This supports Constructivist Theory, particularly Vygotsky’s emphasis on using guided instruction and sequential tasks to develop understanding through the Zone of Proximal Development (ZPD) (Vygotsky, 1978).

However, several competencies showed little improvement, and some even declined. For example, “Illustrates theorems on triangle inequalities” dropped from 50% to 30% (AP to D), and “Transforms a statement into an equivalent if-then statement” decreased from 43% to 31%. These regressions raise questions about how the retention and transfer of abstract logic and proof-based reasoning were addressed during instruction. Additionally, “Illustrates the need for an axiomatic structure of a mathematical system” declined from 29% to 15% (D to B), suggesting that students struggled with foundational yet abstract meta-concepts in Geometry.

As students move from intuitive, experience-based spatial thinking to formal, rule-based geometric reasoning, constructivist teaching strategies in geometry must be thoughtfully designed to offer the required scaffolding (Liang et al., 2023). The authors contend that without this deliberate progression, students may enthusiastically participate in practical exercises yet fail to integrate and synthesize formal features and generalizations, limiting their capacity to apply what they have learned to abstract problems.

Overall, the data suggest that while some Geometry competencies were positively impacted by instruction, many areas require further support, especially those involving logical reasoning, abstract thinking, and formal proofs. These findings highlight the need for teachers to employ differentiated instruction, use of manipulatives, visual aids, and step-by-step scaffolding, particularly when teaching high-cognitive-demand Geometry concepts. Additionally, formative feedback and continuous reinforcement may be critical in helping students retain complex topics beyond initial instruction.

Table 2.3 presents the performance of Grade 8 students in the Data and Probability learning competencies, as measured by GRACE PASS pretest and posttest scores. The overall mean score increased slightly from 25% to 30%, but both scores remain within the Developing (D) proficiency level. This minimal gain of only five percentage points suggests that instruction had a limited impact on improving students’ mastery in this content area.

Among the individual competencies, the most notable improvement was observed in “Solves problems involving probabilities of simple events,” which rose from 36% (D) to 53% (Approaching Proficiency). This indicates that students could grasp basic probability concepts when presented in familiar or straightforward contexts. However, the majority of competencies remained at the Beginning or Developing level. For example, “Counts the number of occurrences in an experiment” and “Illustrates an experiment, outcome, sample space, and event” showed only slight gains, while “Illustrates experimental and theoretical probability” even declined from 41% to 26%.

Table 2.3 Students’ performance in Statistics and Probability learning competency before and after instruction.

n=137

Learning Competencies | Pre-test (%) | QD | Post-test (%) | QD
1. Solves problems involving probabilities of simple events | 36 | D | 53 | AP
2. Counts the number of occurrences in an experiment: (a) table; (b) tree diagram; (c) systematic listing; and (d) fundamental counting principle | 16 | B | 32 | D
3. Finds the probability of a simple event | 32 | D | 31 | D
4. Counts the number of occurrences of an outcome in an experiment: (a) table; (b) tree diagram; (c) systematic listing; and (d) fundamental counting principle | 29 | D | 27 | D
5. Illustrates an experiment, outcome, sample space, and event | 11 | B | 23 | D
6. Illustrates an experimental probability and a theoretical probability | 41 | AP | 26 | D
Overall | 25 | D | 30 | D

Legend: B -Beginning, D-Developing, AP-Approaching Proficiency, P-Proficient, & A– Advanced

Overall, the limited improvement in the Data and Probability strand highlights the need for enhanced instructional support, including using manipulatives, real-life applications, visual aids, and step-by-step guidance. Greater emphasis on building conceptual understanding before formalization may help bridge gaps and support deeper learning. Moreover, performance in this area reinforces the importance of equity-sensitive and differentiated instruction to address varied learning needs.

Table 3. Distribution of students across proficiency levels in the pretest and posttest

n=137

Proficiency Level | Pre-Test No. of Students | Pre-Test (%) | Post-Test No. of Students | Post-Test (%)
Beginning | 6 | 4 | 4 | 3
Developing | 117 | 84 | 82 | 60
Approaching Proficiency | 16 | 12 | 45 | 33
Proficient | 0 | 0 | 6 | 4
Advanced | 0 | 0 | 0 | 0
Total | 139 | 100 | 137 | 100

Table 3 illustrates the distribution of Grade 8 students across five proficiency levels—Beginning, Developing, Approaching Proficiency, Proficient, and Advanced—based on their performance in the GRACE PASS pretest and posttest. The results reveal a positive shift in student performance after instruction, though most remained below the Proficient level.

In the pretest, most students (117 out of 139, or 84%) were categorized as Developing, while only 12% reached the Approaching Proficiency level. Notably, no students were in the Proficient or Advanced categories, indicating a general lack of mastery of the mathematics competencies prior to instruction. This pattern suggests a need for substantial instructional support across the cohort at the start of the learning cycle.

After instruction, the posttest results show a clear improvement: the number of students in the Approaching Proficiency category increased from 16 to 45 (12% to 33%), and six students (4%) reached the Proficient level. The Developing category decreased from 84% to 60%, showing that several students advanced to higher proficiency levels. Additionally, the Beginning group decreased slightly from six to four students. Although these gains are promising, it is important to note that no student reached the Advanced level, and 63% remained at or below the Developing level.

In sum, while the posttest results indicate a positive learning trajectory, the data also highlight the need for more targeted interventions, equity-focused teaching strategies, and continuous assessment to support students, particularly those in the Developing and Beginning categories. This supports the rationale for using GRACE PASS for evaluation and as a diagnostic tool to guide instruction and remediation. As Lee and Lee (2021) emphasized, structured pretesting and post-testing enable the tracking of learners’ progress and the identification of misconceptions, which are essential for implementing tailored and responsive teaching approaches.
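To illustrate how the score-to-level tracking above might be automated, the sketch below maps a percentage score to the study’s qualitative descriptors and tallies a cohort into a proficiency distribution. The cutoff values are illustrative assumptions only, since the GRACE PASS band boundaries are not published here, so this function will not reproduce every classification in the tables above.

```python
from collections import Counter

# Hypothetical sketch: classify a percentage score into the proficiency
# descriptors used in the tables. The band cutoffs below are ASSUMED for
# illustration; the actual GRACE PASS boundaries are not given in the paper.
def proficiency_level(score: float) -> str:
    if score < 20:
        return "Beginning"
    elif score < 40:
        return "Developing"
    elif score < 60:
        return "Approaching Proficiency"
    elif score < 80:
        return "Proficient"
    return "Advanced"

# Toy cohort (fabricated scores): tally how many fall in each band.
posttest_scores = [41, 37, 12, 61, 55, 30]
distribution = Counter(proficiency_level(s) for s in posttest_scores)
print(distribution)
```

Comparing such a distribution for the pretest and the posttest is essentially what Tables 3 through 3.3 report.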

Table 3.1 Distribution of students across proficiency levels in the pretest and posttest (Patterns and Algebra)

n=137

Proficiency Level | Pre-Test No. of Students | Pre-Test (%) | Post-Test No. of Students | Post-Test (%)
Beginning | 12 | 9 | 7 | 5
Developing | 103 | 74 | 64 | 47
Approaching Proficiency | 22 | 16 | 50 | 37
Proficient | 2 | 1 | 15 | 10
Advanced | 0 | 0 | 1 | 1
Total | 139 | 100 | 137 | 100

The distribution of students across proficiency levels in Patterns and Algebra showed notable shifts from the pretest to the posttest. Before intervention, most students (74%) were classified as Developing, with only 1% reaching the Proficient level and none at the Advanced level. A small proportion (16%) were Approaching Proficiency, while 9% were at the Beginning level. Following the intervention, the proportion of students at the Developing level decreased substantially to 47%. At the same time, the percentage of students Approaching Proficiency more than doubled, rising to 37%. The number of students achieving Proficient status increased markedly to 10%, and one student (1%) reached the Advanced level. The proportion of students at the Beginning level also decreased to 5%.

These results indicate a clear improvement in students’ proficiency in Patterns and Algebra after the intervention. The reduction in the percentage of students at the Beginning and Developing levels, together with the increase in those at the Approaching Proficiency, Proficient, and Advanced levels, suggests that the instructional approach effectively promoted student learning and progression. Notably, the shift from 1% to 10% in the Proficient category and the emergence of a student in the Advanced category highlight significant gains among higher-achieving students. These findings support the effectiveness of the intervention in elevating overall student achievement in Patterns and Algebra, as evidenced by the improved distribution of proficiency levels in the posttest.

Table 3.2 Distribution of students across proficiency levels in the pretest and posttest (Geometry)

n=137

Proficiency Level | Pre-Test No. of Students | Pre-Test (%) | Post-Test No. of Students | Post-Test (%)
Beginning | 16 | 12 | 12 | 9
Developing | 107 | 76 | 77 | 56
Approaching Proficiency | 16 | 12 | 37 | 27
Proficient | 0 | 0 | 11 | 8
Advanced | 0 | 0 | 0 | 0
Total | 139 | 100 | 137 | 100

The data analysis in Table 3.2 reveals a clear shift in students’ proficiency levels in Geometry from pretest to posttest. Before the intervention, most students (76%) were at the Developing level, while 12% each were at the Beginning and Approaching Proficiency levels. Notably, no students achieved Proficient or Advanced status in the pretest. After the intervention, the proportion of students at the Developing level decreased to 56%, and those at the Beginning level dropped to 9%. The percentage of students Approaching Proficiency more than doubled to 27%. Additionally, 8% of students reached the Proficient level in the posttest, though no students attained the Advanced level.

The results demonstrate a positive impact of the intervention on students’ Geometry proficiency. The substantial reduction in students at the Beginning and Developing levels, alongside increases in the Approaching Proficiency and Proficient categories, suggests that the instructional strategies employed effectively promoted student progress. The emergence of students in the Proficient category (from 0% to 8%) is particularly noteworthy, indicating that some students made significant gains. However, the absence of students at the Advanced level in both assessments suggests that while the intervention supported overall improvement, further strategies may be needed to help students reach the highest proficiency levels. Overall, the data indicate meaningful advancement in Geometry understanding among the participants.

Table 3.3 Distribution of students across proficiency levels in the pretest and posttest (Statistics and Probability)

n=137

Proficiency Level | Pre-Test No. of Students | Pre-Test (%) | Post-Test No. of Students | Post-Test (%)
Beginning | 42 | 30 | 32 | 24
Developing | 81 | 58 | 63 | 46
Approaching Proficiency | 14 | 11 | 40 | 29
Proficient | 2 | 1 | 2 | 1
Advanced | 0 | 0 | 0 | 0
Total | 139 | 100 | 137 | 100

Table 3.3 shows the distribution of students across proficiency levels in Statistics and Probability before and after the intervention. In the pretest, the majority of students were at the Developing level (58%), followed by Beginning (30%), Approaching Proficiency (11%), and only 1% at the Proficient level. No students reached the Advanced level. After the intervention, the percentage of students at the Developing level decreased to 46%, and those at the Beginning level dropped to 24%. Notably, the proportion of students Approaching Proficiency increased substantially to 29%. The percentage of students at the Proficient level remained unchanged at 1%, and no students achieved the Advanced level in the posttest.

The results indicate a positive shift in students’ proficiency in Statistics and Probability following the intervention. There was a marked decrease in the proportion of students at the Beginning and Developing levels, and a significant increase in those Approaching Proficiency. This suggests that the intervention was effective in moving a considerable number of students towards higher proficiency. However, the percentage of students at the Proficient level did not increase, and no students achieved the Advanced level. This indicates that while the intervention supported overall improvement, additional strategies may be needed to help students reach the highest proficiency tiers. The data reflect meaningful progress in foundational understanding, with room for further growth at advanced levels.

Table 4 presents the results of the paired sample t-test comparing pretest and posttest scores across three content areas: Patterns and Algebra, Geometry, and Statistics and Probability, as well as the overall performance. The analysis shows statistically significant improvements in all areas, with p-values less than 0.05, leading to the rejection of the null hypothesis in each case.

In Patterns and Algebra, the mean score increased from 31.93 (SD = 10.73) in the pretest to 40.58 (SD = 14.74) in the posttest, with a t-value of 5.904 and p-value of 0.000. This indicates a significant improvement in students’ understanding after instruction, suggesting that the interventions applied in this strand effectively addressed learning gaps. Similar results were observed in Geometry, where the mean rose from 29.82 to 37.45, with a highly significant t-value of 5.683 (p = 0.000). These findings are consistent with the constructivist view that scaffolded, concept-focused instruction enhances student understanding when delivered effectively over time.

Table 4. Difference between the pretest and posttest scores of the students in terms of content area mastery.

n=137

Content Area | Test | Mean | Std. Deviation | Std. Error Mean | t-value | p-value | Decision
Patterns and Algebra | Pretest | 31.9333 | 10.73229 | 0.92369 | 5.904 | 0.000 | Reject Ho
Patterns and Algebra | Posttest | 40.5778 | 14.74027 | 1.26864 | | |
Geometry | Pretest | 29.8222 | 9.07997 | 0.78148 | 5.683 | 0.000 | Reject Ho
Geometry | Posttest | 37.4519 | 14.25080 | 1.22651 | | |
Statistics and Probability | Pretest | 23.9111 | 13.37321 | 1.15098 | 3.501 | 0.001 | Reject Ho
Statistics and Probability | Posttest | 30.3481 | 15.68549 | 1.34999 | | |
Overall | Pretest | 30.0222 | 7.20019 | 0.61969 | 7.720 | 0.000 | Reject Ho
Overall | Posttest | 37.8741 | 11.43863 | 0.98448 | | |

In Statistics and Probability, while the initial pretest mean was lower at 23.91, the posttest mean rose to 30.35, with a t-value of 3.501 and p-value of 0.001. Although still statistically significant, this smaller t-value suggests that gains in this area were more modest, possibly due to the abstract nature of probability and data interpretation or time constraints in covering the topic.

The overall mean performance also reflected meaningful progress, increasing from 30.02 to 37.87, supported by a t-value of 7.720 and a p-value of 0.000. These results confirm that instruction, as measured through the GRACE PASS framework, positively impacted student learning across all domains.
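The paired-sample comparison reported in Table 4 can be reproduced in outline as follows. This is a from-scratch sketch using only the Python standard library; the score lists are fabricated toy data, since the study’s raw per-student scores are not published, so the resulting t value will not match the table.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre: list[float], post: list[float]) -> float:
    """t statistic for the paired differences post - pre (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # mean difference divided by the standard error of the differences
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Fabricated toy scores for six students (NOT the study's data, n = 137).
pre = [30, 25, 40, 35, 28, 33]
post = [38, 31, 45, 41, 30, 40]
print(f"t = {paired_t(pre, post):.3f}")
```

With the study’s n = 137, the same statistic would be evaluated against 136 degrees of freedom to obtain the reported p-values.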

While the gains are encouraging, the results indicate a continued need for targeted interventions, particularly in content areas where posttest scores remain below mastery levels. This reinforces the importance of using assessment data not only to measure learning but also to inform and adapt instructional strategies that address diverse student needs. According to Park and Kim (2020), assessment designs incorporating pretest-posttest measures provide valuable diagnostic insights that allow teachers to make timely and responsive adjustments to their teaching. Similarly, Abad and Francisco (2021) found that posttest gains were greatest when instruction was tailored to address specific learning gaps identified in pretests rather than strictly following a fixed curriculum. These findings underscore the value of tools like GRACE PASS both for evaluation and as a means of enabling equity-focused, data-informed, and flexible teaching practices.

CONCLUSIONS

The findings of this study reveal that instruction guided by the GRACE PASS framework had a statistically significant impact on student performance in Grade 8 Mathematics. Students demonstrated notable improvements across all three content areas (Patterns and Algebra, Geometry, and Statistics and Probability), as evidenced by higher posttest scores and the rejection of the null hypothesis in all cases. The most substantial gains were observed in Patterns and Algebra, which reached the Approaching Proficiency level, while Geometry and Statistics and Probability remained within the Developing category. These results suggest that while the instruction was generally effective, its impact varied across content areas, with some competencies remaining difficult for students to master.

The analysis of specific learning competencies further revealed that students benefited most from visual and structured tasks, aligning with constructivist principles of scaffolding and active learning. However, persistent low scores in certain abstract or procedural topics, particularly in Geometry and Probability, indicate challenges in conceptual understanding and knowledge retention. Most students remained at or below the Developing proficiency level, underscoring the need for continued academic support. This outcome affirms the relevance of Social Reproduction Theory, suggesting that disparities in cultural and academic capital may contribute to uneven progress among learners, even with instructional interventions.

Overall, the GRACE PASS tool proved helpful for performance evaluation and as a diagnostic instrument that can inform targeted remediation and adaptive instruction. While the posttest results indicate a positive learning trajectory, the data emphasize the need for ongoing, equity-focused strategies to close performance gaps and help all students progress toward higher proficiency levels.

RECOMMENDATIONS

Based on the findings and conclusions, the following recommendations are proposed:

Differentiate Instructional Strategies – Teachers should intensify support in areas where students remain at the Developing level, especially in Geometry and in Statistics and Probability. This includes using manipulatives, visual aids, and step-by-step modeling to scaffold abstract and procedural concepts.

Strengthen Diagnostic Use of GRACE PASS – Schools are encouraged to use GRACE PASS for assessment and as a formative diagnostic tool to continuously monitor students’ understanding and adjust instruction based on pretest results and ongoing classroom data.

Focus on Conceptual Reinforcement – Instruction should emphasize deep, conceptual learning over procedural memorization. Reinforcing foundational concepts through real-life contexts and repeated exposure can help solidify understanding in underperforming areas.

Provide Targeted Remediation and Enrichment – Implement structured remediation programs for beginning and lower-developing students while offering enrichment opportunities for those nearing Proficiency to ensure differentiated learning support.

Support Equity in Instructional Access – Schools and policymakers should address contextual and structural factors affecting student performance, such as access to learning materials, home support, and instructional time, especially for learners from underserved backgrounds.

Conduct Longitudinal Monitoring – Future studies should track students’ learning over multiple quarters or school years to assess long-term gains and the sustained impact of GRACE PASS–aligned instruction under curriculum changes, such as the transition to the MATATAG curriculum.

REFERENCES

1. Abad, & Francisco, M. (2021). Instructional modifications based on pretest weaknesses: An outcome-based approach. Journal of Educational Interventions, 19(2), 55–68.
2. Alonzo, D. (2020). Impact of structured formative assessments on academic. Philippine Journal of Education, 96(2), 34–48.
3. Andrade, L., & Brookhart, S. M. (2020). Classroom assessment as the foundation for teaching and learning. Harvard Education Press.
4. Becerra, R., & Diaz, C. (2023). Constructive learning and algebraic mastery in middle. Mathematics Education Review, 18(1), 41–57.
5. Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5–31. https://doi.org/10.1007/s11092-008-9068-5
6. Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.), Handbook of theory and research for the sociology of education (pp. 241–258). Greenwood Press.
7. Bourdieu, P., & Passeron, J.-C. (1977). Reproduction in education, society and culture (R. Nice, Trans.). Sage Publications.
8. Chang, & Zhao, Y. (2021). Pre-assessment-driven intervention programs in mathematics. Journal of Mathematics Learning, 14(3), 221–240.
9. Delgado, M., & Rivera, G. (2023). Competency-specific assessments and student. Journal of Curriculum Studies, 35(4), 108–122.
10. Dominguez, & Salazar, P. (2022). Assessing competency alignment in classroom assessments. Philippine Journal of Educational Assessment, 27(2), 58–73.
11. Espinoza, & Cruz, F. (2023). Pretest-posttest utilization and its correlation with standardized test success. Educational Measurement and Policy Studies, 11(1), 12–26.
12. Garcia (2022). Comparing constructivist and traditional teaching in Statistics and Probability. Asian Journal of Mathematics Education, 20(2), 130–144.
13. Garcia, M., & Molina, S. (2021). Using formative assessments to enhance learning. Teaching Mathematics in Asia, 16(1), 23–38.
14. Hattie, J. (2020). Visible learning: A synthesis of over 1,600 meta-analyses relating to achievement (Updated ed.). Routledge.
15. Hattie, J., & Zierer, K. (2021). 10 mindframes for visible learning: Teaching for success. Routledge.
16. Ikpuri, E. O. (2023). The role of social reproduction theory in understanding the issue of inequality in the United States education system. International Journal of Latest Research in Humanities and Social Science, 6(9), 140–146.
17. Johnson (2022). Predictive value of pretest-posttest scores in mathematics domains. International Journal of Educational Research, 45(2), 94–106.
18. Lee, & Lee, J. (2021). Assessment for learning: The role of pre- and post-tests. Asia-Pacific Education Journal, 33(1), 67–82.
19. Liang, J., Hwang, G. J., & Chen, S. Y. (2023). Effects of a visual analytics-supported inquiry-based learning model on learners’ mathematical. Educational Technology & Society, 26(1), 45–60.
20. Luo (2025). Is social reproduction a useful theoretical lens for understanding the relationship between education and career destinations? Scientific and Social Research, 7(5), 137–145. https://doi.org/10.26689/ssr.v7i5.10675
21. Martinez, & Ramos, S. (2021). Identifying learning gaps in geometry and algebra through pretest-posttest. Middle School Mathematics Review, 15(2), 112–126.
22. Nguyen, Lopez, M., & de Vera, C. (2020). Targeted intervention on number sense. International Journal of Numeracy Education, 8(1), 78–91.
23. OECD. (2019). PISA 2018 results: Combined executive summaries. https://www.oecd.org/pisa
24. Ortega, N., Santos, G., & Villaruel, A. (2021). The effect of competency-driven instruction on academic achievement. Educational Innovations Journal, 19(3), 88–104.
25. Park, J., & Kim, H. (2020). The role of pretest-posttest designs in instructional decision-making. Teaching and Teacher Education, 90, 103019.
26. Piaget, J. (1952). The origins of intelligence in children. International Universities Press.
27. Ramos, & Mendoza, A. (2021). Standards-aligned assessment and competency acquisition. Journal of Curriculum and Instruction, 12(2), 77–90.
28. Santos, Villanueva, F., & Cruz, M. (2020). Constructivist pedagogy in Filipino mathematics classrooms. Philippine Journal of Constructivist Education, 8(2), 101–116.
29. Suhaili (2025). An assessment of the role of education in promoting social equality among selected public secondary schools in Jolo. Journal of Education and Academic Settings, 2(1), 1–28. https://doi.org/10.62596/1h2w1k67
30. Tan, & Chua, A. (2022). Long-term comprehension of measurement concepts through evaluation. Mathematics Instruction Journal, 11(3), 39–53.
31. Villanueva, & Cruz, A. (2021). Students’ experiences in constructivist math instruction. Asian Journal of Constructivist Teaching, 10(1), 66–78.
32. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.

APPENDIX A

School Profile Report on GRACE PASS


APPENDIX B

Students' Competency Performance on GRACE PASS Pre-Test


APPENDIX C

Students' Competency Performance on GRACE PASS Post-Test


APPENDIX D

Students’ Performance Progress on GRACE PASS (Pre Test & Post Test)

APPENDIX E

 Students' Proficiency Distribution on GRACE PASS (Pre Test & Post Test)


APPENDIX F

Students' Class Report (Section) on GRACE PASS (Pre Test)


APPENDIX G

Students' Class Report (Section) on GRACE PASS (Post Test)

