International Journal of Research and Innovation in Applied Science (IJRIAS)

Performance-Based Assessment Task in Mathematics: A Standard-Based Practice in the Face of Health Crisis

*Janet F. Rabut, PhD

Faculty, Sultan Kudarat State University

ACCESS, EJC Montilla, 9800 City of Tacurong, Sultan Kudarat, Philippines

DOI: https://doi.org/10.51584/IJRIAS.2024.90339

Received: 02 March 2024; Accepted: 13 March 2024; Published: 18 April 2024

ABSTRACT

This study utilized the Performance-based Assessment Task (P-bAT) to evaluate respondents on two key criteria: (a) accuracy and procedural skills and (b) conveyance of logical and mathematical ideas. The tool was rigorously assessed for its quality features, namely (a) clarity of learning goals, (b) appropriateness of the assessment tool, (c) fairness, and (d) real-life relevance, before being implemented face-to-face despite the challenges posed by the COVID-19 pandemic. The findings indicated that students performed at a moderate to high level because the tasks were appropriately challenging yet achievable, leading to above-average performance. Subsequently, a select group of five respondents engaged in a focus group discussion to probe their comprehension of the tool in terms of (a) understanding of the tasks, (b) knowledge of how to answer the tasks, and (c) awareness of scoring methods. Participants acknowledged the uniqueness and difficulty of the P-bAT, recognizing its holistic measurement of abilities and skills. The tool’s emphasis on challenging students’ critical thinking was evident, with grading conducted using rubrics to ensure accurate, efficient, fair, and reliable assessment. This approach instilled confidence in students that their grades reflected their true capabilities and efforts.

Keywords: performance-based, accuracy, procedural, mathematical communication, rubric

INTRODUCTION

In an effort to establish standard-based assessment and grading practices, the researcher administered the Performance-based Assessment Task (P-bAT) to the students. Performance-based methods of learning and assessment are reliable goals for standards-based instruction and show promise for change, since performance-based assessment involves students engaging in an extended, meaningful mathematical task while teachers facilitate and assess their learning. Danielson (2016) stated unequivocally that assessment provides valuable information for both teachers and students about how well everyone is doing. Performance evaluations can therefore also be used to assess student learning.

Tacurong National High School is the only secondary school offering the STEM (Science, Technology, Engineering, and Mathematics) strand in the entire City Schools Division of Tacurong. The school strictly follows the assessment standards stipulated in the Policy Guidelines on Classroom Assessment for the K to 12 Basic Education Program of the Department of Education (DO No. 8, s. 2015). In the grading system the school follows, the Performance Task component carries the highest weight at 45%, while Written Output carries 25% and the Periodic Examination 30%.
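As a quick worked illustration of this weighting (the component scores below are hypothetical, and the DepEd transmutation of the initial grade into the report-card scale is omitted), a quarterly grade is the weighted sum of the three component percentage scores:

\[ \text{Initial Grade} = 0.25\,\text{WO} + 0.45\,\text{PT} + 0.30\,\text{PE} \]

\[ \text{e.g. } 0.25(80) + 0.45(90) + 0.30(85) = 20 + 40.5 + 25.5 = 86. \]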

STATEMENT OF THE PROBLEM

This study primarily evaluated the students’ performance levels in mathematics using the Performance-based Assessment Task (P-bAT). Specifically, it sought to:

1. assess the quality level of the performance-based assessment tasks in mathematics in terms of:

  1. clarity of learning goals;
  2. appropriateness of performance-based assessment tool;
  3. fairness; and
  4. real-life relevance.

2. describe students’ perceptions of performance-based assessment tasks in terms of their:

  1. understanding of the tasks;
  2. knowledge in answering the tasks; and
  3. awareness of how these tasks were scored; and

3. evaluate the respondents’ performance level in terms of:

  1. accuracy and procedural skills; and
  2. conveyance of logical and mathematical ideas.

METHODOLOGY

This study employed a mixed-method research design. Johnson and Onwuegbuzie (2007) state that mixed-method research is, generally, a tactic for generating knowledge that attempts to consider multiple viewpoints, perspectives, positions, and standpoints. In this study, every respondent played a significant part in helping the researcher achieve her primary goal; hence, two sets of respondents were utilized.

The first set of respondents comprised the mathematics teachers of the entire division handling mathematics subjects in senior high school. The second set comprised the Grade 11 STEM students who successfully passed General Mathematics in the first semester of S.Y. 2020-2021.

The primary research instruments used in this study were three (3) performance-based assessment task activities formulated by the researcher herself. The topics were taken from three course contents of the General Mathematics learning guide: functions, basic business mathematics, and logic. These instruments used a scoring rubric as a guide in assigning points to determine the respondents’ performance level on the activities. The second research instrument was a questionnaire, rated by the mathematics teachers, that evaluated the quality level of the performance-based assessment tasks in terms of clarity of learning goals, appropriateness of the performance-based assessment tool, fairness, and real-life relevance.

After the performance-based assessment tasks were conducted, group discussions were scheduled at the selected students’ most convenient time and place. Most interview questions were asked exactly as they were written, and a cellular phone was used to record the video. In line with O’Brien and Tabaczynski (2007), the researcher believes that the interview is a widely used tool for accessing people’s experiences and their inner perceptions, attitudes, and feelings of reality. The researcher ensured that the study was conducted according to its purposes: participation of both teachers and students was voluntary, and the respondents’ anonymity was treated with the utmost confidentiality.

FINDINGS

In order to identify similarities and differences in the results, the findings were discussed in the light of previous research and available literature, where applicable.

On the Quality Level of P-bAT (Performance-based Assessment Task)

A quality assessment tool produces a quality score, according to Whiting et al. (2003). The researcher-formulated P-bAT, with its accompanying rubric, was rated by the teachers on these characteristics: clarity, appropriateness of the performance-based assessment tool, fairness, and real-life relevance. Each tool was rated on a three-point Likert scale, where 1 (“Low Quality”) was the lowest rating and 3 (“High Quality”) the highest.

Table 1 shows the mean distribution results on the quality level of the researcher’s main tool. The table reflects that all three performance-based assessment tasks were rated “High Quality”. The teachers believed that the tools were formulated meticulously according to the competencies intended for Grade 11 learners and that each tool measured what it was intended to measure according to its purpose.

Table 1. Mean Distribution on the Quality Level of Performance-based Assessment Tasks; n = 28.

Task Characteristics            Mean    Interpretation
Clarity                         2.58    High Quality
Appropriateness of P-bAT Tool   2.62    High Quality
Fairness                        2.46    High Quality
Real-life Related               2.54    High Quality
P-bAT Overall Mean              2.55    High Quality

Legend:  1.00 – 1.66 (Low Quality); 1.67 – 2.34 (Moderate Quality); 2.35 – 3.00 (High Quality).
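Expressed programmatically, the legend’s cut-offs amount to a simple banding rule; here is a minimal sketch (the function name is ours, for illustration only):

```python
def interpret_mean(mean_score: float) -> str:
    """Map a 3-point Likert mean to the study's quality legend."""
    if not 1.00 <= mean_score <= 3.00:
        raise ValueError("mean must lie within the 1-3 Likert range")
    if mean_score <= 1.66:
        return "Low Quality"
    if mean_score <= 2.34:
        return "Moderate Quality"
    return "High Quality"

print(interpret_mean(2.55))  # "High Quality", matching the P-bAT overall mean
```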

Where performance-based assessment is concerned, the clarity of the problem’s instructions and task criteria must be highlighted for both instructors and students. This helps teachers determine whether students know what to do; when students know what is expected of them, they are far more likely to produce their answers. The “appropriateness” of the tool is also important: every teacher must ensure that the topic reflected in the task falls within the core content of the subject. Since students vary in learning styles, the assessment varies as well, and teachers must formulate performance-based tasks appropriate for demonstrating understanding among their students. When it comes to “fairness,” the teacher must ensure that the given tasks can be understood by all students, with consideration for learners’ needs and characteristics and any reasonable adjustments that need to be made.

Next, in order for mathematics to be understood engagingly, the problem must be “real-life related” so that students reach beyond the four walls of their classroom and seek possible solutions to real-world problems. Apart from this, the rubric plays an important role in assigning appropriate scores to the P-bAT tools. The researcher was guided by pre-set criteria so that consistent, fair, and accurate scoring was employed. As Moskal (2000) said, scoring rubrics are typically employed when a judgment of “quality” is required and may be used to evaluate a broad range of subjects and activities. In addition, Goe (2007) said that high-quality performance-based assessment requires multiple measures and sources of evidence, as well as several opportunities to test. The researcher agrees with this author, since the tools were formulated to provide enough evidence to cover every purpose.

To sum it up, “Quality Performance” tasks provide opportunities for students to think deeply and critically, to reason and construct mathematical arguments, and to make sense of problems that aren’t merely an application of an algorithmic process already learned (Danielson and Marquez, 2016).

On the Perceptions of Students Regarding Performance-based Assessment Tasks

The results of the qualitative part of this study were based on an in-depth focus group discussion, drawing on the complete transcript of the data. To answer research question number 2, the researcher gathered comprehensive data on students’ perceptions of the performance-based assessment tasks in terms of:

a.) understanding of the tasks;

b.) knowledge in answering the tasks; and

c.) awareness of how these tasks were scored.

Table 2 shows the results on the students’ perceptions toward the P-bAT conducted. It answers how students understood the specific tool as part of their learning progress.

Table 2. Students’ Understanding of Performance-based Assessment Task.

Question: How do you understand the term “assessment”?
Participants’ statements: “…a sort of teachers’ judgment of how far we have learned from him/her.”; “…it measures how much we have learned.”; “…it measures how we understand the topic after the lecture.”; “…a learning activity that determines the capability of the students.”
Codes: Measures students’ progress; Measures students’ level of learning; Process of gathering bases of grades
Theme: Evaluation Tool (A)

Question: What comes to your mind when you hear the words “Performance-based Task”?
Participants’ statements: “…we are measured here not only on the new topic but also on the previous one.”; “…our performance as students is tested and rated here.”; “…a student’s basis for how much his/her performance has improved.”; “…a student shows here how far his/her capabilities reach.”
Codes: Evidence of learning; Demonstration of knowledge; Bases of performances
Theme: Quality Tangible Product (A)

Question: Can you tell me something about the content of the P-bAT tool?
Participants’ statements: “…it needs critical and logical thinking.”; “…it really challenges my knowledge.”; “…the questions are quite tricky and we really exerted effort to answer them.”
Codes: Quite difficult; More complicated; Quite tricky compared with usual assessments
Theme: Thought-Provoking Tool (A)

Question: How different is P-bAT from the usual PBTs you have encountered?
Participants’ statements: “…it has many follow-up questions that make you stop and think first.”; “…it is not really difficult, but it truly tests your thinking; it is on a different level compared with other tasks.”; “…quite baffling!”
Codes: Contains more follow-up questions; More difficult than usual PBTs
Theme: Quality Tool (A)

Note: A – Understanding of the tasks

As revealed in Table 2, four themes were formed that summarize how students understood the tools they encountered in their mathematics class: evaluation tool, quality tangible product, thought-provoking tool, and quality tool. When asked how they understood the word “assessment,” students expressed their answers confidently.

“…a sort of teachers’ judgment on how far we have learned.” – STEM 01

“…it measures how much we have learned.” – STEM 02

“…it measures how we understand the topic after the lecture.” – STEM 03

“…a learning activity that determines the capability of the students.” – STEM 04

On the other hand, students distinguished forms of assessment as simple or more complex. They anticipated that right after a lecture, an assessment follows to gauge how much they have learned from the topics discussed by their instructors. As a result, they were aware that performance-based evaluations were not always used. When asked how they understood “performance-based task,” they responded knowledgeably.

“…we are measured here not only on the new topic but also on the previous one.” – STEM 03

“…our performance as students is tested and rated here.” – STEM 01

“…a student’s basis for how much his/her performance has improved.” – STEM 05

“…a student shows here how far his/her capabilities reach.” – STEM 04

Performance-based assessment, in general, assesses students holistically in terms of their ability to apply the skills and knowledge they have acquired. It is a performance-based task where sets of strategies are put together to measure how a particular student progresses. This means that performance-based tasks were formulated differently from the usual assessments that the students encountered in the classroom.

“…it requires critical and logical thinking.” – STEM 01

“…it really challenges my knowledge.” – STEM 03

“…the questions are quite tricky and we really exerted effort to answer them.” – STEM 04

Performance-based tasks come in various forms, and senior high school instructors at Sultan Kudarat State University also used varied forms of performance tasks. Apart from this, when asked how different the P-bAT was from the usual assessments they encountered inside the classroom, the students took a strong stance.

“…it has twists and turns that are quite tricky; you really have to use your best to answer it.” – STEM 02

“…it has many follow-up questions that make you stop for a while and think first.” – STEM 01

“…it is not really difficult, but it truly tests your thinking; it is on a different level compared with other tasks.” – STEM 04

“…quite baffling!” – STEM 05

Performance-based assessment measures students’ ability in terms of what they know and what they are able to do. To make this possible, as Hill, Ball, and Schilling (2008) stated, a teacher must possess the necessary skills to apply various pedagogical methods that facilitate deep conceptual understanding in students.

Table 3 presents the results on the students’ perceptions of their knowledge of the P-bAT conducted. Three P-bAT tools were administered to the STEM students, and seven themes summarize the students’ perceptions of the P-bAT.

Table 3. Students’ Knowledge about the Performance-based Assessment Tasks.

Question: Can you tell me your experiences when you answered the P-bAT?
Participants’ statements: “…time-pressured, because of the small amount of time given, which made me nervous.”; “…I was curious because the P-bAT was new to me; I am not used to it.”; “…I felt pressured because at first I thought I could not answer it.”; “…half difficult, half not.”
Codes: Time-pressured; Curiosity; Having a hard time
Theme: Unstoppable Experiences (B)

Question: What was your first move before answering the P-bAT?
Participants’ statements: “…I read everything first and understood it.”; “…first, I read the instruction; second, I understood the problem.”
Codes: Reading; Understanding/Analyzing
Theme: First Move (B)

Question: Did you find it difficult to answer the P-bAT?
Participants’ statements: “…I struggled with the P-bAT given at first because I could not recall the formula to be used.”; “…it was quite difficult for me, yet I somehow got an average score.”; “…for me, Ma’am, it was not really that difficult; it was just right.”
Code: Quite difficult
Theme: P-bAT Tool (B)

Question: Did you get to the point where you wanted to give up on answering the task?
Participants’ statements: “…it is difficult, but it is not the kind of task you really need to give up on.”; “…I never thought of surrendering because I knew the P-bAT Ma’am gave would really help me.”; “…and why should I surrender?”; “…yes, Ma’am! There were parts I could not understand.”
Code: Confusing yet challenging
Theme: Decision Making (B)

Question: Which part of the P-bAT did you find difficult to answer?
Participants’ statements: “…the explanation parts of the tasks, because I did not think outside the box.”; “…I interpreted my conclusion literally because of the title (Forever is Real).”; “…the solving part of P-bAT 1, because I was too careless in answering it.”
Codes: Misleading title; Carelessness; Literal interpretation
Theme: Mathematical Communication (B)

Question: Which part of the P-bAT did you find easy to answer?
Participants’ statements: “…I feel the truth table is what I really mastered.”; “…the truth table, because it has a pattern, and investments as well.”; “…it was easier to answer because you have a basis, especially in the truth table.”
Codes: Tabular presentations; Understanding patterns; Business Math
Theme: Problem Solving is Fun (B)

Question: What were your bases in giving your answers in every task?
Participants’ statements: “…I really based it on the correct formula and principles of Math that you taught.”; “…I based it on the correct principles of the previous topics that I jotted down.”; “…I based it on the right procedures of Gen. Math that you discussed, Ma’am.”; “…you will not forget everything all the time, especially if the topic is mentioned again.”; “…after reading and comprehending it, you somehow get a hint.”; “…you can refresh what you learned anyway.”; “…in case you forget, you can ask somebody.”
Codes: Previous lessons/topics; Notes
Theme: Lecture Notes (B)

Note: B – Knowledge about the Tasks

Students gained different experiences in answering every performance-based assessment task. Some were positive, others negative, particularly when dealing with word problems. While answering the P-bAT tools, the STEM students experienced fear: fear of running out of time and fear of not getting the correct answers.

“…time-pressured, because of the small amount of time given, which made me nervous.” – STEM 01

“…I was curious because the P-bAT was new to me; I am not used to it.” – STEM 02

“…I felt pressured because at first I thought I could not answer it.” – STEM 03

“…half difficult, half easy.” – STEM 05

Every assessment naturally requires time, and the time required depends on the instructor’s choice of assessment. Despite what they experienced during the conduct of the P-bAT, the students still managed to continue answering the assessment. Moving on, before answering the P-bAT, the students made a plan for understanding the given tasks, which they elaborated on in this context.

“…I read everything first and understood it.” – STEM 02

“…I read the instruction first, then understood the problem.” – STEM 01

One step in resolving a word problem is to read through it and create a word equation; in other words, the students here made the right first move before answering the tool. In a similar manner, performance-based tasks range from simply created responses to complex designs and vary in form. With this, students gave their views on how the P-bAT was formulated.

“…I struggled with the P-bAT given at first because I could not recall the formula to be used.” – STEM 03

“…it is quite difficult for me, but nevertheless I got an average score.” – STEM 01

“…it was quite difficult for me here.” – STEM 02

“…for me, Ma’am, it was not really that difficult; it was just right.” – STEM 05

As expected, since the P-bAT was essentially a problem-solving assessment, students naturally concluded that it was not simply made. According to Hilliard (2015), one characteristic of performance-based assessment is “complexity”: to challenge students, the task must require higher-order thinking skills to create a product or complete a process (Chun, 2010). The students were indeed challenged in answering the tool, as they expressed; in fact, some of them almost gave up answering, while others went straight through just to accomplish the tasks.

“…yes, it is difficult, but it is not the type of task you really need to give up on.” – STEM 01

“…I never thought of surrendering because I knew the P-bAT Ma’am gave would really help me.” – STEM 03

“…and why should I surrender?” – STEM 02

“…yes, Ma’am! There were instances I could not answer.” – STEM 04

In line with the statements above, because of the tasks’ follow-up questions, some students skipped the harder parts and focused on the easier ones, but never to the point of giving up on answering. Word problems can be tricky, but teachers need to assess students’ mathematical knowledge in ways that allow students to show what they really understand (Van de Walle, 2003).

Next, every assessment should include at least three levels of difficulty: easy, moderate, and difficult questions. In this context, however, students were asked only about the easy and the difficult parts of the P-bAT.

“…the explanation parts of the tasks, because I did not think outside the box.” – STEM 03

“…the solving part of P-bAT 1, because I was too careless in answering it.” – STEM 02

“…I interpreted my conclusion literally because of the title (Forever is Real).” – STEM 05

Students elaborated different responses in this regard. Some got stuck on certain questions because they took their meaning literally, without realizing that once they figured out what the problem was, their conclusion would surely follow. Greer (1997) and Verschaffel, Corte, and Lasure (1994) pointed out that the most reported difficulty is that students do not take into account common-sense considerations about the problem. Every answer has its own foundation, coming either from the right principles or from a theory. When asked where they based their answers, the students responded quickly with the statements below.

“…I really based it on the correct formula and principles of Math that you taught.” – STEM 02

“…I based it on the correct principles of the previous topics that I jotted down.” – STEM 01

“…I based it on the right procedures of Gen. Math that you discussed, Ma’am.” – STEM 04

These statements show that the students knew exactly what to do, despite getting minimal scores on the P-bAT tool; because the tool was quite tricky, this may be why they were not able to get high scores. Lastly, on what they would do in case they forgot the topics included in the tasks, students cleverly answered that it had never been a problem at all: they sorted through several options so that they could still accomplish the tasks.

“…you will not forget everything all the time, especially if the topic is reviewed again.” – STEM 03

“…after reading and comprehending it, you somehow get a hint.” – STEM 01

“…in case you forget, asking somebody can help.” – STEM 05

It is natural for human beings to forget things at times, especially what we hold in mind; it does not mean the knowledge has vanished forever. The same is true of students: they may quickly forget the topics their teachers discussed, for any number of reasons. Nevertheless, given the several options students have for recalling possible solutions to the tasks, they find ways to retrieve them.

Table 4 summarizes the findings on how the performance-based assessment was scored in this context. Three themes are summarized in the statements: Standardized Rubric, DepEd Grading System Components, and Standard Scoring System.

Table 4. Students’ Awareness of How Performance-based Assessment Tasks were Scored.

Question: How do you understand a rubric?
Participants’ statements: “…it serves as a guideline; it raises a standard structure or criteria for any school work.”; “…it is a guideline that guides you on what to do.”; “…the rubric tells us the things we need to enhance.”; “…it is used by the teachers to check any activity or task given to students.”; “…it is a basis or guide that helps not only the teacher but the students as well.”
Codes: Basis for scoring; Criteria for scoring; Guidelines for scoring
Theme: Standardized Rubric (C)

Question: What is your stand on the DepEd breakdown of percentages in the grading system?
Participants’ statements: “…the performance task should really be allotted a higher percentage than the other written tasks.”; “…the performance task should have the highest percentage because it is more of a hands-on activity.”; “…it is right that the performance task has a higher percentage because it tests the capacity of what you have learned.”
Codes: Objective breakdown of percentages; Demonstration of knowledge; Bases of performances
Theme: DepEd Grading System Components (C)

Further statements on rubric scoring: “…I feel it really gives a fair score.”; “…it gives fair judgment and also motivates us to learn more and pay attention.”; “…it is a fair judgment because through it I learn what I lack and what I should improve and focus on.”; “…the true essence of the rubric is equality, fairness, to avoid bias.”
Theme: Standard Scoring System (C)

Note: C – Awareness of how P-bAT were scored

The rubric was critical in scoring the P-bAT tool. The researcher made her own appropriate rubric as a guide for scoring every task. Since rubrics were habitually employed by the teachers as a way of assessing students’ work, the students knew exactly what a rubric is all about.

“…it serves as a guideline; it raises a standard structure or criteria for any school work.” – STEM 01

“…it is a guideline that guides you on what to do.” – STEM 02

“…the rubric tells us the things we need to enhance.” – STEM 03

“…it is used by the teachers to check any activity or task given to students.” – STEM 06

“…it is a basis or guide that helps not only the teacher but the students as well.” – STEM 01

Given the statements above, the students were truly aware of what happens in the classroom where their performance is concerned. This indicates that rubrics are highly regarded as tools that increase reliability and validity in assessment (Ross-Fisher, 2005). In addition, Dawson (2015) states that the use of rubrics is strongly advocated for educators and institutions because of their consistent and convenient scoring.

Next were the students’ responses on the breakdown of the Department of Education grading system, wherein the biggest portion of the percentage, 45% of the students’ grades, is allotted to the “Performance Task”.

“…the performance task should really be allotted a higher percentage than the other written tasks.” – STEM 03

“…the performance task should have the highest percentage because it is more of a hands-on activity.” – STEM 02

“…it is right that the performance task has a higher percentage because it tests the capacity of what you have learned.” – STEM 05

Since a performance task is an action or process of carrying out or accomplishing a task or function, and not a simple one at that, it is right to give this component a bigger portion of students’ grades. This echoes the suggestion of Alsardary et al. (2016) that performance-based assessments require students to engage in certain activities or create products to demonstrate their academic knowledge and abilities. When it comes to rubrics, the majority of the students agreed on the benefits rubrics bring to both students and teachers, and the teachers’ view matched the students’ perceptions: rubrics are of great help where scoring is concerned.

“…I feel it really gives a fair score.” – STEM 03

“…it gives fair judgment, and it also motivates us to learn more and to pay attention.” – STEM 01

“…it judges fairly, because through the rubric I found out what I lack and what I should improve and focus on.” – STEM 02

“…the true essence of the rubric is equality, fairness, to avoid bias.” – STEM 03

“…without a rubric, a student may complain, ‘you are biased, Sir; you are biased, Ma’am,’ even when you did nothing biased; with a rubric, you have a basis.” – STEM 05

Numerous studies have found that using a rubric improves the reliability of teachers’ assessments (Jonsson & Svingby, 2007; Silvestri & Oescher, 2006). Conversely, assessment without a rubric tends to be more subjective because it rests only on overall impressions of students’ work. With this in mind, teachers often resolve that using a rubric is better than not using one (Spandel, 2006).

On the Students’ Performance Level Using the Performance-based Assessment Tasks

Students’ scores are essential in computing their grades in mathematics. The Department of Education’s grading system comprises three components: the performance task (45%), written outputs (25%), and periodic examinations (30%). In this context, since the researcher herself was the respondents’ teacher, she computed the students’ grades using their respective scores on the performance-based assessment tasks. Table 5 shows the grading scales of the STEM students.

Table 5. Frequency Distribution of Students’ Performance Level; n = 32.

Grading Scale   Frequency of Grades (WO 25%, PT 45%, PE 30%)   Description
90 – 100        9                                              Outstanding
85 – 89         15                                             Very Satisfactory
80 – 84         6                                              Satisfactory
75 – 79         2                                              Fairly Satisfactory
Below 75        0                                              Did Not Meet Expectations
Total           32

Legend: Grading scale adopted from DepEd Form 138; O – Outstanding, VS – Very Satisfactory, S – Satisfactory, FS – Fairly Satisfactory, DNME – Did Not Meet Expectations; Written Output (WO); Performance Task (PT); Periodical Examination (PE); Performance-based Assessment Task (P-bAT).

Table 5 reveals the grades computed with WO at 25%, PT at 45%, and PE at 30%. The performance tasks in this context were the researcher-made P-bATs rated “High Quality” by the senior high school teachers. The scores comprise 35 points for P-bAT 1, 55 points for P-bAT 2, and 50 points for P-bAT 3. The results show that most students reached the “Very Satisfactory” level, with no one at the “Did Not Meet Expectations” level, implying that the students worked successfully in answering the performance-based assessment tasks. Consequently, it is of prime importance for mathematics teachers to determine what constitutes a “high-level” task, in order to assess whether a task can provide the types of learning opportunities that promote students’ understanding (Boston & Wolf, 2006). As a final point, proper assessment tools in the teaching and learning process are required to quantify students’ understanding and ability fairly and equally (Othman et al., 2012).
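To make the computation concrete, here is a minimal sketch of the weighting described above (the helper names and sample scores are ours, and the DepEd transmutation table is again omitted):

```python
# Component weights used throughout this study (WO 25%, PT 45%, PE 30%)
WEIGHTS = {"WO": 0.25, "PT": 0.45, "PE": 0.30}

def percentage_score(raw: float, highest: float) -> float:
    """Convert a raw score to a percentage of the highest possible score."""
    return raw / highest * 100

def initial_grade(wo_pct: float, pt_pct: float, pe_pct: float) -> float:
    """Weighted sum of the three component percentage scores."""
    return WEIGHTS["WO"] * wo_pct + WEIGHTS["PT"] * pt_pct + WEIGHTS["PE"] * pe_pct

# Hypothetical student: PT comes from the three P-bATs (35 + 55 + 50 = 140 points)
pt = percentage_score(120, 140)  # 85.71
wo = percentage_score(40, 50)    # 80.00
pe = percentage_score(43, 50)    # 86.00
print(round(initial_grade(wo, pt, pe), 2))  # 84.37, "Satisfactory" on the Table 5 scale
```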

Furthermore, Table 6 reveals the results on the significant relationship between the quality level of the performance-based tasks and the students’ grades.

Table 6. Relationship between the Quality Level of the P-bAT and the Performance Level of the Students.

Category                         Mean     r         p-value   Significance
1  Quality Level of P-bAT        2.457    0.453**   0.000     Moderate Positive, Significant
2  Students’ Performance Level   84.686

Legend: P-bAT (Performance-based Assessment Task); ** Correlation is significant at the 0.01 level.

As reflected in Table 6, there was a moderate positive correlation of 0.453. This holds because when the performance-based tasks were made just right for the students, neither too difficult nor too easy, the students achieved an above-average performance level. As stated in the study by Boston and Wolf (2006), an assessment tool for measuring the “quality of instruction” is needed to provide an informative accountability system in education; such a tool should be capable of describing the quality of teaching and learning that occurs in actual classrooms.
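For readers who wish to reproduce this kind of analysis, a Pearson correlation between paired quality ratings and grades can be computed as below; the data here are hypothetical, not the study’s raw scores:

```python
from scipy.stats import pearsonr

# Hypothetical paired observations: perceived task quality vs. computed grade
quality_ratings = [2.3, 2.6, 2.4, 2.8, 2.5, 2.9, 2.2, 2.7]
grades = [81, 86, 83, 90, 84, 91, 80, 88]

r, p_value = pearsonr(quality_ratings, grades)
# By the usual bands, |r| between 0.40 and 0.59 is read as a moderate correlation
print(f"r = {r:.3f}, p = {p_value:.4f}")
```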

On the Evaluation of Students’ Performance Level

Evaluating students’ achievement through performance assessments is not a new strategy; good teachers have always judged and monitored their students’ progress through it. What is new in this study is the evaluation of how well students performed on high-quality performance-based assessment tasks. The students were evaluated on their performance level in terms of accuracy and procedural skills and mathematical communication. Rubrics played a very important role in this context, ensuring consistency, fairness, and reliability in assigning scores to every student.

Reliability test results of 88% for students’ accuracy and procedural skills and 95% for mathematical communication imply that the tools were at an “acceptable level” according to Hulin, Netemeyer, and Cudeck (2001) and were reliable for testing.
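The paper does not name the reliability coefficient behind the 88% and 95% figures; assuming an internal-consistency statistic such as Cronbach’s alpha, it could be computed as in this sketch (the score matrix is hypothetical):

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for a (respondents x items) rubric-score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                              # number of rubric items
    item_variances = item_scores.var(axis=0, ddof=1)      # per-item variance
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical rubric scores for six respondents on four items
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 4, 4, 4],
          [3, 3, 3, 4],
          [2, 3, 2, 2],
          [4, 4, 3, 4]]
print(round(cronbach_alpha(scores), 2))  # 0.9 for this sample
```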

To answer research question number 3, on the students’ performance level using the P-bAT tool in terms of (a) accuracy and procedural skills and (b) conveyance of logical and mathematical ideas, Table 7 reveals the results.

Table 7. Percentage Distribution on the Performance Level of the STEM Students; n = 32.

Level     Accuracy and Procedural Skills     Conveyance of Logical and Mathematical Ideas
          f        %                         f        %
Level 1   0        0.00                      0        0.00
Level 2   4        12.50                     18       56.25
Level 3   22       68.75                     12       37.50
Level 4   5        15.625                    2        6.25
TOTAL     32       100                       32       100

With regard to “Accuracy and Procedural Skills,” the students concentrated at Level 3, while for “Conveyance of Logical and Mathematical Ideas” they fell to Level 2, though with only a slight difference from Level 3. This demonstrates that students excel at accuracy and procedural skills rather than at mathematical communication: they can apply procedures correctly and recognize appropriate strategies to use, but they have a slight weakness in communicating and justifying the procedures they used. In problem solving, it is not enough to give an answer; the answer also needs to be communicated. Word problems are indeed among the most difficult forms of problems in mathematics encountered by learners (Verschaffel et al., 2000). Similarly, the studies by Greer (1997) and Verschaffel, Corte, and Lasure (1994) pointed out that the most common difficulty with word problems is that students neglect common-sense considerations about the problem, which affects the processes of formulating the mathematical problem and interpreting the mathematical results; as a consequence, many students fail to solve the posed problems correctly. Nonetheless, teachers and students must work together to address this problem using quality tools. Despite a task’s level of difficulty, once it is disclosed to students properly and sincerely and an appropriate rubric is used for every performance-based task, the result is a consistent, fair, and reliable rating for the students.

CONCLUSIONS

Performance-based methods of learning and assessment are a reliable approach for standard-based practices, even during the ongoing pandemic. These methods involve students actively engaging in extended and meaningful mathematical tasks, while teachers facilitate and assess their learning. In the City Schools Division of Tacurong, student-centered activities have always been emphasized to ensure learning. This is why performance-based assessment tasks were introduced to senior high school levels, allowing students to demonstrate what they have learned and enhance their skills and abilities.

The moderate positive correlation between the quality level of the tool and the students’ performance level is not by chance but a clear manifestation that the result is reliable. We cannot deny that “problem solving” is widely used around the globe with the main purpose of developing students’ critical thinking; this form of performance-based task must be of good quality before it is given to students.

The Department of Education strongly advocates quality education for all. However, this can only be achieved if teachers choose quality assessment tools. The challenge to all teachers, therefore, is to pick an assessment tool not merely for compliance but one that offers opportunities to assess thinking and articulation: an assessment tool of good quality that allows them to give feedback on how well the students perform in every task.

RECOMMENDATIONS

Based on the findings of this study, the following recommendations were made:

  • As the P-bAT was essentially problem-solving, collaborative effort is recommended some of the time, rather than individual performance most of the time, so that students work together without leaving questions unanswered;
  • Performance-based tasks must be employed in varied forms, but each must have a corresponding rubric in order to assign appropriate scores;
  • Since the results revealed that students perform better in accuracy and procedural skills than in the conveyance of logical and mathematical ideas, teachers should focus on bringing these two aspects into balance; and
  • Further study on the P-bAT is recommended to evaluate students’ performance levels using other factors.

ACKNOWLEDGMENT

I want to convey my heartfelt gratitude to all my professors at USM who always assisted me in finishing this study and to SKSU for the assistance provided. I am quite grateful to my husband, Dr. Benedict A. Rabut, and to our only daughter, BJ, for their love and moral support. Lastly, all praises and thanksgiving to our God almighty for all the blessings and guidance all through my days.

DECLARATION OF INTEREST STATEMENT

I hereby declare that this research paper was written by me and has never been published in a journal or other print source. Should anything change, especially any issue arising once it is accepted for publication, I promise to inform you.

REFERENCES

  1. Alsardary, S., Pontiggia, L., & Blumberg, P. (2016). primary trai math part 2, (April).
  2. Boston, M., & Wolf, M. K. (2006). Assessing academic rigor in mathematics instruction: The development of the Instructional Quality Assessment toolkit, 1522(310).
  3. Chun, M. (2010). Taking teaching to (performance) task: Linking pedagogical and assessment practices. Change: The Magazine of Higher Learning, 42(2), 22–29. https://doi.org/10.1080/00091381003590795
  4. Danielson, C. (2016). Performance tasks and rubrics for high school mathematics. https://doi.org/10.4324/9781315695259
  5. Danielson, C., & Marquez, E. (2016). Performance tasks and rubrics for middle school mathematics (2nd ed.). New York.
  6. Dawson, P. (2015). Assessment rubrics: Towards clearer and more replicable design, research and practice. Assessment & Evaluation in Higher Education, 1–14. doi:10.1080/02602938.2015.1111294
  7. Department of Education (2015). DO No. 8, s. 2015: Policy Guidelines on Classroom Assessment for the K to 12 Basic Education Program.
  8. Goe, L. (2007). The link between teacher quality and student outcomes: A research synthesis. Washington, DC: National Comprehensive Center for Teacher Quality, (October), 1–72.
  9. Greer, B. (1997). Modelling reality in mathematics classrooms: The case of word problems. Learning and Instruction, 7(4), 293–307. doi:10.1016/S0959-4752(97)00006-6
  10. Hill, H. C., Ball, D. L., & Schilling, S. G. (2008). Unpacking pedagogical content knowledge: Conceptualizing and measuring teachers’ topic-specific knowledge of students. Journal for Research in Mathematics Education, 39(4), 372–400.
  11. Hilliard, P. (2015). Performance-based assessment: Reviewing the basics. Retrieved September 2018 from https://www.edutopia.org/blog/performance-based-assessment-reviewing-basics-patricia
  12. Hulin, C., Netemeyer, R., & Cudeck, R. (2001). Can a reliability coefficient be too high? Journal of Consumer Psychology, 10(1), 55–58.
  13. Johnson, R. B., & Onwuegbuzie, A. J. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112–133. https://doi.org/10.1177/1558689806298224
  14. Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130–144.
  15. Moskal, B. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research & Evaluation, 7(3), 2–7.
  16. O’Brien, W. H., & Tabaczynski, T. (2007). Unstructured interviewing. In Handbook of clinical interviewing with children (pp. 16–29). https://doi.org/10.4135/9781412982740.n2
  17. Onwuegbuzie, A. J., & Collins, K. M. T. (2007). A typology of mixed methods sampling designs in social science research. The Qualitative Report, 12(2).
  18. Othman, H., Asshaari, I., Bahaludin, H., Nopiah, Z. M., & Ismail, N. A. (2012). Application of Rasch measurement model in reliability and quality evaluation of examination paper for engineering mathematics courses. Procedia – Social and Behavioral Sciences, 60, 163–171. https://doi.org/10.1016/j.sbspro.2012.09.363
  19. Ross-Fisher, R. L. (2005). Developing effective success rubrics. Kappa Delta Pi Record, 41(3), 131–135.
  20. Silvestri, L., & Oescher, J. (2006). Using rubrics to increase the reliability of assessment in health classes. International Electronic Journal of Health Education, 9, 25–30.
  21. Spandel, V. (2006). In defense of rubrics. English Journal, 96(1), 19–22.
  22. Van de Walle, J. A. (2003). Designing and selecting problem-based tasks. In F. Lester (Ed.), Teaching mathematics through problem solving: Prekindergarten–grade 6 (pp. 67–80). Reston, VA: NCTM.
  23. Verschaffel, L., Corte, E. D., & Lasure, S. (1994). Realistic considerations in mathematical modeling of school arithmetic word problems. Learning and Instruction, 4(4), 273–294. doi:10.1016/0959-4752(94)90002-7
  24. Verschaffel, L., Greer, B., & de Corte, E. (2000). Making sense of word problems. Vol. 33, ISBN 9026516282.
  25. Whiting, P., Rutjes, A. W. S., Reitsma, J. B., Bossuyt, P. M., & Kleijnen, J. (2003). The development of QUADAS: A tool for the quality assessment of studies of diagnostic accuracy included in systematic reviews. BMC Medical Research Methodology, 3, 25. https://doi.org/10.1186/1471-2288-3-25
