
Examining and Validating Summative Tests Used in Competency-Based Assessment for TESDA Technology Institutions (TTI): Basis for Training Design

Glenn M. Gambi, Ph.D.1, Olga C. Alonsabe, Ph.D.2

1Department of Education, Division of Malaybalay City

2Capitol University, Cagayan de Oro City

DOI: https://dx.doi.org/10.47772/IJRISS.2024.807074

Received: 23 June 2024; Accepted: 03 July 2024; Published: 03 August 2024 

ABSTRACT

The implementation of competency-based training delivery in Philippine TVET has long been emphasized by TESDA. However, since its implementation, the trainers at COBSAT, a TESDA Technology Institution, have not thoroughly examined the teacher-made summative tests used to assess the competencies trainees are expected to master. Like other tests in schools, these teacher-made tests were used to evaluate learning outcomes without first being analyzed. Hence, this study examines the validity of the summative tests developed by trainers in Technical Vocational Education and Training (TVET) institutions. The study employed the descriptive method of research. It focused on examining and validating the different tests constructed by TVET trainers and measuring their reliability for internal consistency, using item and content analysis. The study also delved into the perceptions of trainers and trainees of the process of developing the test questions used in assessments and into the concerns and challenges encountered by the TVET trainers in constructing test questions. The findings revealed that most test items fell below fair or were poor items, and thus need revision or deletion. The internal-consistency estimates indicated that the tests are good for classroom use; however, some items need improvement. Factors pertaining to trainees' characteristics, the lack of learning resources, and the trainers' lack of competence in test construction were the concerns and challenges identified as facing trainers in developing test questions.

Keywords: test item analysis, test construction, test validity and reliability, technical vocational education and training, competency-based assessment

INTRODUCTION

Republic Act 7796, otherwise known as the Technical Education and Skills Development Act of 1994, created the Technical Education and Skills Development Authority (TESDA) as the government agency tasked to manage and supervise technical education and skills development in the Philippines. TESDA, a leading partner in the development of technical vocational education, has adopted competency-based training delivery in all its TESDA Technology Institutions (TTIs) across the country.

The implementation of competency-based training delivery in Philippine Technical Vocational Education and Training (TVET) has long been emphasized by the Technical Education and Skills Development Authority (TESDA). Further, the Training Regulations (TRs) TESDA has promulgated for the different qualifications embed the principles of competency-based TVET in all training delivery. Competency-based training emphasizes the actual potential of the learners and its outcomes rather than the learning process within a specified time. As Norton (1987) observed regarding the elements of competency-based training delivery, assessment of competency takes the trainee's competence as the primary source of evidence, and the criteria to be used in assessing, as well as the conditions under which achievement will be assessed, should be explicitly stated.

The purpose of conducting competency-based assessment in all training programs offered by TTIs is to ensure that trainees master and acquire the required competencies of a particular training qualification. Hence, careful attention should be given to developing competency assessments, particularly to constructing the test questions incorporated into and used in the competency-based assessment, and to making certain that the knowledge required in the workplace is acquired during school-based training.

However, since the implementation of competency-based training in technical vocational education and training, trainers at the Cagayan de Oro (Bugo) School of Arts and Trades had not thoroughly examined the teacher-made questions used in assessment. The predicament observed by the researcher echoes the concern of Cosiñero (2010), who clamored that teacher-made tests are used in assessing learners in the classroom without having been analyzed to ascertain their quality (validity and reliability). This observation on the construction of tests by trainers also relates to Namoc (2008), who discussed that teacher-made tests are in most cases used in classrooms to evaluate learning outcomes, yet their validity and reliability remain uncertain because such tests are seldom subjected to item analysis.

Hence, this study was initiated to determine (1) the demographic profile of the trainee respondents in terms of age, gender, and highest educational attainment, and of the trainer respondents in terms of gender, length of teaching service as a TVET trainer, and highest educational attainment; (2) the validity and reliability of the test questions constructed by the trainers; (3) the perceptions of the trainers and trainees of the process of developing test questions; (4) the significant difference between the trainers' and trainees' perceptions of the process of developing test questions; (5) the challenges encountered by the trainers in developing test questions; and (6) a proposed enhancement program to address the concerns and challenges identified in the study.

The study is anchored on analyzing item responses through classical test theory, introduced by Lord and Novick (1968) and elaborated by Rivera (2007). According to Fan (1998), classical test theory generally focuses on test-level information and item statistics, which include item difficulty and item discrimination. Item difficulty and item discrimination are the test characteristics trainers are most interested in; together with options analysis, they are the statistics most frequently reported when performing the item analysis technique. Item difficulty measures the proportion or percentage of examinees who answered an item correctly, while item discrimination is the ability of an item to distinguish between examinees with higher and lower levels of knowledge. In test construction, distractors are the options that are not the correct answer, written to reflect examinees' common misconceptions. In options analysis, each option should receive at least three percent of the combined responses of the upper and lower groups.
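To make these indices concrete, the sketch below computes the difficulty index, the upper/lower-group discrimination index, and per-option distractor shares from a matrix of examinee responses. It is a minimal Python illustration, not the instrument used in the study; the function names, the data layout, and the common 27% grouping fraction are all assumptions of this sketch.

```python
# Minimal classical item analysis sketch (hypothetical; not the study's tool).
# responses: one list of chosen options per examinee; key: correct options.

def item_analysis(responses, key, group_frac=0.27):
    n = len(responses)
    # Score each examinee, then rank to form the upper and lower groups.
    scores = [sum(r[j] == key[j] for j in range(len(key))) for r in responses]
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    g = max(1, round(group_frac * n))
    upper, lower = order[:g], order[-g:]
    results = []
    for j, correct in enumerate(key):
        p = sum(r[j] == correct for r in responses) / n      # item difficulty
        u = sum(responses[i][j] == correct for i in upper)
        low = sum(responses[i][j] == correct for i in lower)
        d = (u - low) / g                                    # item discrimination
        results.append({"item": j + 1, "difficulty": round(p, 2),
                        "discrimination": round(d, 2)})
    return results

def distractor_shares(responses, item_index, options):
    # Share of responses each option drew for one item; options drawing
    # under roughly three percent invite revision or replacement.
    n = len(responses)
    return {o: sum(r[item_index] == o for r in responses) / n for o in options}
```

In common classical-test-theory practice, a difficulty index near 0.50 and a discrimination index above roughly 0.30 mark a usable item, which is the sense in which items are graded good, fair, or poor.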

Furthermore, Rivera (2007) discussed statistics commonly used in classical test theory, such as the alpha coefficient and the variance. He added that classical test theory determines reliability values through internal consistency methods such as the Kuder-Richardson 20 (KR-20).
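For reference, the standard textbook form of the KR-20 coefficient (stated here for clarity; the formula itself is not reproduced in the paper) is

$$\mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^{2}}\right),$$

where $k$ is the number of items, $p_i$ is the proportion of examinees answering item $i$ correctly, $q_i = 1 - p_i$, and $\sigma_X^{2}$ is the variance of the total test scores. Higher values indicate that the items hang together more consistently.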

In this study, the principles of test construction, that is, the steps of planning, preparing, testing, and evaluating teacher-made tests as presented by Ross (1941), were examined. Planning for the test primarily includes outlining the subject matter, preparing the table of specifications, and selecting the appropriate type of test item for evaluating learning outcomes. Preparation includes writing the test items and selecting items according to the prepared table of specifications. Testing and evaluating include performing item analysis to determine difficulty, discrimination, and reliability.

Teacher-made tests and exams are considered criterion-referenced tests, a term coined by Glaser (1963). Such tests are developed to observe students' learning of given material and use test scores to interpret a trainee's performance, that is, whether the trainee learned the concepts of the material provided. Rivera (2007) further cited Popham (1993), whose work corroborated Glaser's (1963) on criterion-referenced testing.

The study is moored on the idea of examining and validating test construction in technical vocational education and training, specifically in a TESDA Technology Institution (TTI). The study made use of the input-process-output research paradigm.

As shown in Figure 1, the input of the study covers the demographic profile of the trainee respondents in terms of age, gender, and highest educational attainment, and likewise the profile of the TVET trainers in terms of gender, length of teaching service as a TVET trainer, and highest educational attainment. The different test questions constructed by the trainers and the tables of specifications of the sampled training qualifications were also used in the study.

The study gathered and processed the data quantitatively, supported by random follow-up interviews of the respondents to generate further information and validate the data collected from their responses.

Primarily, the process involved item analysis of the test items in the tests developed by the TVET trainers.

Figure 1. The Research Paradigm of the Study

The content validity of the test questions constructed by the TVET trainers was checked against the training standards stipulated in the promulgated training regulations of the sampled training qualifications, and was analyzed using the constructed table of specifications (TOS) indicating the competency, number of hours, percentage per topic, number of items, and distribution of item content. The reliability of the test questions made by the TVET trainers handling Automotive Servicing NC II, Bartending NC II, Food and Beverage Services NC II, and Housekeeping NC II was estimated for internal consistency using the Kuder-Richardson Formula 20 (KR-20).
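To illustrate how a table of specifications distributes items, the short sketch below allocates a fixed item count in proportion to each competency's share of nominal training hours. The competencies, hours, and item total are invented placeholders, not figures from the study.

```python
# Hypothetical TOS item allocation: items per competency are proportional
# to that competency's share of training hours (all figures invented).
competencies = {
    "Provide housekeeping services to guests": 40,  # nominal hours (invented)
    "Prepare rooms for guests": 30,
    "Provide valet/butler and laundry services": 30,
}
total_items = 50

total_hours = sum(competencies.values())
for name, hours in competencies.items():
    share = hours / total_hours
    items = round(share * total_items)  # any rounding residue is typically
                                        # assigned to the largest topic
    print(f"{name}: {share:.0%} of hours -> {items} items")
```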

The respondents' perceptions of the process of developing test questions in technical vocational education and training were measured using the weighted mean, while the significant difference between the respondents' perceptions was determined using the t-test for independent means. Moreover, the study processed the data to determine the different concerns and challenges faced by the trainers in developing test questions.
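A minimal sketch of the comparison described here, using SciPy's independent-samples t-test on hypothetical per-indicator ratings, is shown below; the arrays are invented placeholders, not the study's data.

```python
# Hypothetical trainer vs. trainee Likert-type ratings on one indicator
# of the test-development process (values invented for illustration).
from scipy import stats

trainer_ratings = [4.5, 4.0, 4.5, 5.0, 4.0, 4.5]
trainee_ratings = [3.5, 4.0, 3.0, 3.5, 4.0, 3.0, 3.5]

# Independent-samples t-test on the two groups' mean ratings.
t_stat, p_value = stats.ttest_ind(trainer_ratings, trainee_ratings)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 would indicate a
                                               # significant difference in
                                               # perception
```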

The output of the study is a proposed enhancement program to address the different concerns and challenges faced by the trainers in developing the test questions used in Technical Vocational Education and Training competency-based assessment.

METHODS

The study utilized the descriptive research design. Salaria (2012), citing Aggarwal (2008), defines descriptive research as the gathering of information about current conditions or situations for the purpose of description and interpretation. This research method does not merely amass and tabulate facts but includes proper analysis, interpretation, and identification. The instruments used to obtain data in descriptive studies include, but are not limited to, questionnaires and interviews.

Furthermore, to gather additional data and probe the information in depth, the researcher conducted random interviews. These served to validate the trainers' perceptions of the process of developing the test questions used in technical vocational education and training.

RESULTS AND DISCUSSION

The purpose of this study was to examine and validate the summative tests used in technical vocational education and training, specifically at the Cagayan de Oro (Bugo) School of Arts and Trades, a TESDA Technology Institution (TTI). Likewise, the study sought to identify the concerns and challenges faced by the trainers in developing the test questions used in assessments. The respondents were eleven (11) TVET trainers and ninety (90) trainees from the training qualifications of Automotive Servicing NC II, Bartending NC II, Food and Beverage Services NC II, and Housekeeping NC II. The respondents were asked to provide the information called for in the researcher-made survey questionnaire: their perceptions of the process of developing test questions, and the concerns and challenges encountered by TVET trainers in developing the test questions used in the competency-based assessment. Furthermore, the trainee respondents were asked to answer the tests developed and used by their trainers in the competency-based assessment. The test questions were treated with the item analysis technique.

The study revealed that the majority of the trainee respondents were between the ages of 18 and 19 (41.11%), female (60%), and mostly college undergraduates. Likewise, the study found that most of the trainer respondents were female (72.73%), had been teaching as TVET trainers for around 1-10 years (54.55%), and had mostly earned master's units (45.45%) in education.

The item analysis performed on the test questions constructed and used by the TVET trainers in the sampled training qualifications revealed that the majority of the test items need revision.

Content validation revealed that the test questions constructed by the TVET trainers matched the training standards from the training regulations of the sampled qualifications. The constructed Table of Specifications, a very important step in test construction, showed that the National Certificate Level II (NC II) tests were based on the thinking levels of knowledge, comprehension, and application, with the greater percentage of the constructed test questions falling under knowledge and comprehension.

Table 1 presents the summary results of the reliability of test questions constructed by the TVET trainers.

Table 1 Summary Results of the Reliability of Test Questions Constructed by the TVET Trainers

Training Qualification | Reliability (KR-20) | Interpretation
Automotive Servicing NC II | 0.71 | Good for a classroom test; in the range of most. There are probably a few items which could be improved.
Bartending NC II | 0.66 | Somewhat low. This test should be supplemented by other measures (e.g., more tests) for grading.
Food and Beverage Services NC II | 0.86 | Very good for a classroom test.
Housekeeping NC II | 0.81 | Very good for a classroom test.

As presented in Table 1, the test questions constructed in Food and Beverage Services NC II and Housekeeping NC II show very high reliability, meaning they are very good for classroom tests. The reliability of the Automotive Servicing NC II test questions was good for a classroom test, in the range of most, indicating a few items that could probably be improved. Meanwhile, the Bartending NC II test questions disclosed somewhat low reliability. Overall, this implies that the tests of the sampled training qualifications were good for classroom testing, the Bartending NC II test excepted. However, according to Lyman (1971), as cited by Alonsabe (2007), and Wells and Wollack (2003), several factors affect reliability, among them the length of the test and the quality of its items, trainees' guessing of the correct answers, the heterogeneity of the examinees, and the length of testing time.

The trainer respondents considered the competency standards in making the tests. The table of specifications aids the trainers in designing test questions and was identified as the first step in the process of constructing them. The use of a variety of question formats was perceived as often practiced: aside from multiple choice, the formats include true-false, matching-type, essay, and enumeration questions. Still, according to the TVET trainers, the multiple-choice format is the most common and widely used, specifically during institutional assessments. The trainee respondents corroborated this observation.

In addition, the trainees disclosed that the TVET trainers used language and terminology appropriate to the characteristics of the trainees being assessed and used questions that are not biased. The constructed test questions were always observed to provide appropriate spacing between questions and to use appropriate fonts to ensure easy reading. Similarly, the trainers always checked that questions did not run over to the next page.

The study found a significant difference between the trainers' and the trainees' perceptions of the process of developing test questions: the indicators, or tasks, in developing test questions were perceived differently by the trainer and trainee respondents.

Among the identified concerns and challenges faced by the TVET trainers in developing test questions are factors related to trainees' characteristics, learning resources, and the trainers' competence in developing test questions. The factors identified by the trainers pertain to the heterogeneous group of trainees enrolled in TVET, the trainees' weak reading comprehension and poor communication skills, and the trainees' retention of lessons. Moreover, the study found that the lack of learning resources such as books and manuals made constructing test questions for the assessments more time consuming. Another finding revealed that the trainers have had no specific training in test construction or development.

Considering the identified need of the trainers to develop skills in preparing a table of specifications and constructing quality summative tests, it is recommended that a training program be introduced and a seminar-workshop be conducted to give the trainers dedicated time to construct test items along with the table of test specifications.

The following Training Design is suggested for use in TESDA Technology Institutions (TTIs). It is hoped to make a difference in the way summative tests are used in competency-based assessment. Monitoring of the trainers' performance in test construction and utilization is expected.

Seminar-Workshop on Developing Valid and Reliable Tests for Competency-based Assessments in TESDA Technology Institutions (TTI)

Training Design

Rationale:

The adoption of competency-based training in Philippine Technical Vocational Education and Training has long been emphasized in the country's technical vocational training. The training delivery is designed to produce competent individuals who can fill the manpower demand of industries for blue-collar jobs.

The purpose of conducting competency assessment in all training programs offered by TESDA Technology Institutions (TTIs) is to ensure that trainees master and acquire the required competencies of a particular training qualification. Therefore, careful attention should be given to developing competency assessments, particularly to constructing the test questions incorporated into and used in the institutional assessment, and to making certain that the knowledge required in the workplace is acquired during school-based training.

However, present observations in TTIs on the conduct of competency assessment indicate that the teacher-made test questions used in competency-based assessment have not been carefully examined to ascertain their quality. Similarly, research findings revealed the need for trainers to develop skills in test construction. Hence, this four-day seminar-workshop is conceptualized around the following topics:

  1. Purpose of conducting institutional assessment in Technical Vocational Education and Training (TVET)
  2. Overview and principles of test construction
  3. Importance of Table of Specifications (TOS)
  4. Item Analysis, Validity and Reliability of Test Questions
  5. Utilization of Summative Assessment Results

Objectives:

This seminar-workshop is designed to capacitate TVET trainers in developing quality assessment tools. At the end of the activity, the trainers are expected to:

  1. Articulate the purpose of conducting institutional assessment.
  2. Explain the context and principles of test construction.
  3. Develop a Table of Specifications (TOS) per specialization.
  4. Construct test questions based on the TOS.
  5. Conduct test item analysis and validity and reliability testing.

Expected Participants:

 The seminar-workshop is purposely designed for the eleven (11) Technical Vocational Education and Training (TVET) Trainers from the following areas:

  1. Automotive Servicing NC II,
  2. Bartending NC II,
  3. Food and Beverage Services NC II,
  4. Housekeeping NC II

Methodology:

The seminar-workshop will employ the following activities:

  1. Registration, opening program
  2. Lecturette: Rationale, context, and principles of Summative Assessment/Tests
  3. Film Viewing: Assessing technical-vocational skills learned by trainees
  4. Group Discussion/sharing: making the TOS
  5. Workshop: Developing test items from the TOS, Tests, Re-test
  6. Integration: administration of test questions to students, processing of data, and finalization

Venue: 

COBSAT Centex Building Audio-Visual Room

Estimated Budgetary Requirement:

Breakdown                                                      Amount (Php)

A. Training kit for 11 TVET trainers                               1,000.00
   (pencil, pen, CD, bond paper, brown envelope,
   long folder, name tag)

B. Meals (4 lunches) and snacks (AM/PM)
   (11 trainers, 1 expert, 1 VIS, 2 secretariat)
   Lunch @ Php 150 x 4 x 15 pax                                    9,000.00
   Snacks @ Php 50 x 8 x 15 pax                                    6,000.00

C. Honorarium @ Php 2,000/day x 4 days                             8,000.00

D. Contingency                                                     2,500.00

                                                       Total:    26,500.00

Expected Output:

At the end of the seminar-workshop, the participants are expected to produce the following outputs:

  1. Table of Specifications (TOS)
  2. Sample of Test Construction per training qualification

Training Matrix

Day 1

Time | Sub-Topics | Focal Persons/Facilitator | Resources
8:00-8:30 | Registration of participants (signing of attendance sheet and preparation prior to the seminar-workshop proper) | |
8:31-9:00 | Presenting of participants' expectations | HR representative |
Topic 1: Purpose of Conducting Institutional Assessment in Technical Vocational Education and Training
9:01-11:00 | Overview of Training Regulations, Competency-Based Training, and Competency-Based Curriculum | Vocational Instruction Supervisor (VIS) | Laptop, overhead projector, handouts
11:01-12:00 | Conducting institutional assessment in TVET | |
12:01-1:15 | Noon break | |
Topic 2: Overview and Principles of Test Construction
1:16-4:30 | Overview of test construction (planning for the test; preparing the test; analyzing and revising the test); basic principles of test construction | Expert in Test Construction | Laptop, overhead projector, handouts
4:31-5:00 | Open forum | HR representative and Expert in Test Construction |

Day 2

Time | Sub-Topics | Focal Persons/Facilitator | Resources
8:00-8:30 | Preparation | |
Topic 3: Importance of Table of Specifications (TOS)
8:31-9:00 | Additional inputs regarding Day 1 activities; film viewing: The Role of Assessment | HR representative | Laptop, overhead projector
9:01-10:30 | Importance of the Table of Specifications in test construction; steps in developing the Table of Specifications (TOS) | Expert in Test Construction | Laptop, overhead projector, training regulations and competency-based curriculum
10:31-11:45 | Workshop on developing the Table of Specifications | Expert in Test Construction and HR representative | Writing materials, laptop computer
11:46-1:00 | Noon break | |
1:01-2:00 | Presentation of outputs and critiquing | Expert in Test Construction and HR representative | Laptop computer, overhead projector
Topic 4: Item Analysis, Validity, and Reliability of Test Questions
2:01-3:30 | Developing multiple-choice questions (advantages, disadvantages, principles) | Expert in Test Construction | Laptop computer, overhead projector
3:31-4:45 | Workshop on developing a multiple-choice test | Expert in Test Construction |

Day 3

Time | Sub-Topics | Focal Persons/Facilitator | Resources
8:00-9:00 | Continuation of the workshop on developing a multiple-choice test | |
9:01-11:00 | Performing the item analysis technique (item difficulty, discrimination index, distractor/option analysis) | Expert in Test Construction | Laptop computer, overhead projector
11:01-12:00 | Workshop on item analysis | Expert in Test Construction | Laptop computer, overhead projector, test questions
12:01-1:00 | Noon break | |
1:01-4:00 | Continuation of the workshop on item analysis | Expert in Test Construction | Laptop computer, writing materials, test questions
4:01-5:00 | Presentation of outputs and critiquing | Expert in Test Construction and HR representative | Laptop, overhead projector

Day 4

Time | Sub-Topics | Focal Persons/Facilitator | Resources
8:00-8:30 | Preparation | |
8:31-9:30 | Performing validity checks on test questions | Expert in Test Construction | Laptop computer, overhead projector
9:31-11:15 | Measuring reliability of test questions | Expert in Test Construction | Laptop computer, overhead projector
11:16-12:00 | Workshop on test question validity and reliability | Expert in Test Construction | Laptop computer, overhead projector, test questions
12:01-1:00 | Noon break | |
1:01-2:30 | Continuation of the workshop on test question validity and reliability | Expert in Test Construction and HR representative | Laptop computer, writing materials, test questions
2:31-4:00 | Presentation of outputs and critiquing | Expert in Test Construction and HR representative | Laptop, overhead projector
4:01-5:00 | Closing of seminar; distribution of certificates of training | Vocational Instruction Supervisor, Vocational School Administrator, Expert in Test Construction, and HR representative |

CONCLUSION

The study concludes that in the sampled TESDA Technology Institution, COBSAT, where the tests developed by TVET trainers were used in competency-based assessment in their respective training qualifications, both the trainees and the trainers were mostly female. The trainers have been teaching as TVET trainers for around ten years. In Technical Vocational Education and Training, specifically in a TESDA Technology Institution (TTI), test construction was based on the competency standards from the promulgated training regulations of the Technical Education and Skills Development Authority (TESDA) and analyzed using the Table of Specifications. For TVET trainers handling National Certificate Level II, the constructed tests covered three levels of thinking, namely knowledge, comprehension, and application; however, the constructed test questions were mostly at the knowledge and comprehension levels. Although the reliability indexes of the tests in the sampled training qualifications mostly exceeded 0.70 (the reliability coefficient for a good classroom test), except in Bartending NC II, the item analysis performed on the constructed test questions points to the need to either revise or discard some of the items used in the assessment. Hence, some items used in competency-based assessments need to be improved. The indicators in the process of developing test questions were perceived differently on some observations; hence, a significant difference was established.

Furthermore, factors pertaining to trainees' characteristics, the availability of learning resources, and the trainers' competence in developing tests were among the concerns and challenges faced by trainers in developing test questions; these need to be addressed accordingly to improve test construction in technical vocational education and training for TESDA Technology Institutions.

In light of the conclusions mentioned, the following recommendations are offered:

TVET trainers

The lack of training related to test construction could affect Technical Vocational Education and Training (TVET) trainers' competence and hinder the development of appropriate tests; hence, it is recommended that TVET trainers raise the need for appropriate training and seminars with the institution's administration through the institution's Vocational Instruction Supervisor. Considering the findings regarding the need to revise some of the test items used in competency-based assessments, TVET trainers should initiate the revision of those items. It is also recommended that TVET trainers thoroughly examine the tests in their respective training qualifications as part of their administrative functions and duties; performing the item analysis technique ensures that good items are used in the assessment. TVET trainers should continue their graduate studies to earn their master's degrees in education, since the graduate curriculum offers competencies that will aid them in assessing the different needs for improving training and instruction in terms of educational assessment. Individuals enrolled in technical vocational education and training come from diverse backgrounds; hence, trainers should review and interpret the data on trainees' characteristics that are usually gathered prior to the conduct of training. Trainers should constantly review trainees' records and performance so that appropriate interventions can be initiated toward improving classroom instruction and training delivery. Lastly, trainers should suggest titles of learning references (books or manuals) for the institution's administration to approve for procurement.

TVET Trainees

The findings of the study revealed that most of the constructed test questions were below fair or were poor items. Hence, it is recommended that trainees enrolled in TVET programs continue to support and help TVET trainers enhance the tests used in competency-based assessment. Candid responses to the test questions administered to trainees during competency-based assessment could aid TVET trainers in enhancing training delivery or instruction and in revising the test questions used in the competency-based assessment.

Administration of Cagayan de Oro (Bugo) School of Arts and Trades, a TESDA Technology Institution (TTI)

The findings of the study revealed the lack of TVET trainers' training and seminars related to test construction; hence, it is recommended that the administration of COBSAT, a TESDA Technology Institution, review its policies on trainers' capability enhancement programs. One such measure is to facilitate training and seminars that would develop the trainers' competence in test construction. Likewise, seminars should be conducted to provide the trainers with sufficient information on the current training needs of trainees. Further, it is recommended to strengthen stakeholder and industry linkages to generate assistance for, or expedite the planning of, the procurement of up-to-date learning resources (books and manuals) to aid TVET trainers in constructing tests.

TESDA Misamis Oriental-Provincial Office

The primary function of the TESDA Misamis Oriental Provincial Office is to oversee the overall operations of TESDA Technology Institutions. Based on the findings from the TESDA Technology Institution sampled in this study, the TESDA Misamis Oriental Provincial Office should facilitate province-wide training on test construction for all its TESDA Technology Institutions (TTIs). Being the agency in the right position to do so, the office can initiate province-wide training that would develop the competence of its TVET trainers in test construction.

REFERENCES

Books

  1. Angub, R.C. (2012). Planning training sessions. Competency-based learning material. Technical Education and Skills Development Authority.
  2. Davis, S.L. & Morrow, A.K. (2013). Creating usable assessment tools: A step-by-step guide to instrument design.
  3. Department of Education and Training (2008). Designing assessment tools for quality outcomes in VET. pp. 31-32. 151 Royal Street, East Perth, WA 6004.
  4. DOLE (2011). The Philippine labor and employment plan 2011-2016: Inclusive growth through decent and productive work. Department of Labor and Employment. Intramuros, Manila.
  5. Dunn, L. (2011). Selecting methods of assessment. Oxford, UK: Oxford Brookes University.
  6. Francisco, A.P. (2012). Conduct competency assessment. Competency-based learning material. Technical Education and Skills Development Authority.
  7. Glaser, R. (1963). Instructional technology and the measurement of learning outcomes. American Psychologist, 18, 519-522.
  8. Neukrug, E.S. & Fawcett, R.C. (2010). Essentials of testing & assessment: A practical guide for counselors, social workers and psychologists (2nd ed.). Belmont, CA: Brooks/Cole, Cengage Learning.
  9. Norton, R.E. (1987). Competency-based education and training: A humanistic and realistic approach to technical and vocational instruction. Chiba City, Japan. ERIC ED 279910.
  10. Ross, K.N. (2015). Overview of test construction. Quantitative research methods in educational planning.
  11. Wells, C.S. & Wollack, J.A. (2013). An instructor's guide to understanding test reliability. University of Wisconsin. 1025 W. Johnson St. #373, Madison, WI 53706.

Master's Theses and Doctoral Dissertations

  1. Alonsabe, O.C. (2007). Towards standardization of the division achievement test in science for secondary school students and the utilization of the final form for establishing baseline of competencies learned. (Dissertation, Capitol University, Cagayan de Oro City, 2007)
  2. Asa, C.T. (2008). Constructing parallel tests for assessing higher-order thinking skills in science for Grade IV and Grade V. (Dissertation, Capitol University, Cagayan de Oro City, 2008)
  3. Cosiñero, B.O. (2010). Test construction in Science II for secondary school students and the utilization of the final form for establishing level of mastery of skills learned. (Dissertation, Capitol University, Cagayan de Oro City, 2010)
  4. Dimen, J.B. (2015). Validity of the training assessment tools for Housekeeping NC II as perceived by the trainers of technical training schools in Cagayan de Oro City: Basis for an enhancement program. (Dissertation, University of Southern Philippines Foundation, Cebu City, 2015)
  5. Namoc, L.L. (2008). Test construction for assessing higher-order thinking skills in Math 1: A catalyzer for the test item bank development. (Dissertation, Capitol University, Cagayan de Oro City, 2008)
  6. Rañoa, C.A. (2014). Constructing tests for board examination courses in establishing level of mastered skills towards the licensure examination. (Dissertation, Capitol University, Cagayan de Oro City, 2014)
  7. Rivera, J.E. (2007). Test item construction and validation: Developing a statewide assessment for agricultural science education. (Dissertation, Cornell University, 2007). Retrieved January 16, 2016, from https://ecommons.cornell.edu/bitstream/handle/1813/3496/9-10-06.pdf?sequence=1
  8. Savariz, C.M. (2012). Instructional and administrative practices and organizational performance of Cagayan de Oro (Bugo) School of Arts and Trades. (Thesis, Capitol University, Cagayan de Oro City, 2012)

Journal

  1. Fan, X. (1998). Item response theory and classical test theory: an empirical comparison of their item/person statistics. Educational and Psychological Measurement. Gale Group Information Integrity. June 1998 V58 n3 p357 (25). Retrieved January 28, 2016, from http://www2.hawaii.edu/~daniel/irtctt.pdf
  2. Hamafyelto, R.S. et al. (2015). Assessing teacher competence in test construction and content validity of teacher made examination questions in commerce in Borno State, Nigeria. Scientific & Academic Publishing. Vol. 5, No. 5, 2015, pp. 123-128. Retrieved February 16, 2016, from http://article.sapub.org/10.5923.j.edu.20150505.01.html
  3. Osadebe, P.U. (2014). Construction of economics achievement test for assessment of students. World Journal of Education. Vol 4, No.2;2014. Retrieved February 15, 2016, from http://www.sciedu.ca/journal/index.php/wje/article/download/4534/2616
  4. Salaria, N. (2012). Meaning of the term descriptive survey research method. International Journal of Transformation in Business Management. Vol. 1, Issue No. 6, Apr-Jun. Retrieved November 4, 2015, from http://www.ijtbm.com/images/short_pdf/Apr_2012_NEERU%20SALARIA%202.pdf

Web Articles

  1. Cherry, K. (2016). What is validity? Retrieved January 19, 2016, from http://psychology.about.com/od/researchmethods/f/validity.htm
  2. Korb, K.A. (2014). Calculating reliability of quantitative measures. Retrieved January 11, 2016, from http://www.scribd.com/doc/211111125/Calculating-Reliability#scribd
  3. OEA (2005). Understanding item analysis reports. University of Washington. Retrieved January 5, 2016, from https://www.washington.edu/oea/services/scanning_scoring/scoring/item_analysis.html
  4. Renee, M. (2016). Oral questioning as an evaluation strategy. Retrieved October 15, 2015, from http://smallbusiness.chron.com/oral-questioning-evaluation-strategy-18035.html
  5. Sheil, G., Kellaghan, T., & Moran, G. (2010). Standardized testing in lower secondary education. National Council for Curriculum and Assessment. Retrieved February 10, 2016, from http://www.ncca.ie/en/Publications/Reports/Standardised_Testing_In_LowerSecondary_Education.pdf
  6. Swift, C. (2008). How to write better tests: A handbook for improving test construction skills. Retrieved November 4, 2015, from http://www.indiana.edu/~best/pdf_docs/better_tests.pdf
  7. Tan, M.L. et al. (2013). Item analysis and validation. Retrieved January 10, 2016, from http://www.slideshare.net/kEnkEnkEntan/item-analysis-and-validation
  8. TESDA (2013). Training regulation in automotive servicing nc II. Retrieved January 9, 2016, from http://tesda.gov.ph/Download/Training_Regulations?Searchcat=Training%20Regulations
  9. TESDA (2013). Training regulation in bartending nc II. Retrieved January 9, 2016, from http://tesda.gov.ph/Download/Training_Regulations?Searchcat=Training%20Regulations
  10. TESDA (2013). Training regulation in food and beverage services nc II. Retrieved January 9, 2016, from http://tesda.gov.ph/Download/Training_Regulations?Searchcat=Training%20Regulations
  11. TESDA (2013). Training regulation in housekeeping nc II. Retrieved January 9, 2016, from http://tesda.gov.ph/Download/Training_Regulations?Searchcat=Training%20Regulations
  12. Trigwell, K. (1992). Information for UTS staff on assessment. Sydney: UTS working party on assessment. Retrieved October 18, 2015, from http://www.iml.uts.edu.au/assessment/types/mcq/index.html
  13. UNICEF (2011). Adolescence an age of opportunity. Retrieved January 5, 2016, from http://www.unicef.org/adolescence/files/SOWC_2011_Main_Report_EN_02092011.pdf
  14. University of Wisconsin (2016). Reliability and validity. Retrieved January 12, 2016, from http://www.uwosh.edu/testing/faculty-information/test-scoring/score-report-interpretation/item-analysis-1/reliability-validity
  15. World Bank (2010). Philippine skills report. Skills for the labor market in the Philippines. Retrieved November 2, 2015, from http://siteresources.worldbank.org/EASTASIAPACIFICEXT/Resources/226300-1279680449418/HigherEd_Philippines Skills Report.pdf
  16. Zimmaro, D.M. (2014). Writing good multiple-choice exams. Retrieved January 16, 2016, from http://www6.cityu.edu.hk/edge/workshop/seminarseries/2010-11/Seminar03-WritingGoodMultipleChoiceExams.pdf
