www.rsisinternational.org
Page 9491
INTERNATIONAL JOURNAL OF RESEARCH AND INNOVATION IN SOCIAL SCIENCE (IJRISS)
ISSN No. 2454-6186 | DOI: 10.47772/IJRISS | Volume IX Issue X October 2025
Content Validity of an Instrument to Assess Innovation
Competencies in Engineering Education
Nor Aisyah Che Derasid¹, Aede Hatib Musta'amal @ Jamal², Mohd Salehudin Marji³, Norzanah Rosmin⁴

¹,²,³Department of Advanced Technical and Vocational Education and Training, Faculty of Educational Sciences and Technology, Universiti Teknologi Malaysia, Skudai, Malaysia

⁴Department of Electrical Power Engineering, Faculty of Electrical Engineering, Universiti Teknologi Malaysia, Skudai, Malaysia
DOI: https://dx.doi.org/10.47772/IJRISS.2025.910000777
Received: 07 November 2025; Accepted: 14 November 2025; Published: 24 November 2025
ABSTRACT
Innovation competency has become an essential requirement for engineering graduates as they navigate
increasingly complex and technologically advanced environments. This study aims to develop and validate an
instrument to assess innovation competencies among engineering students in Malaysian technical and
vocational institutions. The construction of the instrument was guided by grounded theory insights obtained
from structured interviews with engineering educators, together with a synthesis of current literature and
ABET-aligned competency frameworks. This process established three major dimensions (knowledge, skills, and personality), which formed the basis for an initial pool of 53 items. The content validity of the instrument
was evaluated by a panel of five experts using the Content Validity Index (CVI) and the modified Kappa
statistic. Experts rated each item for relevance using a four-point scale. The results demonstrated strong overall
agreement, with most items achieving I-CVI values between 0.80 and 1.00 and Kappa coefficients indicating
good to excellent concordance. Items falling below acceptable thresholds were removed or refined based on
expert judgment. As a result, 42 items were retained, representing essential components of innovation
competency across the three dimensions. The findings confirm that the instrument possesses strong content
validity and is suitable for further psychometric evaluation. This validated item set offers educators,
curriculum designers, and policymakers a structured and evidence-based tool for assessing innovation
readiness among engineering students. The instrument has potential applications in evaluating program
effectiveness, identifying student competency gaps, and supporting targeted instructional improvements.
Future research should incorporate pilot testing, reliability analysis, exploratory and confirmatory factor
analysis, and multi-institutional validation to strengthen the instrument’s robustness, generalizability, and
practical utility.
Keywords: innovation competency, content validity index, Kappa statistic, engineering education, instrument
development
INTRODUCTION
Innovation plays an essential role in technical and engineering fields, enabling students to develop the critical
thinking, creativity, and problem-solving abilities required to become future leaders and innovators (Suyuti,
2024). The Future of Jobs Report 2023 highlights the importance of knowledge, skills, and personality traits in
driving innovation, emphasizing competencies such as critical thinking, creativity, collaboration, adaptability,
resilience, curiosity, and lifelong learning. In an era of rapid technological advancement, engineering graduates
must possess updated and specialized knowledge to remain relevant.
However, the rapid pace of technological change presents significant challenges for educational institutions in
ensuring their curricula remain current (Roy & Roy, 2021). Traditional instructional systems often struggle to
adapt quickly, creating a gap between academic programs and industry expectations. Modern engineering
problems increasingly require interdisciplinary understanding, yet many programs remain compartmentalized,
limiting students’ exposure to integrated problem-solving and teamwork (Reich et al., 2020). Prior studies
(Akdur, 2021; de Campos et al., 2020; Rovida & Zafferri, 2022) also emphasize that while technical skills are
essential, soft skills such as communication, teamwork, adaptability, and resilience are equally critical but
often underrepresented in engineering curricula. Encouraging an innovation mindset characterised by
creativity, curiosity, and a willingness to take risks is therefore essential (Gorlewicz & Jayaram, 2020).
Given these evolving expectations, researchers have called for a deeper re-examination of fundamental
competencies that support innovation in technical and engineering education (Hirudayaraj et al., 2021). This
reinforces the need for robust, empirically validated assessment tools capable of measuring innovation
competencies accurately and comprehensively. Content validity plays a crucial role in this process, as it
ensures that an instrument adequately represents the construct under study while avoiding irrelevant or
overlapping content (Chong et al., 2021; Shrotryia & Dhanda, 2019). It is widely recognized as a critical first
step in instrument development (Kipli & Khairani, 2020; Spoto et al., 2023), forming the foundation for
subsequent reliability and construct validation.
To address this need, the present study develops and validates an instrument to assess innovation competencies
among engineering students. The dimensions and items were conceptually grounded in established literature,
Accreditation Board for Engineering and Technology (ABET) expectations, and insights generated through
grounded theory procedures. These inputs supported the identification of three major dimensions (knowledge, skills, and personality) with corresponding sub-dimensions. A total of 53 items were generated as the initial
pool for validation. As the detailed item-generation process is part of a separate manuscript, this article focuses
specifically on the content validity evaluation of the 53 items.
RESEARCH METHOD
This study aimed to develop an assessment instrument for innovation competency and evaluate its content
validity using quantitative procedures. The research process included two main phases: (i) establishing the
instrument dimensions and initial item pool, and (ii) evaluating content validity through the Content Validity
Index (CVI) and the modified Kappa statistic.
Grounded theory methodology was employed to help conceptualize the initial dimensions of innovation
competency. Structured interviews with three engineering and technical educators from Malaysian higher
technical education institutions were conducted to capture expert perspectives on essential components of
innovation. Each interview lasted 20-30 minutes, was transcribed verbatim, and reviewed for accuracy.
The interview data were coded using thematic analysis to identify recurring patterns and construct categories.
These qualitative insights were combined with a comprehensive review of recent literature and ABET
expectations to construct a preliminary framework. This process resulted in the formation of three core dimensions (knowledge, skills, and personality), each with multiple sub-dimensions representing critical
components of innovation competency.
The combination of literature synthesis and expert insight was used to generate an initial pool of 53 items. As
the detailed coding and item-development procedures form part of another dedicated study, they are not
elaborated in this article. In the present paper, the 53 items are presented to support the validation process,
serving as the content to be evaluated for relevance and clarity.
Although grounded theory studies in engineering education often involve larger samples, the use of three
expert educators is methodologically appropriate for exploratory construct identification. Grounded theory
prioritizes conceptual richness over sample size, and the homogeneity of the participants, each with direct experience evaluating student innovation, enabled focused and contextually meaningful insights. As noted by
Vasileiou et al. (2018), saturation is influenced by participant similarity and the specificity of the research aim.
Tutar et al. (2024) also highlight that smaller samples may be justified when the goal is exploratory construct
development rather than full theory generation. Importantly, the qualitative findings in this study served as a
foundation for item generation, while the content validation was subsequently carried out by a separate, larger
panel of five experts, ensuring independence between item creation and evaluation.
A panel of five experts, including engineering specialists, a psychometrician, and a curriculum design expert,
evaluated the 53 items for relevance using a four-point scale. The Item-Level Content Validity Index (I-CVI)
and modified Kappa statistic were calculated to determine agreement beyond chance. Items meeting
established thresholds were retained, forming the basis of the validated instrument.
Instrument
The instrument was constructed to evaluate the self-perceived innovation competency of technical and
engineering students. It was developed in alignment with ABET standards and comprises 53 items categorized
into three main dimensions and ten sub-dimensions, as presented in Table 1. These dimensions include
Knowledge, Skills, and Personality, reflecting broad and multifaceted aspects of innovation competency.
Table 1. Item distribution list

Dimension: Knowledge

Sub-dimension: Critical Thinking
1. Ability to identify the core problem in a technical situation.
2. Tendency to evaluate information from multiple sources before making decisions.
3. Skill in analysing situations to determine root causes.
4. Capability to assess strengths and weaknesses of different solution options.
5. Habit of questioning assumptions to clarify problem definitions.
6. Use of evidence-based reasoning when approaching engineering tasks.

Sub-dimension: Creative Thinking
7. Capacity to generate multiple ideas for solving a problem.
8. Willingness to propose unconventional or novel solutions.
9. Ability to combine different concepts to create new ideas.
10. Comfort in experimenting with new or unfamiliar approaches.
11. Skill in modifying existing ideas to improve effectiveness.
12. Application of imagination and creativity in technical tasks.

Dimension: Skills

Sub-dimension: Problem Solving
13. Development of step-by-step plans to address engineering challenges.
14. Ability to troubleshoot when a solution does not work as expected.
15. Consideration of potential risks before deciding on a solution method.
16. Use of systematic thinking to resolve complex problems.
17. Flexibility in adapting problem-solving strategies when needed.
18. Identification of constraints that may influence possible solutions.
19. Verification of a solution’s effectiveness after implementation.

Sub-dimension: Communication
20. Clear expression of ideas to others.
21. Ability to explain technical concepts in understandable ways.
22. Active listening during discussions or teamwork.
23. Provision of constructive feedback to peers.
24. Confidence when presenting ideas to a group.
25. Ability to document work clearly and professionally.

Sub-dimension: Collaboration
26. Active contribution during team-based tasks.
27. Respectful consideration of different viewpoints in group discussions.
28. Support provided to teammates when assistance is needed.
29. Shared responsibility in working toward team goals.
30. Effective collaboration with peers from diverse backgrounds.
31. Constructive handling of conflicts within group work.

Dimension: Personality

Sub-dimension: Openness
32. Interest in exploring new ideas or concepts.
33. Willingness to try unfamiliar methods or tools.
34. Curiosity and eagerness to learn new things.
35. Adaptability to changes in project requirements.
36. Openness to revising ideas based on new insights.

Sub-dimension: Conscientiousness
37. Completion of tasks with timeliness and quality.
38. Organized approach to planning technical work.
39. Attention to detail when performing engineering-related tasks.
40. Responsibility shown in achieving high standards of work.

Sub-dimension: Extraversion
41. Comfort engaging in discussions with peers.
42. Ease in expressing ideas within group settings.
43. Participation in team-based or group activities.
44. Confidence interacting with new people.

Sub-dimension: Agreeableness
45. Considerate behaviour when interacting with others.
46. Sensitivity toward others’ feelings and perspectives.
47. Cooperative attitude during teamwork.
48. Appreciation expressed for others’ contributions.
49. Maintenance of positive working relationships with peers.

Sub-dimension: Emotional Stability
50. Ability to remain calm when working under pressure.
51. Emotional control during stressful situations.
52. Resilience when facing setbacks or failures.
53. Focus maintained when encountering unexpected challenges.
Identification of Validation Panels
A purposive sampling technique was used to select five experts for the content validation process, consistent
with recommendations by Shrotryia and Dhanda (2019), who emphasize the importance of expertise, training,
and professional experience in selecting content validation panels. The chosen experts included three
specialists in mechanical, electrical, and civil engineering from technical and vocational education institutions,
one psychometric expert, and one curriculum design expert.
Appointment letters and validation forms were distributed via email, and due to geographical constraints,
experts completed the assessments online. Each expert was given 14 days to evaluate the 53 items, using a 4-
point relevance scale:
1 = not relevant,
2 = somewhat relevant,
3 = quite relevant,
4 = highly relevant.
The absence of a neutral option was intended to encourage decisive judgments.
Quantification of Content Validity
Content Validity Index (CVI)
The Item-Level Content Validity Index (I-CVI) was calculated by dividing the number of experts rating an
item as 3 or 4 by the total number of experts. This approach is widely used in validating assessment
instruments (Almanasreh et al., 2022; Indarta et al., 2023; Jamaludin et al., 2021; Kipli & Khairani, 2020;
Madadizadeh & Bahariniya, 2023). However, CVI alone may inflate agreement levels due to chance.
Therefore, in line with recommendations by Walter et al. (2019) and Zamanzadeh et al. (2014), the Kappa
statistic was also applied.
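As a minimal sketch of the I-CVI calculation described above (the ratings shown are hypothetical, not the panel's actual data), each item's five 4-point ratings are dichotomized into relevant (3 or 4) versus not relevant (1 or 2):

```python
def i_cvi(ratings):
    """Item-Level CVI: share of experts rating the item 3
    ('quite relevant') or 4 ('highly relevant')."""
    agree = sum(1 for r in ratings if r >= 3)
    return agree / len(ratings)

# One expert rates the item 2 ('somewhat relevant'); the other
# four rate it 3 or 4, so I-CVI = 4/5.
print(i_cvi([4, 2, 3, 4, 3]))  # 0.8
```

The dichotomization is what makes the index comparable across panels of different sizes.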
Kappa Statistic coefficient
To account for random agreement, the probability of chance agreement (Pc) was first computed using the formula:

Pc = [N! / (A! (N − A)!)] × 0.5^N

where N is the number of experts and A is the number of experts agreeing on an item.
The modified Kappa coefficient was then calculated using:

K = (I-CVI − Pc) / (1 − Pc)
Table 2 below summarizes the evaluation criteria for the Kappa statistic coefficient.

Table 2. Kappa Statistic Coefficient Interpretation

| Kappa Values | Evaluation |
| 0.74 and above | Excellent |
| 0.60 to 0.74 | Good |
| 0.40 to 0.59 | Fair |

(Polit & Beck, 2006; Zamanzadeh et al., 2014)
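The two formulas and the interpretation bands can be sketched together as follows (function names are illustrative; the numbers reproduce the 0.80 / 0.16 / 0.76 pattern that recurs in Tables 3 to 5 whenever four of five experts agree):

```python
from math import comb

def chance_agreement(n_agree, n_experts):
    """Pc = [N! / (A!(N - A)!)] * 0.5**N: probability that A of N
    experts agree on relevance purely by chance."""
    return comb(n_experts, n_agree) * 0.5 ** n_experts

def modified_kappa(n_agree, n_experts):
    """K = (I-CVI - Pc) / (1 - Pc): agreement adjusted for chance."""
    i_cvi = n_agree / n_experts
    pc = chance_agreement(n_agree, n_experts)
    return (i_cvi - pc) / (1 - pc)

def interpret(k):
    """Map a Kappa value to the Table 2 bands."""
    if k >= 0.74:
        return "Excellent"
    if k >= 0.60:
        return "Good"
    if k >= 0.40:
        return "Fair"
    return "Poor"

# Four of five experts agree: I-CVI = 0.80, Pc = 0.16, K = 0.76.
k = modified_kappa(4, 5)
print(round(chance_agreement(4, 5), 2), round(k, 2), interpret(k))
# 0.16 0.76 Excellent
```

Note that with five experts and unanimous agreement, Pc = 0.5^5 × 1 ≈ 0.03 and K = 1.00, matching the perfect-agreement rows in the tables.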
RESULTS AND DISCUSSION
This section elaborates on the findings of the I-CVI and Kappa Statistics for all three dimensions: Knowledge,
Skills, and Personality of Engineering and Technical Students' Innovation, based on Tables 3, 4, and 5,
respectively.
The Knowledge dimension in Table 3 comprises 12 items, six measuring critical thinking and six measuring creative thinking, rated by five experts. All 12 items demonstrated strong content validity, with I-CVI
values ranging from 0.80 to 1.00 and Kappa values indicating good to excellent agreement. Items aligned with
critical thinking and creative thinking were consistently rated as relevant by experts.
Table 3. Ratings on “Knowledge” Dimension of Engineering and Technical Students' Innovation
Dimension: Knowledge

| Item | E1 | E2 | E3 | E4 | E5 | Experts | Agree | I-CVI | Pc | K |
| 1 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 2 | - | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 3 | - | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 4 | X | - | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 5 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 6 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 7 | X | - | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 8 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 9 | X | X | - | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 10 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 11 | - | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 12 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |

Note. I-CVI = item content validity index; Pc = probability of chance agreement; K = Kappa statistic; X = items rated 3 or 4.
The high level of agreement suggests that knowledge-related competencies such as analysing problems,
synthesizing information, and generating ideas are well-established constructs within engineering education.
This finding aligns with existing studies (e.g., Qadir et al., 2020; Thornhill-Miller et al., 2023) which
emphasize that foundational cognitive skills form the basis of innovation. Because these constructs are well-
defined and widely recognized in ABET-aligned outcomes, experts likely found it easier to evaluate their
relevance, resulting in stable and consistent ratings.
The Skills dimension displayed a combination of strong and weak content validity. While 11 items achieved
perfect I-CVI and Kappa scores, several items (specifically 18, 19, 22, 27, and 31) showed low I-CVI values
(≤0.60) and fair-to-poor Kappa coefficients.
Table 4: Ratings on "Skill" Dimension of Engineering and Technical Students' Innovation
Dimension: Skills

| Item | E1 | E2 | E3 | E4 | E5 | Experts | Agree | I-CVI | Pc | K |
| 13 | - | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 14 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 15 | X | - | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 16 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 17 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 18 | - | - | X | X | X | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 19 | X | X | - | X | - | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 20 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 21 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 22 | - | X | - | - | X | 5 | 2 | 0.40 | 0.31 | 0.13 |
| 23 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 24 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 25 | X | - | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 26 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 27 | X | - | - | - | X | 5 | 2 | 0.40 | 0.31 | 0.13 |
| 28 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 29 | X | X | - | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 30 | - | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 31 | X | - | - | X | X | 5 | 3 | 0.60 | 0.31 | 0.42 |

Note. I-CVI = item content validity index; Pc = probability of chance agreement; K = Kappa statistic; X = items rated 3 or 4.
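The low values flagged above follow mechanically from the agreement counts and the formulas given in the Research Method section, as this self-contained check illustrates (the helper name is illustrative):

```python
from math import comb

def item_stats(n_agree, n_experts=5):
    """Return (I-CVI, Pc, modified Kappa), rounded to two
    decimals, for an item on which n_agree of n_experts
    rated it 3 or 4."""
    i_cvi = n_agree / n_experts
    pc = comb(n_experts, n_agree) * 0.5 ** n_experts
    k = (i_cvi - pc) / (1 - pc)
    return round(i_cvi, 2), round(pc, 2), round(k, 2)

# Three experts agreeing (items 18, 19, 31) and two agreeing
# (items 22, 27) reproduce the low rows reported in Table 4.
print(item_stats(3))  # (0.6, 0.31, 0.42)
print(item_stats(2))  # (0.4, 0.31, 0.13)
```

With only five experts, a single dissent already drops K to 0.76, and two dissents push items below every acceptability band, which is why the flagged items cluster at agreement counts of 2 and 3.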
Some items appeared too similar to others within the communication or collaboration sub-dimensions, making
it difficult for experts to distinguish their unique contribution. Similar issues are noted by Kipli & Khairani
(2020), who argue that overlapping skill items frequently lead to inconsistent expert ratings.
Items with broader behavioural descriptors may not have clearly reflected innovation-related skills, causing
experts to perceive them as generic soft skills rather than competencies directly contributing to innovation.
Skills such as teamwork and problem-solving manifest differently across engineering programs, which may
explain varied expert judgments. This is consistent with de Campos et al. (2020), who note that
operationalizing engineering soft skills remains a challenge due to their context-dependent nature.
The mixed performance of skill items reflects the complexity of measuring interpersonal and problem-solving
skills in innovation contexts.
These skills are often multifaceted and require clear operational definitions to avoid ambiguity. Removing low-
performing items strengthens the conceptual clarity of the Skills dimension and aligns the instrument with best
practices in competency assessment.
The Personality dimension (Table 5) also showed variability. While 12 items received perfect I-CVI and
Kappa values, several items (particularly 35, 41, and others scoring ≤0.60) demonstrated weaker expert
agreement.
Table 5: Ratings on "Personality" Dimension of Engineering and Technical Students' Innovation
Dimension: Personality

| Item | E1 | E2 | E3 | E4 | E5 | Experts | Agree | I-CVI | Pc | K |
| 32 | X | - | X | - | X | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 33 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 34 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 35 | - | - | X | - | X | 5 | 2 | 0.40 | 0.31 | 0.13 |
| 36 | X | - | X | X | - | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 37 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 38 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 39 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 40 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 41 | X | - | - | - | X | 5 | 2 | 0.40 | 0.31 | 0.13 |
| 42 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 43 | X | X | - | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 44 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 45 | - | X | X | X | - | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 46 | X | X | - | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 47 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |
| 48 | X | - | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 49 | - | - | X | X | X | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 50 | X | X | - | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 51 | X | X | X | X | X | 5 | 4 | 0.80 | 0.16 | 0.76 |
| 52 | X | - | X | - | X | 5 | 3 | 0.60 | 0.31 | 0.42 |
| 53 | X | X | X | X | X | 5 | 5 | 1.00 | 0.03 | 1.00 |

Note. I-CVI = item content validity index; Pc = probability of chance agreement; K = Kappa statistic; X = items rated 3 or 4.
Personality traits such as openness, extraversion, and emotional stability may be interpreted differently
depending on cultural and institutional contexts, which can influence how experts evaluate their relevance to
innovation. Items describing emotional or behavioural tendencies may have been viewed as too broad or not
sufficiently linked to specific innovation outcomes, making it difficult for experts to identify a clear connection
between certain traits and innovation competency in engineering. Additionally, cultural norms within
Malaysian TVET settings may shape expectations about personality traits in ways that differ from Western-
derived personality models, contributing to variations in expert judgment. These patterns are consistent with
the observations of Rovida and Zafferri (2022), who emphasize that personality constructs often require
contextual adaptation to ensure clarity and relevance across different educational environments.
The variability in personality ratings suggests that traits must be operationalized carefully to ensure relevance
to innovation behaviour. Weak items were likely removed because they lacked specificity, overlapped with
other constructs, or did not explicitly support the behavioural aspects of innovation. The remaining items better
capture the personality dimensions that meaningfully contribute to innovative performance.
Across all 53 initial items, 42 items were retained based on established thresholds (I-CVI ≥ 0.78 and strong
Kappa values). The Knowledge dimension emerged as the most stable, while the Skills and Personality
dimensions required refinement due to conceptual ambiguity or inconsistent expert interpretation.
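The retention rule described above amounts to a simple threshold filter; the sketch below applies it to a few illustrative item scores drawn from Tables 3 to 5 (the dictionary is a small sample, not the full instrument):

```python
# Retention rule: keep items whose I-CVI meets the 0.78
# threshold (Polit & Beck, 2006). Sample of item -> I-CVI.
items = {1: 1.00, 2: 0.80, 18: 0.60, 22: 0.40, 35: 0.40, 53: 1.00}

retained = sorted(i for i, icvi in items.items() if icvi >= 0.78)
removed = sorted(i for i, icvi in items.items() if icvi < 0.78)

print(retained)  # [1, 2, 53]
print(removed)   # [18, 22, 35]
```

In the study itself, items falling below the threshold were not always simply deleted: some were refined based on expert comments before the final 42-item set was fixed.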
The use of both I-CVI and Kappa strengthens the validity of the retained items, addressing concerns raised in
the literature (Zamanzadeh et al., 2014; Polit & Beck, 2006) regarding reliance on CVI alone. This dual
approach allowed clearer differentiation between items with true agreement and those rated highly by chance.
These findings closely reflect patterns observed in previous research on innovation and competency
assessment. Prior studies have consistently shown that knowledge-based competencies tend to demonstrate
strong validity, as they are conceptually well-defined and closely aligned with established engineering
expectations (Munir, 2022; Qadir et al., 2020). In contrast, skill-related items often display greater variation,
largely because these competencies are situational and may manifest differently depending on instructional
practices, teamwork contexts, and problem-solving environments (Hirudayaraj et al., 2021). Similarly,
personality-related constructs frequently require cultural adaptation to ensure relevance and clarity, as
personality traits may be interpreted differently across educational and cultural settings (Rovida & Zafferri,
2022). Taken together, these similarities indicate that the patterns observed in this study are not isolated but
instead reflect broader challenges commonly reported in the development of comprehensive competency
instruments within engineering education.
The refined 42-item instrument offers a coherent framework that aligns with ABET standards and the
grounded theory insights generated in this study. It serves as a validated measure that engineering educators
can use to assess students’ innovation readiness and identify areas requiring further development. Additionally,
the instrument provides a strong foundation for subsequent psychometric evaluations, including exploratory
and confirmatory factor analysis as well as reliability testing. Overall, the findings underscore the importance of precise and contextually appropriate item wording, particularly for skills and personality constructs, to ensure consistent interpretation and strengthen the instrument’s validity.
Overall, the content validation process confirms that most items are relevant and conceptually aligned with
innovation competency constructs. The removal of low-performing items enhances the instrument’s clarity and
validity, enabling it to serve as a practical tool for evaluating innovation competencies among engineering
students.
CONCLUSION
This study highlights the significance of the developed instrument in accurately and effectively assessing
students’ innovation competencies within engineering education. The high CVI and Kappa values obtained for
most items demonstrate strong consensus among experts regarding the clarity and relevance of the
instrument’s constructs, thereby strengthening its reliability. At the same time, several limitations should be
acknowledged to contextualize the findings. First, the study relied solely on expert judgment during the content
validation phase, which, although valuable, involves an inherent degree of subjectivity. Second, the initial
qualitative phase included only three engineering educators, which may limit the breadth of perspectives
informing the item development process. Third, pilot testing was not conducted at this stage, meaning that item
performance with actual student respondents remains unexamined.
Despite these limitations, the study provides a solid foundation for the development of a valid and structured
assessment tool aligned with the needs of engineering and technical education. To strengthen the instrument
further, future research should involve a larger and more diverse panel of experts, as well as pilot testing with
students across multiple institutions to enhance generalizability. In addition, advanced psychometric analyses
such as exploratory and confirmatory factor analysis, reliability testing, and model fit examination are
recommended to verify the underlying factor structure and measurement properties. These steps will help
ensure that the instrument evolves into a robust and widely applicable tool for assessing innovation
competencies in varied educational settings.
RECOMMENDATIONS
The development of this instrument presents several valuable applications for educational practice, curriculum
enhancement, and policy planning. For engineering lecturers, the instrument offers a practical tool to assess
students’ innovation competencies and to identify learners who may require targeted instructional support. It
also enables educators to design or adjust teaching strategies, projects, and learning activities that better
cultivate innovation-related behaviours. For curriculum designers, the instrument provides evidence-based
insights that can inform curriculum review processes, ensuring that innovation competencies such as critical
thinking, creative problem-solving, collaboration, and adaptability are meaningfully embedded within course
and program structures. Policymakers and academic leaders may also utilize the instrument to evaluate
institutional performance related to innovation outcomes and to align academic programs with ABET
standards and industry expectations.
Beyond its immediate instructional use, the instrument can also support longitudinal monitoring of students’
development, enabling institutions to track innovation competency growth over time. This longitudinal
perspective can help institutions refine their innovation-focused initiatives and improve graduate readiness for
future engineering challenges. Moving forward, further validation and refinement of the instrument should
include multi-institutional testing, cross-cohort comparisons, and integration with student performance data.
These efforts will ensure that the instrument remains adaptable, reliable, and responsive to evolving
educational needs while contributing to a stronger culture of innovation in engineering and technical programs.
ACKNOWLEDGEMENT
The authors acknowledge the Ministry of Higher Education Malaysia, Fundamental Research Grant Scheme (FRGS Ref: FRGS/1/2023/SS107/UTM/02/14), with UTM Vot. No: 5F639.
REFERENCES
1. Akdur, D. (2021). Skills gaps in the industry: Opinions of embedded software practitioners. ACM
Transactions on Embedded Computing Systems (TECS), 20(5), 1-39.
2. Almanasreh, E., Moles, R. J., & Chen, T. F. (2022). A practical approach to the assessment and
quantification of content validity. In Contemporary research methods in pharmacy and health services
(pp. 583-599): Elsevier.
3. Chong, J., Mokshein, S. E., & Mustapha, R. (2021). A content validity study for vocational teachers’
assessment literacy instrument (VoTAL). International Journal of Academic Research in Business and
Social Sciences, 11(4), 868-883.
4. de Campos, D. B., de Resende, L. M. M., & Fagundes, A. B. (2020). The importance of soft skills for
the engineering. Creative Education, 11(8), 1504-1520.
5. Ealangov, S. (2023). Cabaran dan Strategi Menghadapi Perubahan Kurikulum dalam Kalangan
Pensyarah Bidang Agroteknologi Kolej Komuniti: Challenges and Strategies in Curriculum Change
among Agrotechnology Lecturers in Community College. Online Journal for TVET Practitioners, 8(2),
23-36.
6. Gorlewicz, J. L., & Jayaram, S. (2020). Instilling curiosity, connections, and creating value in
entrepreneurial minded engineering: Concepts for a course sequence in dynamics and controls.
Entrepreneurship Education and Pedagogy, 3(1), 60-85.
7. Hirudayaraj, M., Baker, R., Baker, F., & Eastman, M. (2021). Soft skills for entry-level engineers:
What employers want. Education Sciences, 11(10), 641.
8. Indarta, Y., Ranuharja, F., & Dewi, I. P. (2023). Measuring Validity of Interactive Presentation Media
Using Content Validity Index (CVI). Paper presented at the 9th International Conference on Technical
and Vocational Education and Training (ICTVET 2022).
9. Jamaludin, T. S. S., Nurumal, M. S., Ahmad, N., Muhammad, S. A. N., & Chan, C. M. (2021).
Development and evaluating content validity of clinical skill analysis index tools. Open Access
Macedonian Journal of Medical Sciences, 9(T5), 6-12.
10. Kipli, M., & Khairani, A. Z. (2020). Content Validity Index: An application of validating CIPP
instrument for programme evaluation. Int Multidiscip Res J, 2(4), 31-40.
11. Madadizadeh, F., & Bahariniya, S. (2023). Tutorial on how to calculating content validity of scales in
medical research. Perioperative Care and Operating Room Management, 31, 100315.
12. Munir, F. (2022). More than technical experts: Engineering professionals’ perspectives on the role of
soft skills in their practice. Industry and Higher Education, 36(3), 294-305.
13. Polit, D. F., & Beck, C. T. (2006). The content validity index: are you sure you know what's being
reported? Critique and recommendations. Research in nursing & health, 29(5), 489-497.
14. Qadir, J., Yau, K.-L. A., Imran, M. A., & Al-Fuqaha, A. (2020). Engineering education, moving into
2020s: Essential competencies for effective 21st century electrical & computer engineers. Paper
presented at the 2020 IEEE Frontiers in Education Conference (FIE).
15. Reich, P. D., Marchesi, P., Hamilton, P. L., Austin, A. W., Dehne, C., Munson, J., . . . Klimek, J. F.
(2020). The Synergistic Classroom: Interdisciplinary Teaching in the Small College Setting: Rutgers
University Press.
16. Rovida, E., & Zafferri, G. (2022). The importance of soft skills in engineering and engineering
education: Springer.
17. Roy, M., & Roy, A. (2021). The rise of interdisciplinarity in engineering education in the era of
industry 4.0: implications for management practice. IEEE Engineering Management Review, 49(3), 56-
70.
18. Shrotryia, V. K., & Dhanda, U. (2019). Content validity of assessment instrument for employee
engagement. SAGE Open, 9(1), 2158244018821751.
19. Spoto, A., Nucci, M., Prunetti, E., & Vicovaro, M. (2023). Improving content validity evaluation of
assessment instruments through formal content validity analysis. Psychological methods.
20. Suyuti, S. (2024). The Importance of Creativity and Innovation in Education: How to Prepare Students
for the 21st Century Workforce. Education Studies and Teaching Journal (EDUTECH), 1(1), 80-92.
21. Thornhill-Miller, B., Camarda, A., Mercier, M., Burkhardt, J.-M., Morisseau, T., Bourgeois-Bougrine,
S., . . . Mourey, F. (2023). Creativity, critical thinking, communication, and collaboration: assessment,
certification, and promotion of 21st century skills for the future of work and education. Journal of
Intelligence, 11(3), 54.
22. Tutar, H., Şahin, M., & Sarkhanov, T. (2024). Problem areas of determining the sample size in
qualitative research: a model proposal. Qualitative Research Journal, 24(3), 315-336.
23. Vasileiou, K., Barnett, J., Thorpe, S., & Young, T. (2018). Characterising and justifying sample size
sufficiency in interview-based studies: systematic analysis of qualitative health research over a 15-year
period. BMC medical research methodology, 18(1), 148.
24. Walter, M., & Mondal, P. (2019). A rapidly assessed wetland stress index (RAWSI) using Landsat 8
and Sentinel-1 radar data. Remote Sensing, 11(21), 2549.
25. Zamanzadeh, V., Rassouli, M., Abbaszadeh, A., Majd, H. A., Nikanfar, A., & Ghahramanian, A.
(2014). Details of content validity and objectifying it in instrument development. Nursing Practice
Today, 1(3), 163-171.