Enhancing Entrepreneurship Education In TVET: A Validated Evaluation Scale for Higher Vocational Institution
Pages 4352-4367
Wang Juan1, Marina Ibrahim Mukhtar2*, Ji Zhongli3
1,2Faculty of Technical and Vocational Education, Universiti Tun Hussein Onn Malaysia
3Faculty of Digital Commerce, Changzhou Vocational Institute of Industrial Technology
*Corresponding author
DOI: https://dx.doi.org/10.47772/IJRISS.2025.9020341
Received: 18 February 2025; Accepted: 22 February 2025; Published: 22 March 2025
ABSTRACT
Entrepreneurship education curricula are widely offered in Chinese higher vocational institutions, yet there is a notable shortage of dependable instruments for evaluating their quality. This study sought to develop and validate a comprehensive scale for assessing entrepreneurship education curricula in higher vocational institutions, based on the CIPP (Context, Input, Process, and Product) evaluation model. The research employed a multi-phase methodology encompassing scale development, expert evaluation (N = 14), pilot testing (N = 135), and a formal study (N = 750), with participants drawn from two representative Chinese higher vocational institutions. Statistical analyses, including item analysis, reliability analysis, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA), were conducted to establish the scale’s robustness. The final scale consists of four subscales: curriculum context (3 dimensions, 11 items), curriculum input (3 dimensions, 13 items), curriculum process (2 dimensions, 8 items), and curriculum product (2 dimensions, 9 items), totaling 41 items. The results indicate robust content validity and internal consistency, validating the scale as a reliable and effective tool for evaluating entrepreneurship education curricula. Designed for foundational entrepreneurship courses, the scale offers significant insights for curriculum enhancement and policy formulation, thereby advancing entrepreneurship education at higher vocational institutions.
INTRODUCTION
In today’s swiftly changing times, entrepreneurship education (EE) plays a crucial role in equipping younger generations with the competencies needed to navigate the increasing intricacies of the global economic landscape (Mahmudin, 2023). As higher education institutions’ understanding of EE deepens, EE is progressively becoming a component of mainstream education, with the main objective of cultivating entrepreneurial skills and mindsets in students. EE not only teaches “how to start a business” but also covers the basics of the entrepreneurial process, with the goal of fostering individual innovation, enhancing critical thinking, and developing skills in new business management (Bechard & Gregoire, 2005). EE can encourage students to innovate, develop skills for future jobs, and improve their employment prospects (Zhou, 2024). Over the past decade, higher education has experienced remarkable expansion; however, many graduates face challenges in securing quality employment, and employers report difficulties in finding individuals who possess the necessary skills (Lauder & Mayhew, 2020).
The emergence of this phenomenon calls for a rethink in education. The core purpose of 21st-century skills is to develop people who can “use” and “apply” knowledge. In the face of new changes brought about by ever-changing technology in the workplace, graduates from higher education institutions must be well-rounded. Graduates who excel in creativity and innovation, communication and teamwork, research proficiency, information literacy, critical thinking, and problem-solving are better positioned to thrive in the workplace (Ghafar, 2020). Morley and Jamil (2021) point out that, in the face of grim employment prospects for graduates, the traditional spirit and pedagogical models of higher education no longer match students’ needs and require deeper reforms to address student employment issues. On the one hand, university students need rigorous academic learning to expand and update their strategic competencies. On the other hand, they need to make real-life connections and develop competencies that can be used in future careers and jobs. The development of innovative thinking and comprehensive entrepreneurial competencies can greatly strengthen students’ competitiveness in employment, and higher education institutions should start with the overall curriculum and actively reform and adjust it accordingly.
Furthermore, Pompei and Selezneva (2021) also pointed out that education has always been a powerful tool for dealing with unemployment and employment instability. Higher education should pay more attention to employability and entrepreneurship to better accommodate the requirements of globalization and rapid change. A globalized world means that students should be more autonomous, adaptable to innovation, able to work in teams, take responsibility, maintain and update their skills, and be reflective (thinking about experiences and perspectives in order to better understand them and respond through learning and behavioural change). EE carries this mission of fostering innovation and entrepreneurship in students. In addition, Zaring et al.’s (2021) review of the EE literature found some common patterns: EE may be associated with economic growth, and while not all students will become entrepreneurs, the learning experience and personal development should improve their overall employability (Pardo-Garcia & Barac, 2020).
China is building an innovative country, which requires a large number of new-age entrepreneurs with active ideas, innovative thinking, and conscious actions (Li & Qin, 2022). The Chinese government therefore attaches great importance to EE in colleges and universities; in 2015, the General Office of the State Council of China issued the “Opinions on the Implementation of Deepening the Reform of EE in Colleges and Universities”, which elevated the development of EE to the strategic level, and now almost all colleges and universities offer entrepreneurship-related curricula (Fan et al., 2022). As an important part of Chinese higher education, the EE curriculum in higher vocational institutions is also developing rapidly. As EE expands on a large scale, however, its quality continues to face significant challenges, with numerous shortcomings evident in the curriculum (Hameed & Irfan, 2019). The EE curriculum is the core carrier of EE, but at present it is out of touch with society and needs timely adjustment.
The traditional didactic EE curriculum can no longer meet the individualized innovation and entrepreneurship needs of higher vocational students or the requirements of future career development (Zhao, 2020). At the same time, the assessment of the EE curriculum is influenced by a range of ambiguous and uncertain factors, which makes effective evaluation difficult (Zhao & Zheng, 2021). In view of the many shortcomings of the current EE curriculum, a systematic, scientific, and feasible evaluation system is needed to assess the EE curriculum in higher vocational colleges, so as to contribute to improving curriculum quality (Fan et al., 2022).
Study Background
Entrepreneurship education (EE) traces its origins to Harvard Business School, where Myles Mace offered the first EE course, “New Business Management”, in 1947. Subsequently, entrepreneurship-related curricula were introduced at various universities in the United States. In the last two decades, EE curricula have increased exponentially on all continents (Byun, 2018; Nieuwenhuizen, 2016). Raharjo et al. (2023) regard EE as an outstanding field for social progress and development, playing an integral role as an engine of economic growth, a link that enhances social cohesion, an enabler of organizational success, and a catalyst for individual achievement. Bae et al. (2014) offered a concise definition: “EE refers to education that develops entrepreneurial attitudes and skills”. EE is not limited to developing entrepreneurs; it also develops a valuable skill set, contributing not only to professional development but also to the personal development of citizens (Martínez et al., 2021).
The curriculum is characterized as the totality of experiences provided by educational institutions, covering all learning aspects that students encounter in an academic environment (Balved, 2010). Curriculum evaluation refers to a comprehensive activity carried out to collect detailed and comprehensive information about the performance of an entity, in order to determine its effectiveness, efficiency, and consistency with the established goals. On the other hand, curriculum evaluation functions as a foundational framework for guiding the content of teaching, teaching methods, and the ultimate goals of the educational process (Varujie, 2016). Curriculum evaluation entails a structured, thorough, and ongoing process of gathering, analysing, and interpreting information from various sources. These data are crucial for determining the effectiveness, suitability, and relevance of the curriculum in meeting the educational objectives established by the institution.
The evaluation process not only focuses on students’ learning outcomes but also considers the efficacy of instructional approaches, the relevance of curriculum content, and the overall consistency of the curriculum with the interests, needs, and expectations of students and the entire society (Ifarajimi, 2023). Through continuous evaluation of the curriculum design and incorporating feedback from stakeholders, educational institutions can optimize the curriculum to better achieve their intended goals and serve the relevant stakeholders. Evaluation is not a one-time activity but a key and indispensable component of the process of curriculum development, implementation, and improvement (Hale & Adhia, 2022).
Higher vocational education uniquely integrates the characteristics of both higher education and vocational training, thereby offering a comprehensive educational experience. Its fundamental objective is to cultivate highly skilled professionals who are adept at meeting the diverse demands of the job market (Wu et al., 2012). As a vital component of the broader higher education system, higher vocational institutions play an indispensable role in nurturing front-line technical talent across a wide array of industries, spanning production, management, service, and research and development. These institutions are specifically tasked with equipping students with a robust blend of theoretical understanding and practical competencies. This dual focus ensures that students graduate with a solid foundation in their chosen field and the hands-on experience necessary to excel in practice. Higher vocational institutions not only enhance the employability of their graduates but also foster the development of a skilled workforce that is crucial for driving economic growth and fostering innovation (Hu & Chen, 2021).
Within the framework of curriculum evaluation methodologies, various evaluation models exist, with Stufflebeam’s CIPP model being one of the most widely utilized. Developed by Daniel L. Stufflebeam in the early 1970s, the CIPP model posits that evaluation involves describing and assessing the strengths and value of objectives, designs, implementations, and outcomes to guide improvements (Stufflebeam, 2003). The CIPP model comprises four stages: context evaluation, input evaluation, process evaluation, and product evaluation. Context evaluation identifies the needs of the target population in terms of environmental factors to clarify program goals. Input evaluation determines the resources needed to effectively implement the curriculum. Process evaluation monitors program implementation to ensure that recommended techniques are followed and is designed to enhance the action plan. Product evaluation focuses on the effectiveness and applicability of educational outcomes and learning results (Akhtar, 2024). The CIPP model serves as a valuable and user-friendly tool that assists evaluators in formulating critical questions essential for the evaluation process (Hakan & Seval, 2011). One distinctive feature of the CIPP model is its adaptability, which allows users to apply individual components based on specific needs and stages of the evaluation; consequently, each component can operate independently (Singh, 2004). The CIPP model provides a detailed framework for evaluating curricula and is therefore a highly effective strategy for assessing entrepreneurship curricula.
METHODOLOGY
Research Design
This study employed a cross-sectional measurement design to evaluate the reliability and validity of the EE curriculum evaluation scale based on the CIPP model. The EE curriculum at the two higher vocational institutions under study was offered in the second semester of the first year; the sample therefore consisted of second- and third-year students at higher vocational institutions who had taken the EE curriculum, recruited through convenience sampling. SPSS 22.0 and Amos 21.0 were employed for data analysis. The research consisted of four steps: scale development, expert review, pilot testing, and a formal study.
Research Procedures
The four steps involved in the development and verification of the SEEEC, the EE curriculum evaluation scale developed in this study, are shown in Figure 1.
Figure 1: SEEEC development and validation process based on the 4-step process.
Scale Development
The research team comprised five members: one professor in the field of EE, two associate professors, and two teaching administrators. The associate professors were responsible for the literature review, data analysis, and selection of experts; the teaching administrators were responsible for contacting experts and conducting the pilot test; the professor was tasked with evaluating and finalizing the items and the scale structure.
Through a literature review, we obtained insights into the current landscape of curriculum assessment methods and tools. Using theoretical analysis, the research team meticulously dissected the components, features, attributes, and interrelations of the EE curriculum, and constructed a foundational framework through integrative analysis. Based on the CIPP model, we preliminarily developed the SEEEC, which encompasses 4 subscales and 56 items.
Expert Review
In this research, the Delphi expert consultation method was applied following the general principles of the Delphi technique. First, in selecting experts, we chose experts in the field of EE in China who possess over 12 years of experience in EE, have achieved notable research outcomes in the field, and expressed a willingness to take part in this research. Second, the operational process of the Delphi technique followed the sequence of developing a program, identifying panelists, measuring consensus, and reporting results (Olsen et al., 2021). The researchers designed a survey questionnaire and gathered input from expert group members via correspondence or meetings. The experts offered feedback using both quantitative and qualitative approaches, and the researchers adjusted the questionnaire based on this feedback. Through multiple rounds of consultation and revision, the views of the expert group members progressively converged, and the final version was established from the aggregated expert opinions (Hu and Wang, 2022; Zhao et al., 2024). In some cases, consensus may be achieved after the second round, eliminating the need for further rounds (Li and Liu, 2018; Zhao et al., 2024).
The survey questionnaire comprised an explanation of the study, demographic information about the experts, and a 5-point Likert scale to rate the alignment between each item and the research topic. The screening criteria for indicators integrated the average importance rating, the standard deviation, and the full-score ratio: an indicator was selected if its average importance rating exceeded 3.5, its coefficient of variation was less than 0.25, and its full-score ratio surpassed 0.2. The item-level content validity index (I-CVI) and scale-level CVI (S-CVI) were employed to assess content validity; I-CVI > 0.8 and S-CVI > 0.9 indicate strong content validity (Polit et al., 2007; Zhao et al., 2024).
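The three indicator-screening criteria are straightforward to compute from the expert ratings. The sketch below is a minimal Python illustration (the study used SPSS); the ratings matrix shown is hypothetical:

```python
import numpy as np

def screen_indicators(ratings, mean_cut=3.5, cv_cut=0.25, full_cut=0.2):
    """Apply the three screening criteria to an (n_experts, n_items)
    matrix of 1-5 importance ratings. An item is retained if its mean
    rating exceeds 3.5, its coefficient of variation is below 0.25,
    and the share of full (5-point) ratings exceeds 0.2."""
    ratings = np.asarray(ratings, dtype=float)
    mean = ratings.mean(axis=0)
    cv = ratings.std(axis=0, ddof=1) / mean       # coefficient of variation
    full_ratio = (ratings == 5).mean(axis=0)      # full-score ratio
    return (mean > mean_cut) & (cv < cv_cut) & (full_ratio > full_cut)

# Two hypothetical items rated by four experts: the first passes all
# three criteria; the second fails the mean-importance criterion.
keep = screen_indicators([[5, 1], [5, 5], [5, 1], [5, 5]])
```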
Pilot Test
The initial version of the SEEEC was finalized following revisions informed by the expert evaluation. To ensure that the pilot sample represented the formal study sample, students from two higher vocational institutions in Jiangsu Province, China, were recruited for both the pilot test and the formal study; Jiangsu, a leading province in education within China, is at the forefront of EE initiatives in the country. The pilot test enrolled 135 students from one of these institutions who shared the same characteristics as the formal study sample, and proceeded in three stages: item analysis, exploratory factor analysis (EFA), and reliability analysis.
In this process, 4 subscales and 44 items were retained based on the item analysis criteria. During the EFA, one further item was removed from the “Leadership and Management” dimension of the curriculum context subscale and two items from the “Curriculum Resources” dimension of the curriculum input subscale, based on the factor loadings. The reliability analysis showed Cronbach’s alpha values ranging from 0.877 to 0.930 for the subscales and 0.885 for the total scale, indicating high internal consistency across all components. The pilot test therefore reduced the SEEEC from 44 to 41 items to ensure reliability and validity.
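The reported Cronbach’s alpha values can be reproduced from raw item scores with the standard variance-based formula. A minimal sketch in Python (the study used SPSS):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```

Values in the 0.877–0.930 range reported for the subscales are conventionally read as high internal consistency.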
Formal Study
As described in the sample collection section, this research culminated in a large-scale formal study. By conducting confirmatory factor analysis (CFA) and reliability analysis, we ultimately validated the scale framework, which comprises the curriculum context subscale (3 dimensions with 11 items), curriculum input subscale (3 dimensions with 13 items), curriculum process subscale (2 dimensions with 8 items), and curriculum product subscale (2 dimensions with 9 items), totaling 41 items. This signifies the successful completion of the scale’s development and validation process.
RESULTS
Results of the Expert Consultation
The participants had an average age of 38.6 years and an average of 17 years of work experience, and came from Jiangsu, Shaanxi, and Fujian Provinces. Two rounds of expert consultation were conducted, with a total of 28 questionnaires distributed and a 100% response rate. The authority coefficient (Cr) of the expert survey was 0.88, the coefficients of variation (Cv) ranged from 0 to 0.22, and Kendall’s coefficient of concordance was 0.544; items with a coefficient of variation greater than 0.25 were to be deleted from the item analysis (Tan, 2023; Sun et al., 2024). Based on the experts’ modification suggestions, six imprecise items were revised as follows:
- “Clear objectives of the curriculum” was revised to “The curriculum objectives are target-oriented”.
- “The objectives of the curriculum are in line with the needs of society” was revised to “The curriculum objectives meet the needs of professional talent development”.
- “The curriculum is centered on meeting the development of students’ abilities” was revised to “The curriculum is centered on meeting the development of students’ creative and entrepreneurial skills”.
- “Teachers have specialized knowledge” was revised to “Teachers have extensive expertise in entrepreneurship education”.
- “Motivated and dedicated” was revised to “Teachers are able to fulfill their teaching tasks seriously”.
- “Entrepreneurial experience” was revised to “Teachers have entrepreneurial or enterprise practical experience”.

The final result was 41 items.
Characteristics of the Sample for The Formal Study
Initially, a total of 769 students took part in the research after providing written informed consent. Due to duplicate responses and incomplete data, 19 participants were removed after team discussion, giving an attrition rate of 2.5% and a final sample of 750 participants. Among them, 53.6% were female (n = 402), and ages ranged from 19 to 22 years; 398 (53.1%) were second-year students and 352 (46.9%) were third-year students.
Item Analysis
Critical Ratios
Item analysis evaluates each question in a questionnaire to assess its ability to differentiate among participants and identify their psychological characteristics. A frequently employed metric is the critical ratio (CR). The primary steps are: aggregating scores for each fully answered item; calculating each participant’s total score on the scale; ranking these total scores from highest to lowest; and forming high and low groups from the top 27% and bottom 27% of participants, respectively (coded 1 for the high group and 2 for the low group). Independent-samples t-tests were then conducted to compare the two groups’ mean scores on each item; items that fail to achieve statistical significance are excluded (Li & Liu, 2018). The t-test results indicated that all items exhibited statistically significant differences between the two groups, with p-values less than 0.001.
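The critical-ratio procedure described above can be sketched in a few lines. This is an illustrative Python version (the study used SPSS), with the 27% tail size as a parameter:

```python
import numpy as np
from scipy import stats

def critical_ratios(scores, tail=0.27):
    """Critical-ratio item analysis: rank respondents by total scale
    score, form the top and bottom 27% groups, and run an
    independent-samples t-test on each item. Returns per-item
    t statistics and p-values."""
    scores = np.asarray(scores, dtype=float)
    cut = max(1, int(round(scores.shape[0] * tail)))
    order = np.argsort(scores.sum(axis=1))          # ascending by total score
    low, high = scores[order[:cut]], scores[order[-cut:]]
    return stats.ttest_ind(high, low, axis=0)
```

An item discriminates well when its t statistic is large and its p-value is below the chosen significance level.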
Pearson Correlation Coefficient
The correlation analysis between the scores of individual items and the overall questionnaire score yielded correlation coefficients ranging from 0.634 to 0.823. The values indicate that all items significantly contribute to the total score, thereby reinforcing the questionnaire’s internal consistency. A correlation coefficient exceeding 0.60 is typically deemed acceptable, while values surpassing 0.80 indicate exceptionally strong associations, affirming that the questionnaire items are well-aligned with the measured construct (Schober, Boer & Schwarte, 2018). This elevated internal consistency implies that the scale is reliable and effectively encapsulates the intended dimensions of entrepreneurship education curriculum evaluation, ensuring the instrument yields stable and meaningful results across various samples and applications.
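The item–total correlations can be computed directly; a minimal Python sketch (note that, matching the item-versus-overall-score analysis described, the total here includes the item itself; a stricter "corrected" variant would correlate each item with the total minus that item):

```python
import numpy as np

def item_total_correlations(scores):
    """Pearson correlation between each item and the total scale score,
    used to check that every item contributes to the overall measure."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    return np.array([np.corrcoef(scores[:, j], total)[0, 1]
                     for j in range(scores.shape[1])])
```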
Construct Validity
Construct validity assesses the extent to which a test accurately measures the theoretical construct or characteristics it aims to evaluate. It is typically evaluated using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). EFA is employed to explore dimensionality and is commonly used in the early stages of research to uncover patterns and relationships among a set of variables (Pituch & Stevens, 2016). CFA applies advanced methodologies during subsequent phases of the research to evaluate particular hypotheses or theoretical models concerning the fundamental structure of the variables (Hair et al., 2006; Pallant et al., 2010; Shrestha, 2021). These two methods are often used in tandem. When a solid theoretical foundation is absent, researchers first employ EFA to identify the underlying structure of the variables. Subsequently, they use CFA to assess how well the model derived from EFA fits the actual data (Li & Liu, 2018).
Exploratory Factor Analysis (EFA)
Prior to factor analysis, the KMO value and Bartlett’s test of sphericity were used to assess the sample’s suitability for factor analysis. For EFA to be valid, the Kaiser-Meyer-Olkin (KMO) measure should exceed 0.800, and Bartlett’s test of sphericity must be statistically significant at p < 0.05 (Kaiser & Rice, 1974). We performed these tests on the total scale and all four subscales; KMO values ranged from 0.882 to 0.94 (p < 0.001), indicating that the data were appropriate for exploratory factor analysis. Common factors were then extracted using principal component analysis and rotated with the varimax method (the EFA results are presented in Table 1).
Table 1: KMO value and Bartlett’s test results.
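Both statistics can be reproduced directly from the data matrix using the standard formulas: Bartlett’s chi-square from the determinant of the correlation matrix, and KMO as the ratio of squared correlations to squared correlations plus squared partial correlations. A sketch in Python (the study used SPSS):

```python
import numpy as np
from scipy import stats

def kmo_and_bartlett(x):
    """KMO measure and Bartlett's test of sphericity for an
    (n_respondents, n_items) data matrix."""
    x = np.asarray(x, dtype=float)
    n, p = x.shape
    r = np.corrcoef(x, rowvar=False)

    # Bartlett's test: compares the correlation matrix with identity.
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(r))
    df = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2, df)

    # KMO: partial correlations from the inverse correlation matrix.
    inv_r = np.linalg.inv(r)
    d = np.sqrt(np.outer(np.diag(inv_r), np.diag(inv_r)))
    partial = -inv_r / d
    off = ~np.eye(p, dtype=bool)
    kmo = (r[off] ** 2).sum() / ((r[off] ** 2).sum()
                                 + (partial[off] ** 2).sum())
    return kmo, chi2, p_value
```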
In the curriculum context subscale, based on the criterion that the eigenvalue must exceed 1, three factors were extracted, with a cumulative variance contribution of 78.965%. We named factor 1 “Curriculum Objectives”, factor 2 “Curriculum Orientation”, and factor 3 “Leadership and Management”. Any item with a factor loading below 0.4 should be deleted (Xu et al., 2024); accordingly, one item was deleted from the curriculum context subscale. In the curriculum input subscale, three factors were extracted under the same criterion, with a cumulative variance contribution of 73.741%; we named factor 1 “Curriculum Arrangement”, factor 2 “Teachers’ Quality”, and factor 3 “Curriculum Resources”. Two items had loadings below 0.4 and were deleted. In the curriculum process subscale, two factors were extracted, with a cumulative variance contribution of 79.208%; we named factor 1 “Curriculum Implementation” and factor 2 “Curriculum Operations Management”. Each item loaded above 0.40 on its respective dimension, so no items were removed.
In the curriculum product subscale, two factors were extracted under the eigenvalue-greater-than-1 criterion, with a cumulative variance contribution of 76.816%. We named factor 1 “Student Learning Outcomes” and factor 2 “Satisfactory Evaluation”. Each item loaded above 0.40 on its respective dimension, so no items were removed from this subscale. The final SEEEC structure thus consists of the curriculum context subscale (3 dimensions with 11 items), curriculum input subscale (3 dimensions with 13 items), curriculum process subscale (2 dimensions with 8 items), and curriculum product subscale (2 dimensions with 9 items), totaling 41 items.
Table 2: Curriculum context subscale matrix of rotated factor loadings.
| Domain | Item | Factor 1 | Factor 2 | Factor 3 | Communality |
|---|---|---|---|---|---|
| Curriculum Objectives | A1. The curriculum objectives are target-oriented. | 0.246 | 0.836 | 0.086 | 0.766 |
| | A2. Good fit between curriculum objectives and curriculum content. | 0.117 | 0.833 | 0.235 | 0.763 |
| | A3. The curriculum objectives meet the needs of professional talent development. | 0.318 | 0.805 | 0.226 | 0.801 |
| | A4. The curriculum objectives meet the needs of students. | 0.241 | 0.798 | 0.180 | 0.727 |
| Curriculum Orientation | B1. The curriculum is centered on meeting the development of students’ creative and entrepreneurial skills. | 0.823 | 0.269 | 0.207 | 0.792 |
| | B2. Integration of curriculum and professional field is good. | 0.839 | 0.287 | 0.209 | 0.831 |
| | B3. The curriculum contributes to the development of students’ creative spirit. | 0.834 | 0.274 | 0.088 | 0.779 |
| | B4. The curriculum enhances students’ entrepreneurial awareness. | 0.874 | 0.113 | 0.252 | 0.840 |
| Leadership and Management | C1. The college has established supporting documents for EE. | 0.341 | 0.159 | 0.827 | 0.825 |
| | C2. The college has established a specialized institution for implementing EE. | 0.132 | 0.147 | 0.862 | 0.782 |
| | C3. The college has established the curriculum system for EE. | 0.151 | 0.277 | 0.825 | 0.781 |

Rotation method: Varimax.
Table 3: Curriculum input subscale Matrix of rotated factor loadings.
| Domain | Item | Factor 1 | Factor 2 | Factor 3 | Communality |
|---|---|---|---|---|---|
| Curriculum Arrangement | D1. Reasonable ratio of theoretical and practical teaching. | 0.215 | 0.226 | 0.826 | 0.780 |
| | D2. Adequate teaching duration. | 0.276 | 0.148 | 0.821 | 0.773 |
| | D3. Reasonable class size for students. | 0.164 | 0.300 | 0.767 | 0.705 |
| | D4. Rational combination of compulsory and elective curriculum. | 0.159 | 0.293 | 0.807 | 0.762 |
| Teachers’ Quality | EE1. Teachers have extensive expertise in entrepreneurship education. | 0.274 | 0.760 | 0.274 | 0.727 |
| | EE2. Teachers have excellent classroom and organizational skills. | 0.209 | 0.835 | 0.229 | 0.795 |
| | EE3. Teachers are able to fulfill their teaching tasks seriously. | 0.174 | 0.813 | 0.182 | 0.724 |
| | EE4. Teachers have entrepreneurial or enterprise practical experience. | 0.130 | 0.810 | 0.259 | 0.740 |
| Curriculum Resources | F1. Enlightening curriculum content. | 0.816 | 0.270 | 0.261 | 0.807 |
| | F2. Abundance of online resources for the curriculum. | 0.762 | 0.199 | 0.291 | 0.704 |
| | F3. Laboratory and library resources can be shared with the curriculum. | 0.769 | 0.102 | 0.218 | 0.649 |
| | F4. Integration of the curriculum with innovation and entrepreneurship practice platforms. | 0.796 | 0.089 | 0.097 | 0.652 |
| | F5. Innovation and entrepreneurship activities. | 0.829 | 0.269 | 0.095 | 0.769 |

Rotation method: Varimax.
Table 4: Curriculum process subscale Matrix of rotated factor loadings.
| Domain | Item | Factor 1 | Factor 2 | Communality |
|---|---|---|---|---|
| Curriculum Implementation | G1. Teachers are able to effectively utilize a variety of teaching methods centered around content. | 0.156 | 0.863 | 0.769 |
| | G2. Teaching knowledge points can be transferred effectively. | 0.044 | 0.883 | 0.782 |
| | G3. Students have the opportunity to fully voice their opinions in classroom discussions. | 0.201 | 0.863 | 0.784 |
| | G4. Sufficient teacher-student interaction and active classroom atmosphere. | 0.247 | 0.848 | 0.780 |
| Curriculum Operations Management | H1. Curriculum software systems are well maintained and functioning. | 0.890 | 0.149 | 0.814 |
| | H2. Curriculum hardware is well maintained and functioning. | 0.888 | 0.056 | 0.793 |
| | H3. Online curricula are well run and managed. | 0.878 | 0.258 | 0.837 |
| | H4. Daily teaching and learning is well managed. | 0.859 | 0.196 | 0.777 |

Rotation method: Varimax.
Table 4: Curriculum process subscale Matrix of rotated factor loadings.
Domain | Item | Component | Commonality (common factor variance) | |
Factor 1 | Factor 2 | |||
Student Learning Outcomes | R1. Knowledge of innovation and entrepreneurship is enriched. | 0.799 | 0.222 | 0.688 |
R2. Enthusiasm for innovation and entrepreneurship is stimulated or enhanced. | 0.798 | 0.312 | 0.734 | |
R3. Innovative and entrepreneurial skills are developed. | 0.755 | 0.397 | 0.728 | |
R4. The curriculum contributes to the career development of students. | 0.768 | 0.357 | 0.717 | |
R5. Increased willingness of students to participate in entrepreneurship competitions. | 0.869 | 0.288 | 0.839 | |
Satisfaction Evaluation | S1. Satisfaction with the curriculum program. | 0.279 | 0.862 | 0.822
S2. Satisfaction with educational resources. | 0.316 | 0.822 | 0.776 |
S3. Satisfaction with curriculum teaching. | 0.335 | 0.835 | 0.810 |
S4. Satisfaction with the way student performance is assessed. | 0.333 | 0.831 | 0.801 |
Rotation method: Varimax (maximum variance). |
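Under an orthogonal (Varimax) rotation, an item's communality equals the sum of its squared factor loadings, which offers a quick arithmetic consistency check on the tables above. The short sketch below is an illustration only (values transcribed from the rotated-loading tables, not the study's analysis code):

```python
# Spot-check: communality = sum of squared loadings for an orthogonal rotation.
# Tuples are (factor 1 loading, factor 2 loading, reported communality),
# transcribed from the rotated-loading tables above.
rows = {
    "G1": (0.156, 0.863, 0.769),
    "H1": (0.890, 0.149, 0.814),
    "R5": (0.869, 0.288, 0.839),
    "S1": (0.279, 0.862, 0.822),
}

for item, (f1, f2, reported) in rows.items():
    communality = f1 ** 2 + f2 ** 2
    # Agreement to roughly 3 decimals, allowing for rounding in the tables
    assert abs(communality - reported) < 0.005, (item, communality)
    print(f"{item}: computed {communality:.3f}, reported {reported}")
```

Each computed value matches the reported communality to within rounding, as expected for a Varimax-rotated solution.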
Confirmatory Factor Analysis (CFA)
The model fit of the total scale and the four subscales of the EE curriculum evaluation scale for higher vocational institutions was satisfactory. The model results are shown in Table 6. The SEEEC path analysis diagram of the scale is shown in Figure 2.
Table 6: Fit of the structural equation model.
Indicator name | CMIN/DF | RMSEA | CFI | NFI | IFI | RFI | TLI | RMR |
total scale | 2.576 | 0.046 | 0.942 | 0.909 | 0.942 | 0.902 | 0.938 | 0.036 |
curriculum context subscale | 4.22 | 0.066 | 0.972 | 0.964 | 0.972 | 0.951 | 0.962 | 0.032 |
curriculum input subscale | 2.936 | 0.051 | 0.981 | 0.971 | 0.981 | 0.964 | 0.976 | 0.028 |
curriculum process subscale | 3.196 | 0.054 | 0.988 | 0.983 | 0.988 | 0.975 | 0.983 | 0.024 |
curriculum product subscale | 2.474 | 0.044 | 0.991 | 0.985 | 0.991 | 0.98 | 0.988 | 0.018 |
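The values in Table 6 can be screened against commonly cited cutoffs (e.g., CMIN/DF < 5, RMSEA < 0.08, comparative indices CFI/NFI/IFI/RFI/TLI > 0.9, RMR < 0.05; exact cutoff choices vary across the SEM literature). A small illustrative sketch, with the values transcribed from Table 6:

```python
# Fit indices transcribed from Table 6; cutoffs follow common SEM guidance
# (they are conventions, not fixed rules).
fits = {
    "total scale":      dict(cmin_df=2.576, rmsea=0.046, cfi=0.942, nfi=0.909, ifi=0.942, rfi=0.902, tli=0.938, rmr=0.036),
    "context subscale": dict(cmin_df=4.220, rmsea=0.066, cfi=0.972, nfi=0.964, ifi=0.972, rfi=0.951, tli=0.962, rmr=0.032),
    "input subscale":   dict(cmin_df=2.936, rmsea=0.051, cfi=0.981, nfi=0.971, ifi=0.981, rfi=0.964, tli=0.976, rmr=0.028),
    "process subscale": dict(cmin_df=3.196, rmsea=0.054, cfi=0.988, nfi=0.983, ifi=0.988, rfi=0.975, tli=0.983, rmr=0.024),
    "product subscale": dict(cmin_df=2.474, rmsea=0.044, cfi=0.991, nfi=0.985, ifi=0.991, rfi=0.980, tli=0.988, rmr=0.018),
}

def acceptable(f):
    """Screen one row of fit indices against conventional cutoffs."""
    comparative_ok = all(f[k] > 0.9 for k in ("cfi", "nfi", "ifi", "rfi", "tli"))
    return f["cmin_df"] < 5 and f["rmsea"] < 0.08 and comparative_ok and f["rmr"] < 0.05

for name, f in fits.items():
    print(name, "->", "acceptable" if acceptable(f) else "check")
```

All five rows clear every cutoff, consistent with the paper's conclusion that the model fit is satisfactory.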
Figure 2: SEEEC Path Analysis Diagram of Scale Simulation.
Content Validity
Fourteen domain experts were invited to evaluate the content validity of the scale items. The assessment employed a 4-point rating system (1 = irrelevant, 2 = weakly relevant, 3 = comparatively relevant, 4 = highly relevant), and experts could also propose revisions for ambiguous items. The standards for content validity were as follows: the item-level content validity index (I-CVI) should exceed 0.78, and the scale-level content validity index (S-CVI) should meet two criteria, a universal agreement rate (S-CVI/UA) above 0.8 and an average value (S-CVI/Ave) above 0.9 (Sousa & Rojjanasrirat, 2011; Xu et al., 2024). Statistical analysis revealed that the scale achieved an S-CVI/UA of 0.902 and an S-CVI/Ave of 0.993, with all item-level I-CVI values ranging from 0.929 to 1.000. All indices surpassed the recommended thresholds, demonstrating excellent content validity of the scale.
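These indices are computed directly from the expert rating matrix: an item counts as "relevant" when rated 3 or 4 on the 4-point scale. The sketch below uses hypothetical ratings (not the study's raw data) to illustrate the definitions of I-CVI, S-CVI/UA, and S-CVI/Ave:

```python
# Hypothetical ratings for illustration only: rows = items, columns = 14 experts.
# A rating of 3 or 4 counts as "relevant".
ratings = [
    [4, 4, 3, 4, 4, 4, 3, 4, 4, 4, 4, 3, 4, 4],  # every expert rates relevant
    [4, 3, 4, 4, 2, 4, 4, 4, 3, 4, 4, 4, 4, 4],  # one expert rates "2"
]

# I-CVI: proportion of experts rating each item relevant
i_cvi = [sum(r >= 3 for r in row) / len(row) for row in ratings]

# S-CVI/UA: proportion of items that ALL experts rated relevant
s_cvi_ua = sum(cvi == 1.0 for cvi in i_cvi) / len(i_cvi)

# S-CVI/Ave: mean of the item-level I-CVIs
s_cvi_ave = sum(i_cvi) / len(i_cvi)

print([round(c, 3) for c in i_cvi])  # [1.0, 0.929] (second item: 13/14)
print(s_cvi_ua)                      # 0.5
print(round(s_cvi_ave, 3))           # 0.964
```

With 14 experts, a single rating below 3 drops an item's I-CVI to 13/14 ≈ 0.929, exactly the lower bound reported for the final scale.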
Reliability Analysis
The Cronbach’s α coefficients of the total scale and the subscales of the SEEEC range from 0.877 to 0.930, suggesting that the scale exhibits strong internal consistency. The uniformity of all subscales within this elevated range indicates that the items in each subscale effectively assess the same underlying construct, minimizing the probability of random error (Hajjar & Taan, 2014). This robust internal consistency confirms that the SEEEC is a reliable instrument for evaluating the quality of entrepreneurship education curricula at higher vocational institutions, offering precise and consistent measurements suitable for further research and practical implementation.
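For reference, Cronbach's α for k items is α = k/(k−1) · (1 − Σσᵢ²/σ_X²), where σᵢ² are the item variances and σ_X² is the variance of the summed score. A minimal Python sketch with toy data (not the study's responses):

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists
    (one inner list per item, one entry per respondent)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance; the n divisors cancel in the ratio
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Toy example: 3 items answered by 4 respondents on a 5-point scale.
items = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 4, 3, 4],
]
print(f"{cronbach_alpha(items):.3f}")  # 0.750
```

In practice such coefficients are usually obtained from statistical software (e.g., SPSS's reliability analysis); the function above only makes the underlying formula explicit.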
DISCUSSION
Significance Of SEEEC
This research aims to evaluate the quality of EE curriculum from the perspective of students by applying the CIPP model. Student evaluations serve a vital role in measuring the effectiveness and overall quality of any educational curriculum, especially in entrepreneurship education (Martin, 2019). By concentrating on student feedback, this research aims to answer several key questions: How effective is the implementation of the curriculum? How do students perceive the curriculum? Most importantly, what knowledge, skills, and experiences do students acquire through this curriculum? These questions constitute the ultimate criteria for evaluating the quality of the curriculum. A well-designed and effectively implemented curriculum should not only impart theoretical knowledge but also enable students to acquire practical skills and real-world experiences applicable in the entrepreneurial environment. Insights gained from student evaluations provide valuable feedback for the continuous improvement and optimization of the curriculum. One of the primary benefits of the CIPP model is its ability to support a sustainable, closed-loop, and iterative evaluation framework. This approach ensures that the evaluation process is continuous and iterative, allowing for ongoing assessment and improvement.
The CIPP model addresses several common challenges in educational evaluation. First, it enhances objectivity. Traditional student evaluations sometimes lack objectivity because of various biases or subjective opinions; the CIPP model mitigates this issue by integrating multiple data sources and perspectives to ensure a more balanced and comprehensive evaluation. Second, it reduces data requirements. Collecting large amounts of data can be resource-intensive and time-consuming; the CIPP model simplifies the evaluation process and reduces the need for excessive data collection while still providing meaningful insights. Third, it simplifies processing and analysis. Analysing large data sets can be complex and challenging; the CIPP model provides tools and frameworks that simplify the processing and analysis of information, making it easier to identify trends, patterns, and areas for improvement (Chen et al., 2020).
Additionally, the CIPP model can provide a precise evaluation of the quality of an entrepreneurship curriculum in aspects such as content relevance, teaching methods, and student engagement. It can also identify specific problems in curriculum implementation based on quality scores and propose reasonable improvement suggestions. For example, if certain modules or activities are found to be ineffective, the CIPP model can point out these weaknesses and recommend targeted interventions to enhance the overall learning experience (Liu, 2022; Zhao, 2024). In summary, this research uses the CIPP model to conduct a comprehensive and systematic evaluation of the EE curriculum from the perspective of students. By addressing key evaluation questions and leveraging the advantages of the CIPP framework, this study seeks to offer practical recommendations that foster significant enhancements in the quality of entrepreneurship education.
The Scientific Nature Of SEEEC
This study designed a comprehensive scale grounded in the CIPP model, with a development process adhering to the principles of practicality and rigor. The aim was to create a scientifically robust instrument that could effectively evaluate the various aspects of the curriculum. To achieve this, the research followed established methodologies from the existing literature, particularly the systematic steps outlined by Churchill (1979) and Anderson and Gerbing (1982): scale conceptualization, followed by scale refinement and purification, and finally scale validation and testing (Dastane et al., 2023). A literature review, expert assessment, pilot test, and formal study together ensured the soundness of each item of the scale. The invited experts came from three different provinces; their engagement with the research was evident both in the insights they provided and in the questionnaire response rate.
The response rate of the experts’ opinions in this research was 100%. The initial screening of scale items by the experts was based on the scores from the two rounds of the expert survey as well as the experts’ suggestions on adding, deleting, modifying, and combining items. In addition, the results of the pilot test were further refined through item analysis and factor analysis to validate the reliability and validity of the scale. The final SEEEC scale structure consists of the curriculum context subscale (3 dimensions with 11 items), curriculum input subscale (3 dimensions with 13 items), curriculum process subscale (2 dimensions with 8 items), and curriculum product subscale (2 dimensions with 9 items), totaling 41 items.
Reliability and Validity Testing Of SEEEC
The SEEEC exhibits robust reliability and validity, as evidenced by rigorous testing and analysis. Validity indicates how well a measurement tool reflects the intended construct, whereas reliability concerns the consistency and accuracy of the obtained data, along with the tool’s effectiveness in reducing random error (Ahmed & Ishtiaq, 2021). Cronbach’s alpha was employed to assess the internal consistency of the scale. Content validity was confirmed through calculation of the I-CVI and the S-CVI (Yao, 2024). EFA and CFA were employed to test the structural validity of the instrument. The reliability analysis revealed that Cronbach’s alpha coefficients for all domains ranged from 0.888 to 0.953, indicating high internal consistency and satisfactory reliability of the SEEEC. The SEEEC model, based on the CIPP framework, also exhibits robust validity: according to established criteria, the I-CVI should exceed 0.78, and when more than five experts are involved, the S-CVI should be at least 0.80 to ensure good content validity (Sousa & Rojjanasrirat, 2011; Xu et al., 2024).
The I-CVI scores for the 41 items and the S-CVI for the final scale met these standards, confirming the overall content validity of the scale. Furthermore, the cumulative variance contribution of the extracted factors was 85.284%, with each factor having sufficient loadings (>0.4), indicating meaningful item-factor relationships (Shrestha, 2021). The confirmatory factor analysis indicated that the CFI, NFI, RFI, IFI, and TLI indices were all greater than 0.9, within the acceptable range (Lu et al., 2023; Zhao et al., 2024). In summary, the SEEEC has been rigorously evaluated and validated, demonstrating both high reliability and strong validity, making it a robust tool for assessing entrepreneurship education.
This research encountered several limitations, including constraints on human resources and time, the use of convenience sampling, and the reliance on online questionnaires, which posed challenges for quality control. Additionally, the sample was drawn from only part of China, potentially limiting its representativeness. Future research should employ larger and more diverse samples to improve the reliability of the findings, and methodologies will need ongoing adjustment to effectively evaluate educational quality and adapt to the evolving nature of the EE curriculum. To address these limitations, follow-up research could incorporate multiple data collection methods and recruit participants from various regions to ensure broader representation and more generalizable findings.
CONCLUSION
This research developed a scale based on the CIPP model to measure the quality of the EE curriculum. The SEEEC consists of the curriculum context subscale (3 dimensions with 11 items), curriculum input subscale (3 dimensions with 13 items), curriculum process subscale (2 dimensions with 8 items), and curriculum product subscale (2 dimensions with 9 items), totaling 41 items. The construct reliability of the scale developed using the CIPP model was found to be satisfactory. The evaluation scale based on the CIPP framework is practical and offers a valuable, user-friendly tool for the scientific assessment of the EE curriculum in higher vocational colleges.
REFERENCES
- Ahmed, I., & Ishtiaq, S. (2021). Reliability and validity: Importance in medical research. Methods, 12(1), 2401-2406.
- Akhtar, S., Nawaah, D., & Jafar, S. H. (2024). Ethical empowerment: examining the impact of business education in Indian universities through the lens of international standards and CIPP model. the International Journal of Management Education, 22(3), 101066.
- Anderson, J. C., & Gerbing, D. W. (1982). Some methods for respecifying measurement models to obtain unidimensional construct measurement. Journal of Marketing Research, 19(4), 453-460. https://doi.org/10.2307/3151719
- Bae, T. J., Qian, S., Miao, C., & Fiet, J. O. (2014). The relationship between EE and entrepreneurial intentions: A meta-analytic review. Entrepreneurship Theory and Practice, 38(2), 217-254.
- Bechard, J. P., & Gregoire, D. (2005). Entrepreneurial education research revisited: The case of higher education. Academy of Management Learning & Education, 4, 22-43. https://doi.org/10.5465/amle.2005.1613253
- Bharvad, A. J. (2010). Curriculum evaluation. International Research Journal, 1(12), 72-74.
- Byun, C.-G., Sung, C., Park, J., & Choi, D. (2018). A study on the effectiveness of EE programs in higher education institutions: A case study of Korean graduate programs. Journal of Open Innovation: Technology, Market, and Complexity, 4, 26.
- Chen, J., Li, L., Zheng, X., Wang, Y., Fu, W., Li, C., Wang, Y., & Li, H. (2020). The construction of structural equations for evaluation indexes of classroom teaching quality in colleges and universities. China Continuing Medical Education, 12(8), 58-61.
- Churchill, G. A. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16(1), 64. https://doi.org/10.2307/3150876
- Dastane, O., Goi, C. L., & Rabbanee, F. K. (2023). The development and validation of a scale to measure perceived value of mobile commerce (MVAL-SCALE). Journal of Retailing and Consumer Services, 71, 103222.
- Fan, X., Tian, S., Lu, Z., & Cao, Y. (2022). Quality evaluation of EE in higher education based on CIPP model and AHP-FCE methods. Frontiers in Psychology, 13, 973511.
- Hadi, S., Abbas, E. W., & Rajiani, I. (2022, October). Should spirituality be included in EE program curriculum to boost students’ entrepreneurial intention? In Frontiers in Education (Vol. 7, p. 977089). Frontiers Media SA.
- Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis. Upper Saddle River, NJ.
- Hakan, K., & Seval, F. (2011). CIPP evaluation model scale: Development, reliability and validity. Procedia-Social and Behavioural Sciences, 15, 592-599.
- Hajjar, S. T. E., & Taan, E. (2014). A statistical study to develop a reliable scale to evaluate instructors within higher institutions. WSEAS Transactions on Mathematics, 13, 885-894.
- Hale, L., & Adhia, D. B. (2022). The continuous feedback model: Enabling student contribution to curriculum evaluation and development. Focus on Health Professional Education: A Multi-Professional Journal, 23(1), 17-36.
- Hameed, I., & Irfan, Z. (2019). EE: A review of challenges, characteristics and opportunities. Entrepreneurship Education, 2, 135-148. https://doi.org/10.1007/s41959-019-00018-z
- Hu, D., & Chen, R. (2021). Development paths, practical challenges and improvement strategies of higher vocational colleges under the background of the “Double First-Class Initiative”. Modern Educational Management, 15(12), 104.
- Hu, Y., Wang, Z., 2022. Nursing Research, 6th ed. People’s Health Publishing House, Beijing.
- Ifarajimi, M. A. (2023). The role of curriculum evaluation and feedback in improving teaching and learning quality in Nigeria. Sapientia Foundation Journal of Education, Sciences and Gender Studies, 5(2).
- Ikart, E.M., 2019. Survey questionnaire survey pretesting method: an evaluation of survey questionnaire via expert reviews technique. Asian J. Social Sci. Stud. 4 (2). https://doi.org/10.20849/ajsss.v4i2.565. Article e1.
- Kaiser, H. F., & Rice, J. (1974). Little Jiffy, Mark IV. Educational and Psychological Measurement, 34, 111-117. https://doi.org/10.1177/001316447403400115
- Li, Z., Liu, Y. Research Methods in Nursing. 2nd edition. 2018, Beijing: People’s Health Publishing House.
- Liu, R. (2022). Optimization model of mathematics instructional mode based on deep learning algorithm. Computational Intelligence and Neuroscience, 2022, 1817990. https://doi.org/10.1155/2022/1817990
- Lu, X., Wang, L., Xu, G., Teng, H., Li, J., & Guo, Y. (2023). Development and initial validation of the psychological capital scale for nurses in Chinese context. BMC Nursing, 22(1), 28. https://doi.org/10.1186/s12912-022-01148-x
- Mahmudin, T. (2023). The importance of EE in preparing the young generation to face global economic challenges. Journal of Contemporary Administration and Management (ADMAN), 1(3), 187-192.
- Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: curriculum design, assessment and evaluation, and facilitation. the Internet and Higher Education, 42, 34-43.
- Martínez-Gregorio, S., Badenes-Ribera, L., & Oliver, A. (2021). Effect of EE on entrepreneurship intention and related outcomes in educational contexts: a meta-analysis. the International Journal of Management Education, 19(3), 100545.
- Nieuwenhuizen, C., Groenewald, D., Davids, J., Van Rensburg, L. J., & Schachtebeck, C. (2016). Best practice in EE. Problems and Perspectives in Management, 14, 528-536.
- Olsen, A. A., Wolcott, M. D., Haines, S. T., Janke, K. K., & McLaughlin, J. E. (2021). How to use the Delphi techniques to aid in decision making and build consensus in pharmacy education. Currents in Pharmacy Teaching and Learning, 13(10), 1376-1385.
- Pallant, J. (2010). SPSS survival manual: A step-by-step guide to data analysis using SPSS. Maidenhead: Open University Press/McGraw-Hill.
- Pituch, K. A., & Stevens, J. (2016). Applied multivariate statistics for the social sciences: Analyses with SAS and IBM’s SPSS (6th ed.). New York: Taylor & Francis.
- Raharjo, I. B., Ausat, A. M. A., Risdwiyanto, A., Gadzali, S. S., & Azzaakiyyah, H. K. (2023). Analysing the relationship between EE, self-efficacy, and entrepreneurial performance. Journal on Education, 5(4), 11566-11574.
- Schober, P., Boer, C., & Schwarte, L. A. (2018). Correlation coefficients: appropriate use and interpretation. Anesthesia & analgesia, 126(5), 1763-1768.
- Shrestha, N. (2021). Factor analysis as a tool for survey analysis. American Journal of Applied Mathematics and Statistics, 9(1), 4-11.
- Singh, M. D. (2004). Evaluation framework for nursing education programs: Application of the CIPP model. International Journal of Nursing Education Scholarship, 1(1).
- Sousa, V. D., & Rojjanasrirat, W. (2011). Translation, adaptation and validation of instruments or scales for use in cross-cultural health care research: A clear and user-friendly guideline. Journal of Evaluation in Clinical Practice, 17(2), 268-274. https://doi.org/10.1111/j.1365-2753.2010.01434.x
- Stufflebeam, D. L. (2003). The CIPP model for evaluation. In D. L. Stufflebeam & T. Kellaghan (Eds.), The international handbook of educational evaluation. Kluwer Academic Publishers.
- Sun, H., Wang, Y., Cai, H., Wang, P., Jiang, J., Shi, C., … & Hao, Y. (2024). The development of a performance evaluation index system for Chinese Centers for Disease Control and Prevention: a Delphi consensus study. Global Health Research and Policy, 9(1), 28.
- Warju, W. (2016). Educational program evaluation using CIPP model. Innovation of Vocational Technology Education, 12(1), 36-42. https://doi.org/10.17509/invotec.v12i1.4502
- Wu X, Chen Y, Zhang J, Wang Y. On improving higher vocational college education quality assessment. Physics Procedia. 2012 Jan 1; 33:1128-32.
- Xu, H., Wu, S., Jiang, J., Wu, Y., Wang, X., Gao, G., … & Wang, Y. (2024). Translation of the Patient-reported Outcomes Measure of Pharmaceutical Therapy for Quality of Life and Its Validation in Elderly Patients with Polypharmacy. Chinese General Practice, 27(05), 612.
- Yao, S., Ma, Z., Shi, Y., Wu, Y., Zhang, L., Chen, M., … & Cheng, F. (2024). Development and Reliability and Validity Test of Cardiotoxicity Risk Assessment Scale for Breast Cancer Patients Undergoing Chemotherapy. Chinese General Practice, 27(27), 3428.
- Zhao, X., and Zheng, C. (2021). Fuzzy evaluation of physical education teaching quality in colleges based on analytic hierarchy process. Int. J. Emerg. Technol. Learn. 16, 217 -230. doi: 10.3991/ijet.v16i06.21097
- Zhao, Y., Li, W., Jiang, H., Siyiti, M., Zhao, M., You, S., … & Yan, P. (2024). Development of a blended teaching quality evaluation scale (BTQES) for undergraduate nursing based on the Context, Input, Process and Product (CIPP) evaluation model: A cross-sectional survey. Nurse Education in Practice, 77, 103976.
- Zhou, R., Rashid, S. M., & Cheng, S. (2024). EE in Chinese higher institutions: Challenges and strategies for vocational colleges. Cogent Education, 11(1), 2375080