
Uncovering Insights from Teacher Feedback: A Sentiment Analysis Study at Cronasia Foundation College, Inc

Rosso Cuyos

Cronasia Foundation College, Inc

DOI: https://doi.org/10.51244/IJRSI.2025.12060017

Received: 15 May 2025; Accepted: 26 May 2025; Published: 28 June 2025

ABSTRACT

The integration of Information and Communication Technology (ICT) in education to improve information management, learning activities, and data-based decision making is essential in the digital age. Using sentiment analysis, this study analyzes teacher feedback from Cronasia Foundation College, Inc. to uncover emotional trends and critical institutional issues. Feedback was classified as positive, neutral, or negative using natural language processing (NLP) and machine learning techniques and analyzed for common themes. Results show largely positive reactions associated with advising and curriculum, alongside continued concerns over workload, student engagement, and administrative support. These findings provide practical insights for institutional planning and illustrate the value of sentiment analysis in crafting responsive, data-informed policies in higher education.

Keywords: Sentiment analysis, teacher feedback, higher education, natural language processing, faculty experience, educational quality.

INTRODUCTION

The evolution of digital education depends critically on the effective use of Information and Communication Technology (ICT) tools to enhance organizational efficiency, the quality of learning environments, and evidence-based decision making [1]. Emerging technologies such as artificial intelligence, big data analytics, and cloud computing have empowered institutions to extract actionable insights that can be used for continuous improvement [2]. Notwithstanding this progress, many educational institutions remain ill-equipped to align their information systems with strategic planning goals [3].

Cronasia Foundation College, Inc. (CFCI), an educational institution in the Philippines, embarked on a transformative journey after recognizing the need for digital innovation in education. Although the college had computerized both academic and administrative workflows, its feedback mechanism remained largely manual, limiting the college’s ability to systematically analyze teacher feedback. Because teacher feedback is paramount to the development of institutional strategy, this study lays out a conceptual framework that applies sentiment analysis as an AI-enabled approach to extract insightful relationships from unstructured data [4].

Implemented within the college’s Information System Strategic Plan (ISSP), this research uses NLP and machine learning to extract trends and concerns from teacher feedback. CFCI provides an ideal environment to apply these techniques because it has already committed to ICT modernization and because this study can assist in institutional planning and contribute to academic literature with respect to sentiment analysis within an educational setting [5].

Teachers learn a great deal from the feedback they receive, and the same is true of the students they instruct [6][7]. Recent applications of sentiment analysis have shown promising results by classifying feedback objectively rather than relying solely on qualitative methods [8][9]. Within this framework, combining these techniques with theories such as Communication Accommodation Theory (CAT) and Feedback Intervention Theory (FIT) offers researchers a range of models in which sentiment data can serve as an informative measure and an input to adaptive learning procedures [10][11]. Such an explanatory model of learning blends traditional accounts of teaching (e.g., Piaget and Dewey) with advanced computational learning processes (e.g., POMDPs) [12][13].

Existing literature emphasizes the usefulness of text sentiment analysis for handling feedback related to students and teachers in both online and traditional environments, through tools such as deep learning, word embeddings, and transformer models like BERT and RoBERTa [14][15][16][17]. Some advanced approaches further combine video, audio, and transcripts to analyze the classroom in real time [18]. Patented technologies (e.g., US9208502B2, US7996210B2) demonstrate scalable sentiment tracking across different domains, underscoring its applicability to educational contexts [19][20].

For instance, the e-khool LMS is a platform that relies on sentiment analysis techniques to predict learner performance, recommend courses, and support emotion-aware classroom documentation [21]. Other innovations automate semantic annotation of intellectual property documents, suggesting further ways to enrich education systems with AI [22].

In this research, we employ sentiment analysis on feedback provided by teachers at CFCI to explore sentiments and trends, helping the institution rethink its practices, improve pedagogy, and make data-driven decisions that matter in both the short term and the long term.

SENTIMENT ANALYSIS METHODOLOGY

Conceptual Framework of the Sentiment Analysis Method

Fig. 1 Conceptual Framework of the Sentiment Analysis Method

Methodological Approach

Employing a mixed-methods design, the present research examined perceptions manifested in teacher evaluations at Cronasia Foundation College, Inc. (CFCI), leveraging both quantitative data and qualitative sentiment approximation techniques to find trends and themes across feedback categories [23]. Natural language processing (NLP) approaches were used to approximate sentiment scores from the Likert-scale responses.

Dataset Source

The dataset is based on structured responses from CFCI’s Faculty Evaluation System. Each review has a timestamp and records responses to multiple items under five themes: Classroom Environment, Teaching Methods, Communication, Feedback and Assessment, and Overall Experience. Items in these categories are scored on a Likert-type scale from 1 (Strongly Disagree) to 5 (Strongly Agree). Together, these areas provide a well-rounded view of teaching quality, including the inclusiveness of the classroom environment, the motivational quality of instruction, and the fairness of assessment [24][25][26].

Data Understanding and Preprocessing

The dataset contains 132 records and 26 columns, each representing an aspect of teaching performance. During preprocessing, missing values were handled by imputation or exclusion to ensure data quality. Although there are no open-ended text responses, the Likert-scale scores were classified using an NLP-inspired sentiment mapping: ratings of 1–2 were treated as Negative, 3 as Neutral, and 4–5 as Positive. This made a simulated sentiment trend possible even in the absence of qualitative data.
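The mapping described above (1–2 → Negative, 3 → Neutral, 4–5 → Positive) can be sketched as a small helper function; the example ratings are illustrative, not drawn from the actual dataset:

```python
def likert_to_sentiment(rating: int) -> str:
    """Map a 1-5 Likert rating to a sentiment label:
    1-2 -> Negative, 3 -> Neutral, 4-5 -> Positive."""
    if rating <= 2:
        return "Negative"
    if rating == 3:
        return "Neutral"
    return "Positive"

# Classify one respondent's item ratings (illustrative values)
ratings = [5, 4, 3, 2, 5]
labels = [likert_to_sentiment(r) for r in ratings]
print(labels)  # ['Positive', 'Positive', 'Neutral', 'Negative', 'Positive']
```

Applying this function to every item column yields the per-category sentiment counts reported in Table 2.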

Sentiment Scoring

Table 1 shows the sentiment scores for the first five respondents across the five domains analyzed. Respondents 4 and 5 gave perfect scores throughout, indicating high satisfaction across the board. This perception was corroborated by Respondent 2, whose responses consistently recorded high positive sentiment, with ratings from 4.6 to 5.0. In contrast, Respondent 1 had comparatively lower scores, especially for Teaching Methods (3.8), indicating moderate satisfaction. Respondent 3 kept scores at a consistent 4.0 across all five domains, indicating even, moderate endorsement.

Table 2 summarizes sentiment classifications across all respondents. Responses were overwhelmingly positive, with 104–107 positive classifications per category. Neutral sentiments ranged from 9 to 12 per category, and negative sentiments from 16 to 17. This skewed distribution across the various dimensions highlights a broadly positive view of faculty effectiveness.

Table 1 Sentiment Scores (First 5 Respondents)

Respondent Classroom Environment Teaching Methods Communication Feedback & Assessment Overall Experience
1 4.2 3.8 4.4 4.0 4.0
2 4.6 4.6 5.0 5.0 4.6
3 4.0 4.0 4.0 4.0 4.0
4 5.0 5.0 5.0 5.0 5.0
5 5.0 5.0 5.0 5.0 5.0
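The per-respondent domain scores in Table 1 are averages of item ratings within each domain. A minimal sketch of that computation follows; the domain/item structure is illustrative (the real instrument has 26 columns), though the example values reproduce Respondent 1's first two scores:

```python
# One respondent's raw item ratings, grouped by evaluation domain
# (illustrative structure and values)
respondent_items = {
    "Classroom Environment": [4, 5, 4, 4, 4],
    "Teaching Methods": [4, 4, 3, 4, 4],
}

# Average the item ratings within each domain, rounded to one decimal
domain_scores = {
    domain: round(sum(items) / len(items), 1)
    for domain, items in respondent_items.items()
}
print(domain_scores)  # {'Classroom Environment': 4.2, 'Teaching Methods': 3.8}
```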

Table 2 Sentiment Counts (All Respondents)

Sentiment Classroom Environment Teaching Methods Communication Feedback & Assessment Overall Experience
Positive 105 106 104 107 104
Neutral 10 9 12 9 12
Negative 17 17 16 16 16

Aggregation and Visualization

The aggregated results show solid overall faculty performance, with average category scores clustering between 4.0 and 5.0. The Teaching Effectiveness domain received the highest average ratings, reflecting general student satisfaction with learning outcomes and instructional impact. The Classroom Environment and Communication categories followed, sharing the same score and emphasizing efficient management of classroom dynamics as well as open interaction between faculty and students.

Teaching Methods scored slightly lower than these categories but remained positive, leaving room to diversify instruction and connect it to real-world contexts. Feedback and Assessment garnered the lowest average score but still stayed above 4.0, suggesting that students generally see assessments as timely and constructive, with opportunities to continue strengthening feedback mechanisms.
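The category-level aggregation behind Fig. 2 and Table 2 combines two summaries per category: the mean rating and the count of each sentiment label. A sketch with made-up ratings (the real dataset has 132 respondents):

```python
from collections import Counter

# Illustrative per-category item ratings pooled across respondents
category_ratings = {
    "Communication": [5, 4, 3, 2, 5, 4],
    "Teaching Methods": [4, 4, 3, 5, 4, 2],
}

summary = {}
for category, ratings in category_ratings.items():
    # Mean rating for the category, rounded to two decimals
    average = round(sum(ratings) / len(ratings), 2)
    # Sentiment counts using the 1-2/3/4-5 mapping
    counts = Counter(
        "Positive" if r >= 4 else "Neutral" if r == 3 else "Negative"
        for r in ratings
    )
    summary[category] = (average, dict(counts))

print(summary)
```

Plotting the per-category averages from `summary` produces a bar chart like Fig. 2.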

Fig. 2 Average Faculty Evaluation Scores by Question

Simulated Sentiment and Interpretive Analysis

Because the dataset lacked free-text data, traditional text analysis methods (e.g., tokenization or BERT classification) could not be applied; the sentiment scoring approach nevertheless yielded useful insights. Through the sentiment approximation mechanism, each rating was converted to a scale from -1 (very negative) to +1 (very positive), enabling deeper analysis by category. Simulated sentiment scores averaged 0.655 to 0.684, indicating strong positivity. Feedback and Assessment recorded the highest sentiment score and a high average rating, indicating that students value clarity and fairness in evaluation. Communication was also highly rated, reinforcing the emphasis on effective teacher-student interaction. The uniformity of scores across all dimensions indicates a well-rounded learning environment with emotional affirmation.
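The exact -1 to +1 conversion is not spelled out in the text; one reading consistent with Tables 2 and 3 is to map each item rating to a polarity of -1, 0, or +1 (using the same thresholds as the sentiment labels) and then average. A sketch under that assumption:

```python
def rating_to_polarity(rating: int) -> int:
    """Map a Likert rating to polarity: 1-2 -> -1, 3 -> 0, 4-5 -> +1."""
    return -1 if rating <= 2 else (0 if rating == 3 else 1)

# Illustrative distribution echoing Table 2's Classroom Environment counts:
# 105 positive, 10 neutral, and 17 negative items out of 132
item_ratings = [5] * 105 + [3] * 10 + [1] * 17

avg_sentiment = sum(rating_to_polarity(r) for r in item_ratings) / len(item_ratings)
print(round(avg_sentiment, 3))  # 0.667
```

The result (0.667) lands close to the 0.655 reported for Classroom Environment in Table 3, which supports this reading of the approximation.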

Table 3 Sentiment-simulated mixed-method analysis

Category Average Rating Average Sentiment
Classroom Environment 4.147 0.655
Teaching Methods 4.142 0.670
Communication 4.191 0.679
Feedback and Assessment 4.194 0.684
Overall Experience 4.166 0.675

Fig. 3 Average Rating and Sentiment Score per Category

CONCLUSION

This study demonstrated the potential of structured data analysis with sentiment approximation to derive deeper insights from educational experiences. Although qualitative textual input was absent, applying sentiment mapping principles still produced a meaningful assessment of faculty performance at Cronasia Foundation College, Inc.

Results reveal a clear picture of respondents who are positive overall about teaching effectiveness, with the highest satisfaction relating to overall experience, assessment practices, and communication. Teaching methods scored slightly lower and remain a potential priority for instructional interventions. The consistently positive sentiment across all categories is indicative of a student-focused culture that encourages both academic engagement and emotional support.

In closing, sentiment analysis, even when applied to structured data as illustrated here, provides an additional tool set for institutional planning and for improving the teaching enterprise. CFCI’s faculty is well-positioned to build on these strengths and continue striving for excellence in student-centered learning.

REFERENCES

  1. F. Mahmud, M. Orthi, M. Saimon, M. Moniruzzaman, M. Alamgir, A. Miah, … & G. Manik. Big Data and Cloud Computing in IT Project Management: A Framework for Enhancing Performance and Decision-Making.
  2. R. Wolniak. 2024. Continuous improvement: Leveraging business analytics in Industry 4.0 settings. Scientific Papers of Silesian University of Technology, Organization & Management / Zeszyty Naukowe Politechniki Slaskiej, Seria Organizacji i Zarzadzanie, (203).
  3. K. E. Pearlson, C. S. Saunders, & D. F. Galletta. 2024. Managing and using information systems: A strategic approach. John Wiley & Sons.
  4. V. Buhas, I. Ponomarenko, O. Kazak, & N. Korshun. 2024. AI-Driven Sentiment Analysis in Social Media Content. In Digital Economy Concepts and Technologies Workshop 2024 (Vol. 3665, pp. 12-21). Germany.
  5. E. Verma. An Analytical Study on Sentiment Analysis of the New Education System: A Twitter Mining Approach.
  6. J.O. Al-Mansouri. 2024. The Impact of Real-Time Feedback on Optimizing Teachers’ Classroom Teaching Pace. Research and Advances in Education, 3(11), 45-48.
  7. W. Pearson. 2024. Affective, behavioral, and cognitive engagement with written feedback on second language writing: a systematic methodological review. In Frontiers in Education (Vol. 9, p. 1285954). Frontiers Media SA.
  8. M. Edalati, A. S. Imran, Z. Kastrati, & S. M. Daudpota. 2022. The potential of machine learning algorithms for sentiment classification of students’ feedback on MOOC. In Intelligent Systems and Applications: Proceedings of the 2021 Intelligent Systems Conference (IntelliSys) Volume 3 (pp. 11-22). Springer International Publishing.
  9. T. Shaik, X. Tao, Y. Li, C. Dann, J. McDonald, P. Redmond, & L. Galligan. 2022. A review of the trends and challenges in adopting natural language processing methods for education feedback analysis. IEEE Access, 10, 56720-56739.
  10. H. Sayed Abdelnasser. 2023. Exploring the Application and Effectiveness of Communication Accommodation Theory (CAT) in Community Service: A Case study of the Happiness Makers Team at Deraya University. 29(88),
  11. Y. K. Kuo, S. Batool, T. Tahir, & J. Yu. 2024. Exploring the impact of emotionalized learning experiences on the affective domain: A comprehensive analysis. Heliyon, 10(1).
  12. V. Pandya, D. Monani, D. Aahuja, & U. Chotai. 2024. Traditional vs. modern education: A comparative analysis.
  13. M. J. Basha, S. Vijayakumar, J. Jayashankari, A. H. Alawadi, & P. Durdona. 2023. Advancements in natural language processing for text understanding. In E3S web of conferences (Vol. 399, p. 04031). EDP Sciences.
  14. C. Pandey, R. Pandey. 2024. Exploring the mediating role of professional and constitutional values and the moderating role of sex in the relationship between hidden curriculum and student achievement in teacher education programmes. Journal of Applied Research in Higher Education.
  15. C. Pacol. 2024. Sentiment Analysis of Students’ Feedback on Faculty Online Teaching Performance Using Machine Learning Techniques. WSEAS Trans. Inf. Sci. Appl., 21, 65-76.
  16. C. Dervenis, P. Fitsilis, O. Iatrellis, & A. Koustelios. 2024. Assessing teacher competencies in higher education: A sentiment analysis of student feedback. International Journal of Information and Education Technology, 14(4), 533-541.
  17. M. N. Abdal, M. H. K. Oshie, M. A. Haue, & K. Islam. 2023. A Transformer Based Model for Twitter Sentiment Analysis using RoBERTa. In 2023 26th International Conference on Computer and Information Technology (ICCIT) (pp. 1-6). IEEE.
  18. N. Kersting, B. Sherin, & J. Stigler. 2014. Automated scoring of teachers’ open-ended responses to video prompts: Bringing the classroom-video-analysis assessment to scale. Educational and Psychological Measurement, 74(6), 950-974.
  19. J. Paul. 2024. A Revolutionary Solution for Automating Patent Application Development.
  20. F. Mattio. 2024. Altman Z-Score Indicators (Doctoral dissertation, Politecnico di Torino).
  21. Y. Mahima, & T. Ginige. 2022. Students Behavioral and Emotional Detection Based Satisfaction Monitoring System for E-Learning.
  22. A. Giczy, N. Pairolero, & A. Toole. 2022. Identifying artificial intelligence (AI) invention: A novel AI patent dataset. The Journal of Technology Transfer, 47(2), 476-505.
  23. S. J. Skeen, S. S. Jones, C. M. Cruse, & K.J. Horvath. 2022. Integrating natural language processing and interpretive thematic analyses to gain human-centered design insights on HIV mobile health: proof-of-concept analysis. JMIR Human Factors, 9(3), e37350.
  24. M. J. Salameh, S. S. Jones, C. M. Cruse, & K. J. Horvath. 2024. Enhancing student satisfaction and academic performance through school courtyard design: a quantitative analysis. Architectural Engineering and Design Management, 20(4), 911-927.
  25. M. Radovan, & D. Radovan. 2024. Harmonizing pedagogy and technology: Insights into teaching approaches that foster sustainable motivation and efficiency in blended learning. Sustainability, 16(7), 2704.
  26. N. Makaremi, S. Yildirim, G. T. Morgan, M. F. Touchie, A. Jakubiec, & Robinson. 2024. Impact of classroom environment on student wellbeing in higher education: Review and future directions. Building and Environment, 111958.
