Mixture of Experts (MoE) Based Top Performer Segmentation with Multilingual Chatbot Integration

Authors

Dr. Leena S More (Deshmukh)

HoD, JSPM JIMS (India)

Dr. Binod Kumar

Dean, JSPM RSCOE (India)

Article Information

DOI: 10.51244/IJRSI.2026.1304000149

Subject Category: Computer Science

Volume/Issue: 13/4 | Page No: 1713-1724

Publication Timeline

Submitted: 2026-04-17

Accepted: 2026-04-22

Published: 2026-05-08

Abstract

This paper presents a Mixture of Experts (MoE) architecture for workforce segmentation. The proposed framework combines multiple machine learning models—Support Vector Machine (SVM), Random Forest (RF), XGBoost, and Artificial Neural Network (ANN)—using a softmax-based gating network that dynamically assigns weights to each expert's predictions. The system is evaluated on large-scale HR datasets together with real-time chatbot-generated appraisal data. Experimental results demonstrate superior performance, with accuracy above 92.1%, high cluster separability (Silhouette Score = 0.95), and significant improvements in HR efficiency, participation, and fairness. The framework supports inclusive, data-driven talent management in industrial environments.
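The softmax-based gating described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the expert probability values, gating logits, and function names are assumptions introduced for the example.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(expert_probs, gate_logits):
    """Combine expert class probabilities with softmax gating weights.

    expert_probs: shape (n_experts, n_classes) -- one probability vector
                  per expert (e.g. SVM, RF, XGBoost, ANN).
    gate_logits:  shape (n_experts,) -- raw gating scores for the current
                  input, which a learned gating network would produce.
    """
    weights = softmax(gate_logits)      # dynamic per-expert weights
    combined = weights @ expert_probs   # weighted mixture of predictions
    return combined, weights

# Illustrative values: four experts, binary top-performer/other decision.
probs = np.array([[0.80, 0.20],   # SVM
                  [0.70, 0.30],   # Random Forest
                  [0.90, 0.10],   # XGBoost
                  [0.60, 0.40]])  # ANN
logits = np.array([1.0, 0.5, 2.0, 0.2])
mixture, w = moe_predict(probs, logits)
print("mixture:", mixture, "weights:", w)
```

Because the gating weights are a softmax, they are non-negative and sum to one, so the combined output remains a valid probability distribution over the classes.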

Keywords

Mixture of Experts, Workforce Segmentation

