Development of an AI Driven Text Simplification and Analogy Generation Platform Using a Pre-Trained BART Model

Authors

Alimi O. Maruf

Department of Computer Science, Faculty of Computing, Air Force Institute of Technology (Nigeria)

James Richard Henshaw

Department of Computer Science, Faculty of Computing, Air Force Institute of Technology (Nigeria)

Oluwaseyi Ezekiel Olorunshola

Department of Computer Science, Faculty of Computing, Air Force Institute of Technology (Nigeria)

Adeniyi Usman Adedayo

Department of Cyber Security, Faculty of Computing, Air Force Institute of Technology, Kaduna (Nigeria)

Enem A Theophilus

Department of Cyber Security, Faculty of Computing, Air Force Institute of Technology, Kaduna (Nigeria)

Adamu-Fika Fatimah

Department of Cyber Security, Faculty of Computing, Air Force Institute of Technology, Kaduna (Nigeria)

Article Information

DOI: 10.51584/IJRIAS.2025.1010000021

Subject Category: Computer Science

Volume/Issue: 10/10 | Page No: 274-284

Publication Timeline

Submitted: 2025-09-25

Accepted: 2025-09-30

Published: 2025-10-29

Abstract

Understanding complex information can be a challenge for most learners, especially when it is filled with technical terms, abstract ideas, or specialized language. Education, research, and technical communication often suffer when content is too difficult for the intended audience. Simplifying text can help, but simplification alone does not always create the mental connections needed for deeper understanding. This research proposes and develops an AI-driven platform that combines text simplification and analogy generation to make complex information clearer and more relatable. A pre-trained BART model is used to simplify text while preserving meaning, and a Retrieval-Augmented Generation (RAG) process is applied to generate analogies based on user-selected themes such as sports or classrooms. The system is built with Python for the backend and Flutter for the frontend, offering a user-friendly interface for real-time processing. Evaluation using ROUGE and BERTScore confirmed the system’s effectiveness. Summarization achieved a ROUGE-1 score of 0.8315, while text simplification reached a BERTScore F1 of 0.9279, indicating high semantic fidelity. Analogy generation maintained F1 scores above 0.7, demonstrating relevance and conceptual clarity. These results confirm the platform's ability to improve comprehension through high-quality simplification and relatable analogies, making it a practical tool for education and accessible communication across diverse domains.
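The ROUGE-1 figure reported in the abstract measures clipped unigram overlap between system output and a reference text. The following is a minimal pure-Python sketch of that computation for illustration only; it is not the authors' evaluation code, which presumably relies on a standard library implementation such as the `rouge-score` package.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of clipped unigram precision and recall."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # unigram matches, clipped per token
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat",
                      "the cat lay on the mat"), 4))  # → 0.8333
```

A score of 0.8315, as achieved by the summarization component, therefore indicates that roughly five of every six unigrams in the system output also appear in the reference (and vice versa).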

Keywords

Text Simplification, Analogy Generation, BART Model, Retrieval-Augmented Generation, Natural Language Processing, Semantic Preservation
