
International Journal of Research and Scientific Innovation (IJRSI) | Volume VI, Issue V, May 2019 | ISSN 2321–2705

Text Generation Using Recurrent Neural Networks

Zishen Thajudheen, Amit N Subrahmanya, Aditya Singh, Akshit Jhamb, Vinay Hegde


 Computer Science and Engineering, R.V College of Engineering, Bangalore, India

Abstract—Language modelling is the core problem underlying a number of natural language processing tasks, including text generation. In this project, we create a character-level language model for generating natural language text by implementing and training a state-of-the-art Recurrent Neural Network. No text generation model has previously been designed for Kannada; with the help of transliteration, it is possible to build one. We build this model and analyse its strengths and weaknesses. In this paper we aim to demonstrate the power of large RNNs by applying them to the task of predicting the next character in a stream of text. This is an important problem because a better character-level language model could, among other applications, improve compression of text files. We evaluate the syntactic and semantic sense of the generated phrases.

Keywords—Text generation, transliteration, character prediction, machine learning, Recurrent Neural Network (RNN).

I. INTRODUCTION

With the latest advancements in the field of deep learning, many tasks in Natural Language Processing are becoming easier to solve. One such task is text generation. Text generation is effectively a language modelling problem, and language modelling is a fundamental problem underlying Natural Language Processing tasks such as text summarization.
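Concretely, a language model assigns a probability to a character sequence by the chain rule, and a recurrent network approximates each conditional using its hidden state. The notation below is ours, added for illustration; it is the standard RNN language-model form rather than an equation from the original text:

P(x_1, \dots, x_T) = \prod_{t=1}^{T} P(x_t \mid x_1, \dots, x_{t-1}), \quad
P(x_t \mid x_{<t}) \approx \operatorname{softmax}(W h_t), \quad h_t = f(h_{t-1}, x_{t-1})

Here h_t is the hidden state carried forward by the recurrence f, so the network can condition each prediction on the entire preceding character stream.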
Text is a stream of characters lined one after the other. Generating text is particularly difficult because the model must be trained at the character level and must be very accurate: even a single erroneous character in the stream can render the whole sentence meaningless. In this work, text generation using a recurrent neural network is applied to novels, sonnets, and poems written by individual authors.
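As a concrete illustration, the sketch below shows how such a character-level recurrent model could be set up. It assumes a Keras/TensorFlow environment; the corpus file name, window length, and layer sizes are illustrative placeholders rather than the configuration used in this work.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Hypothetical training corpus; any plain-text file works.
text = open("corpus.txt", encoding="utf-8").read()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len = 40  # length of each input window of characters

# Slice the corpus into fixed-length windows, each predicting the
# single character that immediately follows the window.
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, char_to_idx[c]] = 1.0           # one-hot input characters
    y[i, char_to_idx[text[i + seq_len]]] = 1.0  # one-hot target character

model = Sequential([
    Input(shape=(seq_len, len(chars))),
    LSTM(128),                                  # recurrent layer over the window
    Dense(len(chars), activation="softmax"),    # distribution over next character
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=10)

At generation time, the trained model is seeded with a short string; the next character is sampled from the softmax output, appended to the seed, and the window slides forward one step, repeating until text of the desired length has been produced.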
The main motivation behind developing this model is to preserve the writing style of the author, which cannot be replicated easily. People generally write with emotion and are motivated by personal experience, and it is very hard to replicate this with a computer. This model does not summarize the text it is trained on; rather, it writes a whole new chapter or poem in the writing style of the author.




