Indian Language Text Summarization Using Recurrent Neural Networks

Authors

  • Anjali A V, Scholar, Engineering, NIT, Trichy, India
  • Ramasubramanian N, Dept. of Computer Science, NIT, Trichy, India
  • Santhanavijayan A, Dept. of Computer Science, NIT, Trichy, India

Keywords

Abstractive Summarization, LSTM, Recurrent Neural Network (RNN)

Abstract

Text summarization is the process of creating a short and accurate summary of a given text document. This paper proposes an abstractive method of text summarization for text in Indian languages. The proposed algorithm uses an encoder-decoder recurrent neural network with an attention mechanism. The observed results were significantly better than the performance of existing Indian language summarizers.
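Since the abstract only names the architecture, the following is a minimal sketch of an encoder-decoder LSTM with attention for abstractive summarization. It assumes a PyTorch implementation with Luong-style dot-product attention; the vocabulary size, layer dimensions, and decoding loop are hypothetical, as the paper does not publish its implementation.

```python
# Hypothetical encoder-decoder LSTM with attention (sketch only).
# Layer sizes, vocabulary size, and the choice of PyTorch are assumptions,
# not the authors' published code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids of the source document
        embedded = self.embedding(src)
        outputs, hidden = self.lstm(embedded)   # outputs: (batch, src_len, hid)
        return outputs, hidden


class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def forward(self, prev_token, hidden, enc_outputs):
        # prev_token: (batch, 1) previously generated summary token
        embedded = self.embedding(prev_token)
        dec_out, hidden = self.lstm(embedded, hidden)              # (batch, 1, hid)
        # Dot-product attention over encoder states
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))   # (batch, 1, src_len)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                  # (batch, 1, hid)
        logits = self.out(torch.cat([dec_out, context], dim=-1))   # (batch, 1, vocab)
        return logits, hidden


if __name__ == "__main__":
    vocab_size = 5000                                 # assumed vocabulary size
    encoder, decoder = Encoder(vocab_size), AttnDecoder(vocab_size)
    src = torch.randint(0, vocab_size, (2, 40))       # two dummy source documents
    enc_outputs, hidden = encoder(src)
    token = torch.zeros(2, 1, dtype=torch.long)       # assumed start-of-summary id
    for _ in range(10):                               # greedy decoding of 10 tokens
        logits, hidden = decoder(token, hidden, enc_outputs)
        token = logits.argmax(dim=-1)
```

In such a setup the encoder would read the source document once, and the decoder would generate the summary token by token, re-weighting the encoder states at each step through the attention distribution.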



Published

2025-11-18

How to Cite

[1] A. Anjali, N. Ramasubramanian, and A. Santhanavijayan, “Indian Language Text Summarization Using Recurrent Neural Networks”, Int. J. Comp. Sci. Eng., vol. 6, no. 11, pp. 162–164, Nov. 2025.