Indian Language Text Summarization Using Recurrent Neural Networks
Keywords: Abstractive Summarization, LSTM, Recurrent Neural Network (RNN)

Abstract
Text summarization is the process of creating a short and accurate summary of a given text document. This paper proposes an abstractive method of text summarization for text in Indian languages. The proposed algorithm uses an encoder-decoder recurrent neural network with an attention mechanism. The results observed were significantly better than the performance of existing Indian language summarizers.
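The abstract names an encoder-decoder RNN with an attention mechanism. As an illustrative sketch only (not the authors' implementation, whose details are not given here), the core attention step can be shown in a few lines: score each encoder hidden state against the current decoder state, softmax the scores, and take the weighted sum as the context vector fed to the decoder.

```python
import numpy as np

def attention_context(enc_states, dec_state):
    """Dot-product attention over encoder time steps.

    enc_states: (T, H) array of encoder hidden states.
    dec_state:  (H,) current decoder hidden state.
    Returns the context vector (H,) and attention weights (T,).
    """
    scores = enc_states @ dec_state          # one score per time step, shape (T,)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ enc_states           # weighted sum of encoder states
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3 (made-up values)
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0])
ctx, w = attention_context(enc, dec)
```

At each decoding step the weights shift to the source positions most relevant to the word being generated, which is what lets the summarizer condense long inputs without a fixed-length bottleneck.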
