Demystifying Text Generation approaches

Authors

  • Upadhyay L Department of Computer Engineering, BVM Engineering College, Gujarat, India
  • Hasan MI Department of Computer Engineering, BVM Engineering College, Gujarat, India
  • Patel PS Department of Computer Engineering, BVM Engineering College, Gujarat, India

DOI:

https://doi.org/10.26438/ijcse/v7i2.788791

Keywords:

Natural Language Processing, HMM, RNN, ANN, LSTM

Abstract

Natural Language Processing (NLP) is a subfield of Artificial Intelligence focused on enabling computers to understand and process human languages, bringing them closer to a human-level understanding of language. The main emphasis in the task of text generation is to produce text that is semantically and syntactically sound, coherent, and meaningful. At a high level, the prevailing technique has been to train end-to-end neural network models consisting of an encoder that produces a hidden representation of the text, followed by a decoder that generates the target. The various techniques, models, and algorithms used for text generation are discussed in the following subsections. In the field of text generation, researchers' main focus is on the Hidden Markov Model (HMM) and Long Short-Term Memory (LSTM) units, which are used to generate sequential text. This paper also discusses the limitations of the Hidden Markov Model as well as the richness of Long Short-Term Memory units.
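The Markov-style sequential generation the abstract refers to can be illustrated with a minimal sketch. This is a simplified, visible-state cousin of the HMM approach the paper surveys (no hidden states or emission probabilities): it learns which words follow which in a corpus and samples a chain of successors. All names here (`build_bigram_model`, `generate`, the toy corpus) are illustrative, not from the paper.

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    model = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        model[cur].append(nxt)
    return model

def generate(model, start, length=10, seed=0):
    """Walk the chain: repeatedly sample a successor of the current word.

    Stops early if the current word was never followed by anything.
    """
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran"
model = build_bigram_model(corpus)
print(generate(model, "the", length=5))
```

Every adjacent word pair in the output is, by construction, a bigram observed in the corpus, which is why such chains are locally plausible but lack the long-range coherence that motivates the LSTM-based approaches the paper contrasts them with.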

References

[1] A. Graves, "Generating Sequences with Recurrent Neural Networks," Computing Research Repository (CoRR), arXiv, 2014.

[2] Z. C. Lipton, J. Berkowitz, and C. Elkan, "A Critical Review of Recurrent Neural Networks for Sequence Learning," Computing Research Repository (CoRR), arXiv, 2015.

[3] B. Hu, Z. Lu, H. Li, and Q. Chen, "Convolutional Neural Network Architectures for Matching Natural Language Sentences," Neural Information Processing Systems Foundation, 2014.

[4] R. Manurung, G. Ritchie, and H. Thompson, "Using genetic algorithms to create meaningful poetic text," Journal of Experimental & Theoretical Artificial Intelligence, vol. 24, pp. 43-64, 2013.

[5] E. Malmi, P. Takala, H. Toivonen, T. Raiko, and A. Gionis, "DopeLearning: A Computational Approach to Rap Lyrics Generation," Knowledge Discovery and Data Mining (KDD), Association for Computing Machinery, pp. 13-17, 2016.

[6] J. Wei, Q. Zhou, and Y. Cai, "Poet-based Poetry Generation: Controlling Personal Style with Recurrent Neural Networks," 2018 Workshop on Computing, Networking and Communications (CNC), 2018.

[7] N. Tosa, H. Obara, and M. Minoh, "Hitch Haiku: An Interactive Supporting System for Composing Haiku Poem," International Federation for Information Processing, pp. 209-216, 2008.

[8] Christopher Olah's blog.

Published

2019-02-28

How to Cite

[1]
L. Upadhyay, M. Hasan, and P. Patel, “Demystifying Text Generation approaches”, Int. J. Comp. Sci. Eng., vol. 7, no. 2, pp. 788–791, Feb. 2019.

Section

Research Article