Simplification with the Transformer - Its Drawbacks
DOI: https://doi.org/10.26438/ijcse/v8i6.15
Keywords: Artificial Intelligence, Natural Language Processing, Neural Networks, Text Simplification, Transformer
Abstract
Natural Language Processing is an active and emerging field of research in computer science. Within it lies the subfield of text simplification, which aims to teach the computer to perform efficiently what has so far been the primarily manual task of simplifying text. While handcrafted systems using syntactic techniques were the first simplification systems, Recurrent Neural Networks and Long Short-Term Memory networks employed in seq2seq models with attention were considered state-of-the-art until very recently, when the transformer architecture did away with the computational problems that plagued them. This paper presents our work on simplification using the transformer architecture, carried out while building an end-to-end simplification system for linguistically complex reference books written in English, and our findings on the drawbacks/limitations of the transformer encountered along the way. We call these drawbacks the Fact Illusion Induction, the Named Entity Problem and the Deep Network Problem, and we theorize about their possible causes.
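For context, the following is a minimal sketch of how sentence simplification can be framed as sequence-to-sequence generation with a vanilla encoder-decoder transformer in the spirit of "Attention Is All You Need". It is an illustrative assumption, not the system described in this paper: the vocabulary size, hyperparameters, class names and the random toy batch are all placeholders.

# A minimal sketch, assuming PyTorch: simplification as seq2seq generation
# with a vanilla encoder-decoder transformer. All names and settings below
# are illustrative placeholders, not the paper's implementation.
import torch
import torch.nn as nn

VOCAB_SIZE = 10000  # assumed shared complex/simple vocabulary
PAD_IDX = 0

class SimplificationTransformer(nn.Module):
    def __init__(self, d_model=512, nhead=8, num_layers=6, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model, padding_idx=PAD_IDX)
        self.pos = nn.Embedding(max_len, d_model)  # learned positions, for brevity
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, src, tgt):
        # src: complex-sentence token ids; tgt: simplified-sentence ids
        # (decoder input, i.e. the reference shifted right during training).
        src_emb = self.embed(src) + self.pos(torch.arange(src.size(1), device=src.device))
        tgt_emb = self.embed(tgt) + self.pos(torch.arange(tgt.size(1), device=tgt.device))
        causal = self.transformer.generate_square_subsequent_mask(tgt.size(1)).to(tgt.device)
        hidden = self.transformer(
            src_emb, tgt_emb, tgt_mask=causal,
            src_key_padding_mask=(src == PAD_IDX),
            tgt_key_padding_mask=(tgt == PAD_IDX))
        return self.out(hidden)  # per-position logits over the vocabulary

# Toy usage: two random "complex" inputs and "simple" references,
# trained with teacher forcing (predict token t+1 from tokens <= t).
model = SimplificationTransformer()
src = torch.randint(1, VOCAB_SIZE, (2, 20))
tgt = torch.randint(1, VOCAB_SIZE, (2, 16))
logits = model(src, tgt[:, :-1])
loss = nn.CrossEntropyLoss(ignore_index=PAD_IDX)(
    logits.reshape(-1, VOCAB_SIZE), tgt[:, 1:].reshape(-1))

At inference time the decoder would be run autoregressively (e.g. greedy or beam search) from a start-of-sentence token; it is in such generated output that drawbacks like those named in the abstract would surface.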