Transactions of the Institute of Systems, Control and Information Engineers
Online ISSN : 2185-811X
Print ISSN : 1342-5668
ISSN-L : 1342-5668
Simplification of RNN and Its Performance Evaluation in Machine Translation
Tomohiro Fujita, Zhiwei Luo, Changqin Quan, Kohei Mori
2020 Volume 33 Issue 10 Pages 267-274

Abstract

In this paper, we study the simplification of RNNs and propose new structures that enable faster learning and improved performance while reducing the number of learning parameters. We construct four types of RNNs with new gated structures, which we call SGR (Simple Gated RNN). An SGR has one or two gates, with or without a weight for the input. Comparison studies are performed to verify the effectiveness of our proposal. In machine translation experiments on a relatively small corpus, the proposed SGR achieves higher scores than LSTM and GRU. Furthermore, SGR learns approximately 1.7 times faster than GRU. However, as the number of learning layers and input weights increases, the learning scores of SGR do not improve as much as expected, which should be studied in future work. A more detailed analysis is needed of performance on larger datasets and of the performance differences due to multi-layering, the input weight, and the number of gates.
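The abstract does not give the SGR equations themselves, but the "one gate, no input weight" idea can be illustrated with a hypothetical cell that keeps only a single update gate (GRU keeps two, LSTM three) and feeds the raw input into the candidate state without a weight matrix. The class name, equations, and initialization below are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimpleGatedCell:
    """Hypothetical one-gate RNN cell with no weight on the input to
    the candidate state. A sketch of the 'one gate, weightless input'
    variant described in the abstract; the paper's actual SGR
    equations are not given there."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Single update gate z_t; this is the only gate in the cell.
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bz = np.zeros(hidden_size)
        # Candidate state uses the raw input (no W_h matrix), so
        # input_size must equal hidden_size in this sketch.
        assert input_size == hidden_size
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x_t, h_prev):
        """One time step: blend the previous state with a candidate."""
        z = sigmoid(self.Wz @ x_t + self.Uz @ h_prev + self.bz)
        h_tilde = np.tanh(x_t + self.Uh @ h_prev)  # no input weight here
        return (1.0 - z) * h_prev + z * h_tilde
```

Dropping the reset gate and the candidate's input weight removes two of a GRU's six weight matrices, which is one plausible way the parameter count and training time could shrink while the gating mechanism is retained.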

© 2020 The Institute of Systems, Control and Information Engineers