This paper proposes a context-aware neural machine translation (NMT) method that uses both the ground-truth and the machine-translated previous sentences on the target side as context. Building on progress in sentence-level NMT, context-aware NMT has developed rapidly to exploit previous sentences as context. Recent work on context-aware NMT incorporates source- or target-side context. Unlike the source-side context, the target-side context introduces a gap between training, which uses the ground-truth previous sentence, and inference, where only a machine-translated previous sentence is available. This gap degrades translation quality because the model is trained only with ground-truth context that cannot be used at inference time. The proposed method makes the translation model robust against the mistakes and biases that arise at inference. We confirmed that the proposed approach improves over previous approaches in English ↔ Japanese and English ↔ German translation tasks.
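As a rough illustration of the training idea described above (not the authors' implementation), the sketch below mixes ground-truth and machine-translated previous target sentences as the target-side context. The `model.translate()` interface, the example dictionary keys, and the mixing probability are all hypothetical assumptions for the sake of a self-contained example.

```python
import random

def make_context_aware_batch(model, examples, p_machine_translated=0.5):
    """Attach a target-side context sentence to each training example.

    With probability `p_machine_translated` the context is the model's own
    translation of the previous source sentence (what is available at
    inference time); otherwise it is the ground-truth previous target
    sentence. `model` is assumed to expose a translate(str) -> str method.
    """
    batch = []
    for ex in examples:
        # ex: dict with the current sentence pair and the previous pair,
        # e.g. {"source": ..., "target": ..., "prev_source": ..., "prev_target": ...}
        if ex["prev_source"] is None:
            # Document-initial sentence: no preceding context is available.
            context = ""
        elif random.random() < p_machine_translated:
            # Simulate inference conditions: use the model's own translation
            # of the previous source sentence as the target-side context.
            context = model.translate(ex["prev_source"])
        else:
            # Standard teacher-forced setting: ground-truth previous target.
            context = ex["prev_target"]
        batch.append({"source": ex["source"],
                      "target": ex["target"],
                      "target_context": context})
    return batch
```

In this sketch the choice between the two context sources is made independently per example, so the model sees both the clean training-time context and the noisier inference-time context during training, which is what makes it robust to the train/inference mismatch the abstract describes.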