2020 Volume 27 Issue 4 Pages 781-800
We present a multi-dialect neural machine translation (NMT) model tailored to Japanese. Although the surface forms of Japanese dialects differ from those of standard Japanese, most dialects share fundamental properties, such as word order, and many also share numerous phonetic correspondence rules. To take advantage of these properties, we integrate multilingual, syllable-level, and fixed-order translation techniques into a general NMT model. Our experimental results demonstrate that this model outperforms a baseline dialect translation model. In addition, we show that visualizing the dialect embeddings learned by the model can facilitate geographical and typological analysis of the dialects.
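A common way to realize the multilingual technique mentioned above is to prepend a target-dialect token to the source sequence, as in Johnson et al.'s multilingual NMT. The sketch below illustrates this idea on a syllable-level input; the tag names and syllable segmentation are hypothetical, not the paper's exact formulation.

```python
def prepare_input(syllables, dialect_tag):
    """Prepend a target-dialect token to a syllable-tokenized source
    sequence, following the multilingual NMT tagging convention.
    The tag vocabulary (e.g. "<2osaka>") is a hypothetical example."""
    return [dialect_tag] + list(syllables)

# Hypothetical syllable segmentation of standard-Japanese "arigatou"
src = ["a", "ri", "ga", "to", "u"]
tagged = prepare_input(src, "<2osaka>")
print(tagged)  # the encoder then consumes the tagged sequence
```

A single shared encoder-decoder can then translate into any dialect seen in training, and the learned embedding of each dialect tag is what the authors visualize for geographical and typological analysis.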