自然言語処理 (Journal of Natural Language Processing)
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (一般論文)
Anna: A Dapper Open-Domain Dialogue Agent Based on a Joint Attention Network
Itsugun Cho, Hiroaki Saito

2021, Volume 28, Issue 4, pp. 1184-1209

Abstract

We constructed a high-quality open-domain dialogue generation model called Anna that is composed of a hierarchical self-attention network with multiple convolution filters and a topic-augmented network. During daily conversations, humans typically respond by understanding a dialogue history and assembling their knowledge regarding the topic. However, existing dialogue generation models are weak at capturing the dependencies among words or utterances, resulting in an insufficient understanding of context and the generation of irrelevant responses. Previous works have largely ignored topic information modeling in multi-turn dialogue, making responses overly generic. Although pre-training using large-scale transformer models has recently resulted in enhanced performance, large parameter sizes complicate such models. Anna effectively captures contextual dependencies and assigns greater weight to important words and utterances to compute context representations. We incorporate topic information into our model as prior knowledge to synthesize topic representations. These two types of representations jointly determine the probability distributions of responses, which effectively simulates how people behave in real conversations. Empirical studies on both Chinese and English corpora demonstrate that Anna outperforms baseline models in terms of response quality, parameter size, and decoding speed.
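The abstract states that a context representation (from the hierarchical self-attention network over the dialogue history) and a topic representation (from topic information used as prior knowledge) jointly determine the response distribution. The following minimal Python/PyTorch sketch illustrates one way such a fusion step could be expressed; the class name, layer choices, and dimensions are assumptions made for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointRepresentationFusion(nn.Module):
    """Illustrative sketch (assumed, not the paper's code): combine a context
    representation and a topic representation into a next-token distribution."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.context_proj = nn.Linear(hidden_size, hidden_size)
        self.topic_proj = nn.Linear(hidden_size, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, context_repr: torch.Tensor, topic_repr: torch.Tensor) -> torch.Tensor:
        # Both representations jointly shape the response probability distribution.
        fused = torch.tanh(self.context_proj(context_repr) + self.topic_proj(topic_repr))
        return F.log_softmax(self.out(fused), dim=-1)

# Usage with dummy tensors: batch of 2, hidden size 256, vocabulary of 10,000 tokens.
fusion = JointRepresentationFusion(hidden_size=256, vocab_size=10_000)
log_probs = fusion(torch.randn(2, 256), torch.randn(2, 256))  # shape: (2, 10000)
```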

© 2021 The Association for Natural Language Processing