Organizer: The Japanese Society for Artificial Intelligence (人工知能学会)
Meeting: 102nd Meeting of the Special Interest Group on Spoken Language Understanding and Dialogue Processing (第102回言語・音声理解と対話処理研究会)
Meeting number: 102
Venue: Auditorium, National Institute for Japanese Language and Linguistics (国立国語研究所 講堂)
Dates: 2024/11/28 - 2024/11/29
pp. 64-69
Open-domain dialogue systems have been increasingly applied in various situations, and there is a growing need to improve user engagement. One effective approach is to generate responses grounded in interesting external knowledge using knowledge-grounded response generation models. However, selecting knowledge solely for its interestingness can yield incoherent responses, which may in turn reduce user engagement. This paper proposes a novel method for generating engaging responses while preserving contextual coherence. Our method leverages a pre-trained knowledge-grounded response generation model and modifies its knowledge selection process to enhance response coherence and interestingness without requiring additional training. First, knowledge candidates with high contextual relevance are retrieved. These candidates are then reranked by interestingness and used to generate responses. Finally, the method detects dialogue breakdowns and regenerates responses as necessary to ensure coherence. We conducted experiments on the Wizard of Wikipedia dataset with two state-of-the-art response generation models. The results suggest that applying the proposed method improves both response coherence and interestingness.
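The abstract outlines a pipeline of retrieval, interestingness-based reranking, generation, and breakdown-triggered regeneration. The following Python sketch illustrates one possible reading of that pipeline under stated assumptions; the helper callables (`retrieve`, `generate`, `is_breakdown`) and parameters such as `top_k` and `max_retries` are hypothetical stand-ins for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class KnowledgeCandidate:
    text: str
    relevance: float        # contextual relevance score assigned by the retriever
    interestingness: float  # score assigned by a separate interestingness model


def generate_engaging_response(
    context: List[str],
    retrieve: Callable[[List[str], int], List[KnowledgeCandidate]],
    generate: Callable[[List[str], str], str],
    is_breakdown: Callable[[List[str], str], bool],
    top_k: int = 10,
    max_retries: int = 3,
) -> str:
    """Sketch of the described pipeline: retrieve contextually relevant knowledge,
    rerank it by interestingness, generate a response, and regenerate when a
    dialogue breakdown is detected."""
    # 1) Retrieve knowledge candidates with high contextual relevance.
    candidates = retrieve(context, top_k)

    # 2) Rerank the relevant candidates by interestingness (most interesting first).
    reranked = sorted(candidates, key=lambda c: c.interestingness, reverse=True)

    # 3) Generate with the most interesting candidate; 4) if a breakdown is
    #    detected, fall back to the next candidate and regenerate.
    response = ""
    for candidate in reranked[:max_retries]:
        response = generate(context, candidate.text)
        if not is_breakdown(context, response):
            return response
    return response  # last attempt, returned even if every candidate triggered a breakdown
```

Because the pre-trained generator and breakdown detector are used as-is, this kind of wrapper needs no additional training; only the order in which knowledge candidates are fed to the generator and the regeneration loop differ from standard knowledge-grounded generation.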