Host: The Japanese Society for Artificial Intelligence
Name : The 102nd SIG-SLUD
Number : 102
Location : [in Japanese]
Date : November 28, 2024 - November 29, 2024
Pages : 64-69
Open-domain dialogue systems are being applied in an increasing variety of situations, and there is a growing need to improve user engagement. One effective approach is to generate responses grounded in interesting external knowledge using knowledge-grounded response generation models. However, relying solely on interestingness can produce incoherent responses, which may in turn reduce user engagement. This paper proposes a novel method for generating engaging responses while preserving contextual coherence. Our method leverages a pre-trained knowledge-grounded response generation model and modifies its knowledge selection process to enhance both coherence and interestingness without requiring additional training. First, knowledge candidates with high contextual relevance are retrieved. These candidates are then reranked by interestingness and used to generate responses. Finally, the method detects dialogue breakdowns and regenerates responses as necessary to ensure coherence. We conducted experiments using the Wizard of Wikipedia dataset and two state-of-the-art response generation models. The results suggest that applying the proposed method improves both response coherence and interestingness.
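To illustrate the knowledge-selection pipeline outlined in the abstract (relevance-based retrieval, interestingness reranking, generation, and breakdown-triggered regeneration), the following is a minimal Python sketch. It is not the authors' implementation: the function name `generate_engaging_response`, the callable parameters (`retrieve`, `interestingness`, `generate`, `is_breakdown`), and the fallback strategy of regenerating with the next-ranked knowledge candidate are all assumptions introduced here for illustration.

```python
from typing import Callable, Sequence


def generate_engaging_response(
    context: str,
    knowledge_pool: Sequence[str],
    retrieve: Callable[[str, Sequence[str], int], list],   # hypothetical: relevance-based retriever
    interestingness: Callable[[str], float],                # hypothetical: scores a knowledge snippet
    generate: Callable[[str, str], str],                    # hypothetical: (context, knowledge) -> response
    is_breakdown: Callable[[str, str], bool],               # hypothetical: (context, response) -> breakdown?
    top_k: int = 10,
    max_retries: int = 3,
) -> str:
    """Sketch of the pipeline described in the abstract.

    1. Retrieve knowledge candidates with high contextual relevance.
    2. Rerank them by interestingness.
    3. Generate a response grounded in the top candidate.
    4. If a dialogue breakdown is detected, regenerate with the next candidate.
    """
    # Step 1: retrieve contextually relevant knowledge candidates.
    candidates = retrieve(context, knowledge_pool, top_k)

    # Step 2: rerank candidates by interestingness (highest first).
    reranked = sorted(candidates, key=interestingness, reverse=True)

    # Steps 3-4: generate, then fall back to the next candidate on breakdown
    # (one possible reading of "regenerates responses as necessary").
    response = ""
    for knowledge in reranked[: max_retries + 1]:
        response = generate(context, knowledge)
        if not is_breakdown(context, response):
            return response

    # If every attempt was judged a breakdown, return the last response as a fallback.
    return response
```

Because the pipeline only reorders and filters knowledge around a pre-trained generator, no additional training is required, which matches the training-free claim in the abstract.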