Host: The Japanese Society for Artificial Intelligence
Name: The 38th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 38
Location: [in Japanese]
Date: May 28, 2024 - May 31, 2024
For autonomous robots to collaborate effectively with humans, it is crucial that they explain the rationale behind their decisions in a way that builds human trust. A key element of tailoring such explanations is identifying, for a given situation, what needs to be explained based on the other party's belief state. We therefore propose an approach termed 'active estimation of others' beliefs,' grounded in the definition of eXplainable Autonomous Robots (XAR). In this paper, we validate the efficiency and accuracy of the proposed active estimation method through experiments in a grid environment, comparing it against a random question-generation baseline and other methods. The results indicate that others' beliefs can be estimated efficiently.
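As a rough illustration of the general idea (a hypothetical sketch, not the authors' implementation), active belief estimation can be framed as repeatedly asking the question whose answer is expected to shrink uncertainty over candidate beliefs the most, in contrast to picking questions at random. The grid, question set, and answer model below are all invented for illustration:

```python
import math

def entropy(p):
    """Shannon entropy of a probability distribution (dict belief -> prob)."""
    return -sum(x * math.log2(x) for x in p.values() if x > 0)

def expected_info_gain(prior, question, answer_model):
    """Expected entropy reduction from asking `question`.

    answer_model(belief, question) is the answer the other agent would
    give if it held `belief` (assumed deterministic for simplicity).
    """
    h0 = entropy(prior)
    # Group candidate beliefs by the answer they would produce.
    by_answer = {}
    for belief, p in prior.items():
        a = answer_model(belief, question)
        by_answer.setdefault(a, {})[belief] = p
    expected_h = 0.0
    for group in by_answer.values():
        pa = sum(group.values())
        posterior = {b: p / pa for b, p in group.items()}
        expected_h += pa * entropy(posterior)
    return h0 - expected_h

def active_estimate(beliefs, questions, answer_model, true_belief, max_steps=10):
    """Greedily ask the most informative question and update the posterior."""
    prior = {b: 1.0 / len(beliefs) for b in beliefs}
    for _ in range(max_steps):
        if max(prior.values()) > 0.99:
            break  # belief identified with high confidence
        q = max(questions, key=lambda q: expected_info_gain(prior, q, answer_model))
        ans = answer_model(true_belief, q)  # observed answer
        consistent = {b: p for b, p in prior.items()
                      if answer_model(b, q) == ans}
        z = sum(consistent.values())
        prior = {b: p / z for b, p in consistent.items()}
    return prior

# Toy grid example: the other's belief is which cell contains a goal,
# and questions are yes/no queries about the cell's row or column.
cells = [(x, y) for x in range(3) for y in range(3)]
questions = [("row", r) for r in range(3)] + [("col", c) for c in range(3)]
answer = lambda belief, q: (belief[0] if q[0] == "row" else belief[1]) == q[1]
posterior = active_estimate(cells, questions, answer, true_belief=(2, 1))
# posterior concentrates on (2, 1) after a handful of informative questions
```

A random baseline would sample `q` uniformly from `questions` instead of maximizing information gain, which on average needs more questions to reach the same confidence.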