Host: The Japanese Society for Artificial Intelligence
Name: 34th Annual Conference, 2020
Number: 34
Location: Online
Date: June 09, 2020 - June 12, 2020
Explainable AI attracts attention from many machine learning researchers because explaining why an AI system produces a given output is important for its acceptance in human society. However, acceptable decision-making and explanation by autonomous agents such as robots have not yet been studied sufficiently. To generate human-friendly explanations, it is crucial to provide the minimum information necessary for human understanding. To tackle this issue, we propose a method for generating human-acceptable explanations for a decision-making agent trained with reinforcement learning. The proposed method employs a predictive world model to identify critical situations for practical explanations. We verify the proposed method in a simple MDP setting that requires visiting multiple states to obtain rewards. Experimental results show that our method can generate minimally necessary explanations in response to human questions in a grid-maze world.
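As a rough illustration of the setting described in the abstract, the following Python sketch models a grid maze in which the agent must visit an intermediate state (a key) before the goal yields a reward, and uses a deterministic predictive world model to flag critical situations, here taken to be states where the agent's plan succeeds in the predicted rollout while the goal-seeking behaviour a human might expect fails. This is a minimal sketch under our own assumptions, not the authors' implementation; all names (step, policy, greedy_to_goal, is_critical) and the specific criticality criterion are illustrative.

```python
# Toy maze: the agent starts at START, must pick up a key at KEY, then reach GOAL.
# A state is (row, col, has_key); the reward is obtained only at GOAL with the key.
SIZE = 4
START, KEY, GOAL = (0, 0), (0, 3), (3, 3)
ACTIONS = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}


def step(state, action):
    """Deterministic world model: predicts the next state for (state, action)."""
    r, c, has_key = state
    dr, dc = ACTIONS[action]
    r = min(max(r + dr, 0), SIZE - 1)
    c = min(max(c + dc, 0), SIZE - 1)
    return (r, c, has_key or (r, c) == KEY)


def reaches_goal(state, pi, horizon=20):
    """Roll the world model forward under policy pi; success = goal reached with the key."""
    for _ in range(horizon):
        if (state[0], state[1]) == GOAL and state[2]:
            return True
        state = step(state, pi(state))
    return (state[0], state[1]) == GOAL and state[2]


def policy(state):
    """Hand-coded stand-in for a learned RL policy: go to the key first, then the goal."""
    r, c, has_key = state
    target = GOAL if has_key else KEY
    if r != target[0]:
        return "down" if target[0] > r else "up"
    return "right" if target[1] > c else "left"


def greedy_to_goal(state):
    """Baseline a human observer might expect: head straight for the goal, ignore the key."""
    r, c, _ = state
    if r != GOAL[0]:
        return "down" if GOAL[0] > r else "up"
    return "right" if GOAL[1] > c else "left"


def is_critical(state):
    """Critical situation (illustrative definition): the agent's plan succeeds in the
    predicted rollout, but the obvious goal-seeking behaviour does not."""
    return reaches_goal(state, policy) and not reaches_goal(state, greedy_to_goal)


if __name__ == "__main__":
    state = (*START, False)
    for t in range(20):
        if is_critical(state):
            print(f"t={t}, state={state}: critical -> explain the key subgoal")
        if (state[0], state[1]) == GOAL and state[2]:
            break
        state = step(state, policy(state))
```

Under this criterion, every state visited before the key is picked up is flagged as critical, which corresponds to the points where an explanation such as "the key must be collected before the goal gives a reward" would be the minimum information a human needs.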