Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 1G5-OS-21b-04

Transferring World Model to unseen task
*Yuya FUJISAKI, Keigo MINAMIDA, Shohei HIJIKATA, Chika SAWANO, Wataru KUMAGAI, Yutaka MATSUO
Keywords: World Models

Abstract

The world model, a model-based reinforcement learning technique, predicts the environment's transitions resulting from the agent's actions. Using a world model is expected to improve sample efficiency and enable adaptation to unseen tasks. However, the world model is larger than other reinforcement learning models, which raises concerns about prolonged training and about computational constraints when the model is executed. To address these issues, we propose improving the world model's practicality by applying model compression and transfer learning. The objective of this study is to investigate the effect of these approaches on the world model's performance. Based on our results, we draw two conclusions: (1) the proposed method (model compression + transfer learning) has the potential to outperform learning only the target task without any model compression or transfer learning, and (2) the proposed method is robust to changes in hyperparameters.
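The pipeline the abstract describes — train a world model, compress it, then fine-tune it on an unseen task — can be illustrated on a toy problem. The sketch below is not the authors' implementation; the linear dynamics matrices `A_src`/`A_tgt`, the choice of magnitude pruning as the compression step, and the fine-tuning loop are all illustrative assumptions, with a 2x2 linear map standing in for the learned world model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "source" dynamics: next_state = A_src @ state.
A_src = np.array([[0.9, 0.1], [0.0, 0.8]])
# The "unseen" target task has slightly different dynamics.
A_tgt = np.array([[0.85, 0.15], [0.05, 0.8]])

def make_data(A, n=256):
    """Sample (state, next_state) pairs from linear dynamics A."""
    s = rng.normal(size=(n, 2))
    return s, s @ A.T

def train(W, s, s_next, lr=0.1, steps=200, mask=None):
    """Gradient descent on MSE; `mask` keeps pruned weights at zero."""
    for _ in range(steps):
        grad = 2 * (s @ W.T - s_next).T @ s / len(s)
        W = W - lr * grad
        if mask is not None:
            W = W * mask
    return W

# 1) Train the world model on the source task.
s_src, sn_src = make_data(A_src)
W = train(np.zeros((2, 2)), s_src, sn_src)

# 2) Model compression: magnitude pruning (zero the smallest weights).
mask = (np.abs(W) >= np.quantile(np.abs(W), 0.25)).astype(float)
W_pruned = W * mask

# 3) Transfer learning: fine-tune the pruned model on the target task.
s_tgt, sn_tgt = make_data(A_tgt)
W_final = train(W_pruned, s_tgt, sn_tgt, steps=100, mask=mask)

# Prediction error of the compressed, transferred model on the target task.
err = np.mean((s_tgt @ W_final.T - sn_tgt) ** 2)
```

Because fine-tuning starts from the source-task weights rather than from scratch, the pruned model adapts to the target dynamics in few steps while its pruned entries stay zero, which is the practicality gain (smaller model, faster adaptation) the abstract argues for.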

© 2023 The Japanese Society for Artificial Intelligence