Proceedings of the Annual Conference of JSAI
Online ISSN: 2758-7347
39th Annual Conference (2025)
Session ID: 1B4-OS-41b-02

Learning Hierarchical State Space Models via Surprise- and Uncertainty-based Chunking
*Tomoshi IIYAMA, Masahiro SUZUKI, Yutaka MATSUO
Abstract

Complex real-world tasks are often long-horizon, so world models that can predict accurately far into the future are crucial for AI agents. Hierarchical state-space models, which introduce a temporal hierarchy over latent states, have shown promise for long-term prediction by segmenting time series into subsequences and learning temporal abstractions. However, existing methods either rely on rigid assumptions about subsequence length or segment at points of large observation change, and therefore perform poorly in environments where the optimal subsequence length varies or where the environment changes only gradually. This study proposes a method for learning hierarchical state-space models based on the discovery of frequently occurring, highly reusable patterns, drawing on chunking mechanisms from cognitive science. Our method extracts frequent patterns by exploiting changes in surprise and uncertainty in the low-level latent states. Leveraging these patterns to learn high-level latent states reduces the complexity of the transitions, enabling efficient long-term prediction. Experiments on video prediction tasks show that our method outperforms the baselines, underscoring the effectiveness of hierarchical structures derived from frequent patterns for long-term prediction.
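The abstract does not specify the exact segmentation rule, only that chunk boundaries come from changes in surprise and uncertainty in the low-level latent states. As one possible reading, the sketch below marks a boundary wherever either signal changes abruptly; the function names, the z-score spike criterion, and the thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def chunk_boundaries(surprise, uncertainty, z_thresh=2.0, min_len=3):
    """Hypothetical chunking rule: start a new chunk where per-step
    surprise (e.g. negative log-likelihood of the next observation under
    the low-level model) or uncertainty (e.g. predictive entropy of the
    low-level latent state) changes sharply."""
    def spikes(signal):
        # Flag steps whose change from the previous step is an outlier.
        diff = np.abs(np.diff(signal))
        z = (diff - diff.mean()) / (diff.std() + 1e-8)
        return set(np.where(z > z_thresh)[0] + 1)

    candidates = sorted(spikes(surprise) | spikes(uncertainty))
    boundaries, last = [], 0
    for t in candidates:
        if t - last >= min_len:   # enforce a minimum chunk length
            boundaries.append(t)
            last = t
    return boundaries             # time steps where new chunks begin

# Toy usage with random scores standing in for model outputs.
rng = np.random.default_rng(0)
surprise = rng.normal(size=100)
uncertainty = rng.normal(size=100)
print(chunk_boundaries(surprise, uncertainty))
```

In a full system, the resulting chunks would presumably be the subsequences over which the high-level latent state transitions, so that frequent, reusable chunks keep the high-level transition model simple.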

© 2025 The Japanese Society for Artificial Intelligence