Abstract
It remains unclear how representations of the body and of space are computed from the various kinds of sensory information that accompany motion. In the human brain, higher cortical layers are thought to represent wider spatial areas and longer temporal periods.
This paper proposes a hierarchical model that integrates the multimodal information an agent acquires while interacting with the world. Each layer extracts slowly changing features from its input signals. A simulation experiment shows that multimodal information related to self-movement is transformed into lower-dimensional representations that change slowly.
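The per-layer extraction of slowly changing features resembles linear Slow Feature Analysis. As an illustration only (the paper's actual model is not specified here, and the function and signal names below are hypothetical), a minimal sketch of recovering a slow component from a fast-mixed multimodal signal:

```python
import numpy as np

def slow_features(x, n_components=1):
    """Minimal linear Slow Feature Analysis sketch (illustrative,
    not the paper's model). x: (T, D) signal; returns the
    n_components output signals that vary most slowly in time."""
    # Center and whiten so all directions have unit variance.
    x = x - x.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
    keep = evals > 1e-10                      # drop degenerate directions
    z = x @ (evecs[:, keep] / np.sqrt(evals[keep]))
    # Slow features minimize the variance of the temporal derivative.
    dcov = np.cov(np.diff(z, axis=0), rowvar=False)
    devals, devecs = np.linalg.eigh(dcov)     # ascending eigenvalues
    # Smallest derivative-variance directions are the slowest features.
    return z @ devecs[:, :n_components]

# Hypothetical example: a slow sine hidden in two fast linear mixtures.
t = np.linspace(0, 2 * np.pi, 500)
slow, fast = np.sin(t), np.sin(11 * t)
x = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])
y = slow_features(x, 1)
r = abs(np.corrcoef(y[:, 0], slow)[0, 1])     # recovered vs. true slow signal
```

The recovered feature is a one-dimensional signal strongly correlated with the slow source, mirroring the abstract's claim that multimodal input is compressed into slowly changing, lower-dimensional data.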