Host: The Japan Society of Mechanical Engineers (JSME)
Conference: Robotics and Mechatronics Conference 2018 (ROBOMECH 2018)
Dates: 2018/06/02 - 2018/06/05
In recent years, neural networks (NNs) have achieved excellent results on problems that are too complex to solve analytically. However, NNs suffer from a critical problem called catastrophic forgetting: memories of previously learned tasks are destroyed when a new task is learned. This problem interferes with the continual learning required for autonomous robots. Catastrophic forgetting is thought to be caused by backpropagated learning signals rewriting the entire network. This study therefore proposes a method to mitigate catastrophic forgetting using reservoir computing, a model that trains only the output layer of a recurrent NN. Instead of generating the network randomly as in conventional reservoir computing, the network structure and its weights are designed based on a fractal complex network. This network modularized the memories for multiple tasks and mitigated catastrophic forgetting.
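To illustrate the core idea that reservoir computing trains only the output layer while the recurrent weights stay fixed, the following is a minimal sketch of a conventional echo state network on a toy one-step-ahead prediction task. The paper's contribution is to replace the random reservoir with a fractal complex network; the random reservoir, all parameter values, and the toy task below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal echo state network (conventional reservoir computing) sketch.
# NOTE: the random reservoir here is the baseline the paper improves on;
# the proposed method would instead design W from a fractal complex network.

rng = np.random.default_rng(0)

n_in, n_res = 1, 100
spectral_radius = 0.9  # illustrative value

# Fixed input and recurrent weights -- these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
t = np.arange(T)
u = np.sin(0.1 * t)[:, None]
y = np.sin(0.1 * (t + 1))[:, None]

X = run_reservoir(u)
washout = 50  # discard initial transient states

# Train only the linear readout, here by ridge regression.
reg = 1e-6
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + reg * np.eye(n_res), A.T @ y[washout:])

pred = X @ W_out
mse = np.mean((pred[washout:] - y[washout:]) ** 2)
print(f"readout MSE: {mse:.2e}")
```

Because learning touches only `W_out`, the recurrent dynamics are never overwritten by backpropagation, which is why the reservoir-computing framework is a natural starting point for mitigating catastrophic forgetting.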