Proceedings of the JSME Conference on Robotics and Mechatronics
Online ISSN: 2424-3124
Session ID: 1A1-D13
Conference information

Continual Learning Using Fractal Reservoir Computing
*杉野 峻生, 小林 泰介, 杉本 謙二
Author information
Proceedings / abstracts (free access)

Abstract

In recent years, neural networks (NNs) have achieved excellent results on problems that are difficult to solve analytically due to their complexity. However, NNs suffer from a critical problem called catastrophic forgetting: memories of previously learned tasks are destroyed when a new task is learned additionally. This problem interferes with the continual learning required for autonomous robots. Catastrophic forgetting is believed to be caused by backpropagation of learning signals rewriting the whole network. This study therefore proposes a method to mitigate catastrophic forgetting using reservoir computing, a model that trains only the output layer of a recurrent NN. Instead of generating the network randomly as in conventional reservoir computing, the network structure and its weights are designed based on a fractal complex network. This network modularized the memories for multiple tasks and mitigated catastrophic forgetting.
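The abstract's key premise is that reservoir computing avoids rewriting the recurrent weights: only a linear readout is trained, so learning a new task cannot overwrite the internal network. A minimal sketch of a conventional (randomly generated) reservoir with a ridge-regression readout is shown below; all names, sizes, and the toy sine-prediction task are illustrative assumptions, and the fractal network construction proposed in the paper is not reproduced here since the abstract does not specify it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir (echo state network) dimensions -- illustrative choices only.
N_RES, N_IN = 100, 1

# Fixed input and recurrent weights: these are never trained, which is why
# backpropagation cannot rewrite (and thus forget) the internal dynamics.
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.normal(0.0, 1.0, (N_RES, N_RES))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence; collect the states."""
    x = np.zeros(N_RES)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 100  # discard initial transient states

# Train ONLY the linear readout, by ridge regression.
ridge = 1e-6
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(N_RES), Xw.T @ yw)

pred = X[washout:] @ W_out
mse = np.mean((pred - yw) ** 2)
```

The paper's modification, as described in the abstract, would replace the random `W` above with weights structured by a fractal complex network, so that memories for different tasks occupy separate modules of the reservoir.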

© 2018 The Japan Society of Mechanical Engineers