SICE Annual Conference Program and Abstracts
SICE Annual Conference 2002
Conference information

Superimposing Memory by Dynamic and Spatial Changing Synaptic Weights
Noriyasu Homma, Madan M. Gupta, Kenichi Abe, Hiroshi Takeda
Author information
Conference proceedings / abstracts · Free access

p. 677

Abstract
In this paper, a novel neural network model is presented for incremental learning tasks, in which the network is required to learn new knowledge without forgetting the old. The essential core of the proposed network structure is its dynamic and spatial changing weights (DSCWs). A learning scheme is developed for the formulation of the dynamically changing weights, while structural adaptation is formulated by the spatially changing (growing) synaptic connections. As new synaptic connections are formed, a new network structure is superimposed on the previous one. To avoid disturbing past knowledge when new connections are created, a restoration mechanism based on the DSCWs is introduced into this superimposition. The usefulness of the proposed model is demonstrated on pattern classification and system identification tasks.
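The abstract's two mechanisms, growing new connections for new knowledge and restoring a snapshot of the old weights so past knowledge is not disturbed, can be illustrated with a minimal sketch. This is not the authors' DSCW formulation; the class, the LMS-style update rule, and all parameter choices below are illustrative assumptions.

```python
import numpy as np


class GrowingNet:
    """Toy linear associator illustrating the superimposition idea:
    new output rows are grown for each new task (spatial change),
    adapted online (dynamic change), and a snapshot of the old rows
    is restored after training (restoration mechanism).

    Hypothetical sketch; not the DSCW model from the paper."""

    def __init__(self, n_inputs):
        self.n_inputs = n_inputs
        self.W = np.zeros((0, n_inputs))  # no outputs yet

    def grow(self, n_new_outputs):
        # Spatial change: superimpose freshly created (zero) connections
        # on top of the existing weight structure.
        self.W = np.vstack([self.W, np.zeros((n_new_outputs, self.n_inputs))])
        return self.W.shape[0] - n_new_outputs  # index of first new row

    def learn(self, X, Y, start_row, lr=0.1, epochs=200):
        # Dynamic change: adapt the newly grown rows with an LMS-style
        # update. A snapshot of the earlier rows is taken first and
        # restored afterwards; here the grown rows are disjoint, so the
        # restoration is only a safeguard, but with shared connections
        # (as in DSCW) such a mechanism becomes essential.
        snapshot = self.W[:start_row].copy()
        for _ in range(epochs):
            err = Y - X @ self.W[start_row:].T
            self.W[start_row:] += lr * err.T @ X / len(X)
        self.W[:start_row] = snapshot  # restore past knowledge

    def predict(self, X):
        return X @ self.W.T


if __name__ == "__main__":
    X = np.eye(2)
    net = GrowingNet(2)
    # Task A: learn, then record its outputs.
    a = net.grow(1)
    net.learn(X, np.array([[1.0], [0.0]]), a)
    out_a = net.predict(X).copy()
    # Task B: grow new connections and learn without erasing task A.
    b = net.grow(1)
    net.learn(X, np.array([[0.0], [1.0]]), b)
    print(np.allclose(net.predict(X)[:, :1], out_a))  # task A preserved
```

After learning task B, the first output column still reproduces task A's responses: the new structure was superimposed rather than overwriting the old one.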
© 2002 SICE