Journal of the Physical Society of Japan
Online ISSN : 1347-4073
Print ISSN : 0031-9015
ISSN-L : 0031-9015
On-Line Learning of Two-Layered Neural Network with Randomly Diluted Connections
Katsuki Katayama and Tsuyoshi Horiguchi

2002 Volume 71 Issue 2 Pages 458-465

Abstract
We investigate on-line learning of two-layered feed-forward neural networks with randomly diluted connections using a gradient-descent algorithm. Assuming self-averaging within the framework of statistical physics, we derive coupled first-order differential equations for the order parameters that describe the learning process in the thermodynamic limit. We clarify that the learning time for asymmetric dilution of the connections in a teacher network is shorter than that for symmetric dilution of the connections in the teacher network. However, we find that the learning does not converge when the teacher network is diluted too much. We obtain a phase diagram in the learning-rate versus dilution-rate plane for the diluted teacher network. It turns out that the learning converges only imperfectly when the student network is diluted too much, because the average value of the connections in the student network is then essentially smaller than that of the connections in the teacher network.
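The teacher-student setup summarized in the abstract can be sketched in simulation. The following is a minimal illustrative sketch, not the paper's exact model or parameters: a two-layered soft-committee-machine teacher whose first-layer connections are randomly diluted (each cut independently, i.e. asymmetric dilution), and a student trained by on-line gradient descent, one random example per step. All names, sizes, and rates below are assumptions chosen for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 100, 3        # input dimension, number of hidden units (illustrative)
eta = 1.0            # learning rate (illustrative)
dilution = 0.3       # fraction of teacher connections randomly removed

def g(x):
    # Hidden-unit activation function.
    return np.tanh(x)

def dg(x):
    # Derivative of the activation function.
    return 1.0 - np.tanh(x) ** 2

# Teacher weights with a random dilution mask: each connection is
# cut independently with probability `dilution` (asymmetric dilution).
B = rng.standard_normal((K, N))
B *= (rng.random((K, N)) >= dilution)

# Student starts from small random weights.
J = 0.1 * rng.standard_normal((K, N))

def gen_error(J, B, n_test=2000):
    # Monte-Carlo estimate of the generalization error
    # E = (1/2) <(sigma - tau)^2> over random inputs.
    X = rng.standard_normal((n_test, N))
    sigma = g(X @ J.T / np.sqrt(N)).sum(axis=1)  # student outputs
    tau = g(X @ B.T / np.sqrt(N)).sum(axis=1)    # teacher outputs
    return 0.5 * np.mean((sigma - tau) ** 2)

e0 = gen_error(J, B)

# On-line learning: one fresh random example per gradient-descent step.
for _ in range(20000):
    x = rng.standard_normal(N)
    h = J @ x / np.sqrt(N)                       # student hidden fields
    delta = g(B @ x / np.sqrt(N)).sum() - g(h).sum()
    # Gradient-descent update of the student's first-layer weights.
    J += (eta / N) * delta * np.outer(dg(h), x)

e_final = gen_error(J, B)
print(f"generalization error: {e0:.4f} -> {e_final:.4f}")
```

With the learning rate and dilution chosen here the error decreases; the abstract's point is that this is not guaranteed — for too large a dilution rate (or learning rate) the dynamics fail to converge, which is what the phase diagram in the learning-rate versus dilution-rate plane delineates.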
© The Physical Society of Japan 2002