Proceedings of the Fuzzy System Symposium
34th Fuzzy System Symposium
Session ID : MD2-2

Improvement of Initialization for Neural Network
*Yousuke OKAMOTO

Abstract

In multi-layered neural networks, the back-propagation algorithm provides a means of training the network on supervised learning tasks. The algorithm can be viewed as implementing gradient descent optimization by computing the derivatives of the error function. However, this error function is generally known to have a large number of local minima, so the algorithm does not guarantee convergence to a global minimum. We derive the Hessian matrix of the error function of a 3-layered neural network under restricted conditions and show a sufficient condition for convexity.
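
The role of the Hessian can be illustrated numerically. The sketch below (not the paper's derivation) approximates the Hessian of the sum-of-squares error of a tiny 3-layered network by finite differences and checks whether it is positive semidefinite, the standard sufficient condition for local convexity. The layer sizes, sigmoid hidden activation, and toy data are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))          # 5 toy samples, 2 inputs
t = rng.normal(size=(5, 1))          # toy targets

n_in, n_hid, n_out = 2, 3, 1
n_params = n_in * n_hid + n_hid * n_out

def error(w):
    """Sum-of-squares error E(w) of a 3-layered network (no biases, sigmoid hidden layer)."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, n_out)
    h = 1.0 / (1.0 + np.exp(-X @ W1))    # hidden activations
    y = h @ W2                           # linear output layer
    return 0.5 * np.sum((y - t) ** 2)

def hessian(f, w, eps=1e-5):
    """Central-difference approximation of the Hessian of f at w."""
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            w_pp = w.copy(); w_pp[i] += eps; w_pp[j] += eps
            w_pm = w.copy(); w_pm[i] += eps; w_pm[j] -= eps
            w_mp = w.copy(); w_mp[i] -= eps; w_mp[j] += eps
            w_mm = w.copy(); w_mm[i] -= eps; w_mm[j] -= eps
            H[i, j] = (f(w_pp) - f(w_pm) - f(w_mp) + f(w_mm)) / (4 * eps ** 2)
    return H

w0 = rng.normal(size=n_params)
H = hessian(error, w0)
eigvals = np.linalg.eigvalsh(0.5 * (H + H.T))
print("min eigenvalue:", eigvals.min())   # >= 0 indicates local convexity at w0

A negative minimum eigenvalue at some weight vector shows that the error surface is not convex there, which is why back-propagation can become trapped in local minima unless conditions such as those derived in the paper are satisfied.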

© 2018 Japan Society for Fuzzy Theory and Intelligent Informatics