Host: Japan Society for Fuzzy Theory and Intelligent Informatics (SOFT)
Name : 34th Fuzzy System Symposium
Number : 34
Location : [in Japanese]
Date : September 03, 2018 - September 05, 2018
In multi-layered neural networks, the back-propagation algorithm provides a means of training the network on supervised learning tasks. It can be viewed as implementing gradient descent optimization by computing the derivative of the error function with respect to the weights. However, the error function is generally known to have a large number of local minima, so this algorithm does not guarantee convergence to a global minimum. We derive the Hessian matrix of the error function of a three-layered neural network under restricted conditions and show a sufficient condition for its convexity.
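As a minimal sketch of the gradient computation the abstract refers to (not the authors' derivation), the following hypothetical example builds a tiny three-layered network with one input, two sigmoid hidden units, and one linear output, computes the gradient of the squared-error function by back-propagation, and checks it against a central-difference approximation. All names, sizes, and values are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical 1-2-1 network: w1 maps input to hidden, w2 maps hidden
# to a linear output; biases are omitted to keep the sketch short.
def forward(w1, w2, x):
    h = [sigmoid(w * x) for w in w1]
    y = sum(v * hi for v, hi in zip(w2, h))
    return h, y

def error(w1, w2, x, t):
    # Squared-error function E = (1/2)(y - t)^2 for a single pattern.
    _, y = forward(w1, w2, x)
    return 0.5 * (y - t) ** 2

def backprop_grad(w1, w2, x, t):
    # Back-propagation: apply the chain rule layer by layer.
    h, y = forward(w1, w2, x)
    delta = y - t                            # dE/dy
    grad_w2 = [delta * hi for hi in h]       # dE/dw2_j = delta * h_j
    grad_w1 = [delta * v * hi * (1.0 - hi) * x  # through sigmoid'
               for v, hi in zip(w2, h)]
    return grad_w1, grad_w2

def numeric_grad(w1, w2, x, t, eps=1e-6):
    # Central finite differences, used only to verify back-propagation.
    g1, g2 = [], []
    for i in range(len(w1)):
        wp, wm = list(w1), list(w1)
        wp[i] += eps
        wm[i] -= eps
        g1.append((error(wp, w2, x, t) - error(wm, w2, x, t)) / (2 * eps))
    for i in range(len(w2)):
        wp, wm = list(w2), list(w2)
        wp[i] += eps
        wm[i] -= eps
        g2.append((error(w1, wp, x, t) - error(w1, wm, x, t)) / (2 * eps))
    return g1, g2

w1, w2, x, t = [0.3, -0.5], [0.8, 0.2], 1.5, 1.0
a1, a2 = backprop_grad(w1, w2, x, t)
n1, n2 = numeric_grad(w1, w2, x, t)
```

The agreement of the two gradients illustrates that back-propagation is exactly the derivative calculation behind gradient descent; the non-convexity discussed in the abstract arises because this error function is not convex in `w1` and `w2` jointly.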