SEISAN KENKYU
Online ISSN : 1881-2058
Print ISSN : 0037-105X
ISSN-L : 0037-105X
Research Flash
Modification of Initialization Technique for Multilayer Neural Network
Tomoki NAKAMARU, Kazuyuki AIHARA, Makito OKU

2016 Volume 68 Issue 3 Pages 261-264

Abstract
Deep neural networks and their learning methods (deep learning) have attracted much attention in recent years, with many successful results. Although error backpropagation is generally used for training multilayer neural networks, its performance depends heavily on the initial parameters of the network. In this research, we propose a method of improving Xavier initialization for neural networks with ReLU (Rectified Linear Unit) activations, and another modification of Xavier initialization for neural networks dealing with high-dimensional numbers.
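For context, a minimal NumPy sketch of the kind of correction the abstract alludes to: standard Xavier initialization sets the weight variance assuming an approximately linear activation, while ReLU zeroes out roughly half of its inputs and so halves the signal variance at each layer; doubling the weight variance compensates (the rescaling popularized as He initialization). The function names and the exact scaling here are illustrative assumptions, not necessarily the authors' published method.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier (Glorot) initialization: variance 2 / (fan_in + fan_out),
    derived for activations that are roughly linear around zero (e.g. tanh)."""
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def relu_adapted_init(fan_in, fan_out, rng=None):
    """ReLU passes roughly half of a zero-mean input distribution, halving
    the forward variance per layer; doubling the weight variance restores it.
    This is the He-style rescaling; the paper's exact correction may differ."""
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))
```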
© 2016 Institute of Industrial Science, The University of Tokyo