International Journal of Networking and Computing
Online ISSN : 2185-2847
Print ISSN : 2185-2839
ISSN-L : 2185-2839
Special Issue on the Seventh International Symposium on Computing and Networking
Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks
Kenta Inoue
Open Access

2020, Volume 10, Issue 2, pp. 293-307

Abstract

One of the reasons neural networks are used in machine learning is their high expressive power, that is, their ability to express functions. The expressive power of a neural network depends on its structure and is measured by several indices. In this paper, we focus on one such measure, called the "expressive number", which is based on the number of data points the network can express. Expressive numbers let us judge whether the size of a neural network is suitable for the given training data before training begins. However, existing work on expressive numbers mainly targets single-hidden-layer neural networks, and little is known about networks with two or more hidden layers. In this paper, we give a lower bound on the maximum expressive number of two-hidden-layer neural networks and an upper bound on that of multilayer neural networks with the ReLU activation function. This result shows that the expressive number of a two-hidden-layer neural network is in O(a_1 a_2), where a_1 and a_2 are the numbers of neurons in the first and second hidden layers.
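To make the notion of "expressing n data points" concrete, here is a minimal sketch (not the paper's construction; function names are hypothetical) of the classic piecewise-linear argument for the single-hidden-layer case: a ReLU network of the form f(x) = b + Σ_i w_i·ReLU(x − c_i) with n hidden units can exactly interpolate n + 1 one-dimensional data points, by placing one breakpoint at each data point and setting each weight to the change in slope there.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def fit_one_hidden_layer(xs, ys):
    """Construct (b, w, c) so that f(x) = b + sum_i w_i * relu(x - c_i)
    interpolates the given 1-D points exactly (n+1 points, n hidden units)."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    slopes = np.diff(ys) / np.diff(xs)            # slope of each linear segment
    c = xs[:-1]                                    # one breakpoint per hidden unit
    w = np.diff(np.concatenate([[0.0], slopes]))   # change in slope at each breakpoint
    b = ys[0]                                      # value at the leftmost point
    return b, w, c

def forward(x, b, w, c):
    """Evaluate the single-hidden-layer ReLU network at the points x."""
    return b + relu(np.subtract.outer(x, c)) @ w

# Four data points expressed by a network with three hidden units.
xs = [0.0, 1.0, 2.5, 4.0]
ys = [1.0, -2.0, 0.5, 3.0]
b, w, c = fit_one_hidden_layer(xs, ys)
print(np.allclose(forward(np.array(xs), b, w, c), ys))
```

The two-hidden-layer bounds in the paper go beyond this one-layer picture; the sketch only illustrates what it means for a network of a given size to express a dataset.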

© 2020 International Journal of Networking and Computing