International Journal of Networking and Computing
Online ISSN : 2185-2847
Print ISSN : 2185-2839
ISSN-L : 2185-2839
Special Issue on the Seventh International Symposium on Computing and Networking
Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks
Kenta Inoue

2020 Volume 10 Issue 2 Pages 293-307

Abstract
One of the reasons neural networks are used in machine learning is their high expressive power, that is, their ability to express functions. The expressive power of a neural network depends on its structure and is measured by several indices. In this paper, we focus on one such measure, the "expressive number", which is based on the number of data points a network can express. Expressive numbers let us judge whether the size of a neural network is suitable for the given training data before machine learning is carried out. However, existing work on expressive numbers mainly targets single hidden layer neural networks, and little is known about networks with two or more hidden layers. In this paper, we give a lower bound on the maximum expressive number of two hidden layer neural networks and an upper bound on that of multilayer neural networks with the ReLU activation function. This result shows that the expressive number of two hidden layer neural networks is in O(a_1 a_2), where a_1 and a_2 are the numbers of neurons in the respective hidden layers.
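As a rough illustration of the definition (not part of the paper): a network f is said to express a dataset {(x_i, y_i)} if f(x_i) = y_i for every i, and the expressive number concerns how many such points a given architecture can always fit. Below is a minimal NumPy sketch, assuming scalar input and output; the weights are arbitrary placeholders rather than the paper's construction, and the names two_hidden_layer_net, expresses, a1, and a2 are hypothetical, introduced here only for illustration.

import numpy as np

def relu(z):
    # Elementwise ReLU activation.
    return np.maximum(z, 0.0)

def two_hidden_layer_net(params, x):
    """Evaluate f(x) = W3 . relu(W2 . relu(W1*x + b1) + b2) + b3
    for a scalar input x, with a1 and a2 hidden neurons."""
    W1, b1, W2, b2, W3, b3 = params
    h1 = relu(W1 * x + b1)       # first hidden layer, shape (a1,)
    h2 = relu(W2 @ h1 + b2)      # second hidden layer, shape (a2,)
    return float(W3 @ h2 + b3)   # scalar output

def expresses(params, xs, ys, tol=1e-9):
    """True iff the network maps every x_i exactly to its target y_i,
    i.e. the network 'expresses' the dataset in the abstract's sense."""
    return all(abs(two_hidden_layer_net(params, x) - y) <= tol
               for x, y in zip(xs, ys))

# Example with a1 = a2 = 2 and placeholder random weights.
a1, a2 = 2, 2
rng = np.random.default_rng(0)
params = (rng.standard_normal(a1), rng.standard_normal(a1),
          rng.standard_normal((a2, a1)), rng.standard_normal(a2),
          rng.standard_normal(a2), 0.0)
xs = [0.0, 1.0, 2.0]
ys = [two_hidden_layer_net(params, x) for x in xs]  # targets read off the net itself
print(expresses(params, xs, ys))  # True by construction

The check is trivially satisfied here because the targets are read off the network; the paper's question is the converse: for arbitrary targets, how many points an architecture with a_1 and a_2 hidden neurons is guaranteed to express.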
© 2020 International Journal of Networking and Computing