The deep neural network (DNN) has been at the center of attention in the field of machine learning, and the chaotic neuron and chaotic neural network (ChNN) models have been in the spotlight in computational neuroscience and nonlinear science. However, there have been no studies on deep ChNNs. To fill this gap, we propose a ReLU (rectified linear unit) chaotic neuron model, which is necessary for applying the chaotic neuron model to DNNs with ReLU activation. We also show that even a single ReLU chaotic neuron can generate dynamically changing outputs despite the simplicity of ReLU.
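As a rough illustration of the idea, the following Python sketch iterates a single chaotic-neuron-style unit whose output function is ReLU. The assumed update rule (an internal state with decay factor k, refractory scaling alpha, constant input a, and threshold theta, loosely in the style of Aihara's chaotic neuron model) and all parameter values are illustrative assumptions, not the formulation given in the paper.

def relu(u):
    # Rectified linear unit: returns max(0, u)
    return max(0.0, u)

def simulate_relu_chaotic_neuron(steps=40, k=0.7, alpha=2.0, a=0.5, theta=0.1, x0=0.2):
    # Assumed update for a single unit with a decaying internal state y
    # and ReLU output x (illustrative, not the paper's exact model):
    #     y(t+1) = k * y(t) - alpha * x(t) + a - theta
    #     x(t+1) = relu(y(t+1))
    x, y = x0, 0.0
    outputs = []
    for _ in range(steps):
        y = k * y - alpha * x + a - theta
        x = relu(y)
        outputs.append(x)
    return outputs

if __name__ == "__main__":
    xs = simulate_relu_chaotic_neuron()
    # Print the last few outputs of the trajectory.
    print(" ".join("%.3f" % v for v in xs[-10:]))

For this (assumed) parameter choice the output does not settle to a constant value but keeps changing from step to step, which is the kind of dynamically changing behavior the abstract refers to; whether a given setting is actually chaotic depends on the parameters and is not claimed here.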