2022 Volume 52 Issue 1 Pages 33-51
In this article, we present a minimax error rate analysis of nonparametric regression aimed at elucidating the advantage of deep neural networks over standard methods. In nonparametric regression, many standard methods are known to achieve minimax optimal rates of generalization error for smooth functions, so it is difficult to demonstrate a theoretical advantage for deep neural networks. The work presented in this paper fills this gap by considering estimation for a class of non-smooth functions with singularities on hypersurfaces. The results are as follows: (i) we analyze the generalization error of the function estimator given by deep neural networks and prove that its convergence rate is minimax optimal up to logarithmic factors; (ii) we identify situations in which deep neural networks outperform standard methods such as kernel methods and Gaussian process methods, and construct a phase diagram with respect to the shape parameters governing smoothness and singularity. The superiority of deep neural networks stems from the fact that their multilayer structure can properly capture the shape of the singularities.
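As a minimal sketch of the intuition (not the paper's construction), the following NumPy example uses the standard fact that a difference of two steep ReLU units approximates an indicator function exactly outside a narrow band. Multiplying or adding such indicators to smooth pieces yields a piecewise-smooth function with a jump singularity, which smooth estimators such as kernel methods must smear out. The function names (`relu_step`, `f`) and the jump location `0.3` are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_step(x, eps=1e-3):
    # Two ReLU units form a steep ramp that equals the indicator
    # 1{x >= 0} everywhere except inside a band of width eps.
    return relu(x / eps) - relu(x / eps - 1.0)

def f(x):
    # Illustrative piecewise-smooth target: a smooth part plus a
    # jump of height 2 at x = 0.3 (a one-dimensional stand-in for
    # a singularity on a hypersurface).
    return np.sin(2.0 * x) + 2.0 * relu_step(x - 0.3)

# Away from the jump, the ReLU pair reproduces the indicator exactly.
print(relu_step(np.array([-0.5, 0.5])))
```

The point of the sketch is that a network needs only a couple of ReLU units per singularity to represent the jump sharply, whereas methods built on smooth basis functions pay an approximation-error penalty near the hypersurface.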