Journal of the Japan Statistical Society, Japanese Issue
Online ISSN : 2189-1478
Print ISSN : 0389-5602
ISSN-L : 0389-5602
Special Topic: The JSS Ogawa Prize Lecture
On Estimating Functions with Singularity on Hypersurfaces and Advantages of Deep Neural Networks
Masaaki Imaizumi

2022 Volume 52 Issue 1 Pages 33-51

Abstract

In this article, we introduce a study that presents a minimax analysis of the generalization error in nonparametric regression, aimed at elucidating the superiority of deep neural networks over standard methods. In nonparametric regression, it is well known that many standard methods achieve the minimax optimal rate of generalization error for smooth functions, so it is not easy to reveal a theoretical advantage of deep neural networks. The work presented in this paper fills this theoretical gap by considering estimation for a class of non-smooth functions with singularities on hypersurfaces. The results are as follows: (i) We analyze the generalization error of the function estimator based on deep neural networks and prove that its convergence rate is minimax optimal up to logarithmic factors. (ii) We identify situations in which deep neural networks outperform standard methods such as kernel methods and Gaussian process methods, and we construct a phase diagram with respect to the shape parameters governing smoothness and singularity. The superiority of deep neural networks stems from the fact that their multilayer structure can properly handle the shape of the singularities.
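To make the setting concrete, the following is a minimal sketch in standard nonparametric-regression notation; the specific symbols (f^*, \xi_i, g, h, R) are illustrative assumptions and are not taken from the paper itself. One observes n pairs (X_i, Y_i) and estimates the unknown regression function f^*:

% Nonparametric regression model: data (X_i, Y_i) with noise \xi_i
\[
  Y_i = f^*(X_i) + \xi_i, \qquad i = 1, \dots, n.
\]
% A simple example of a function with a singularity on a hypersurface:
% g and h are smooth (e.g., Hoelder continuous) pieces, and R is a region
% whose boundary \partial R is a smooth hypersurface, so f^* jumps across \partial R.
\[
  f^*(x) = g(x) + h(x)\, \mathbf{1}\{x \in R\}, \qquad x \in [0,1]^d.
\]

Such an f^* is smooth away from \partial R but discontinuous across it, so its estimation difficulty is governed both by the smoothness of the pieces g and h and by the regularity of the hypersurface \partial R; these correspond to the two kinds of shape parameters (smoothness and singularity) along which the phase diagram in (ii) is drawn.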

© 2022 Japan Statistical Society