CMStatistics 2021
Title: Adaptivity of deep learning to intrinsic dimensionality via mixed and anisotropic smoothness for high dimensional input
Authors: Taiji Suzuki - University of Tokyo / RIKEN-AIP (Japan) [presenting]
Atsushi Nitanda - Kyushu Institute of Technology (Japan)
Sho Okumoto - The University of Tokyo (Japan)
Abstract: The estimation error of deep learning for high dimensional, possibly infinite-dimensional, input is discussed. Deep learning has been applied to high dimensional data such as images and voice signals. However, standard nonparametric regression analysis asserts that nonparametric methods, including deep learning, suffer from the curse of dimensionality. We show that deep learning can extract informative features from high dimensional input and avoid the curse of dimensionality when the true function has so-called mixed and anisotropic smoothness. Moreover, we investigate convolutional neural networks and show that the curse of dimensionality can be avoided even when the input is infinite-dimensional. In particular, we show that dilated convolution is advantageous when the smoothness of the target function has a sparse structure.
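For readers unfamiliar with the dilated convolution mentioned in the abstract, the following is a minimal illustrative sketch (not the authors' construction): spacing the kernel taps `dilation` positions apart lets a small kernel connect far-apart input coordinates, so stacked layers can reach sparse, long-range structure with few weights.

```python
def dilated_conv1d(x, kernel, dilation=1):
    """Valid-mode 1D convolution whose kernel taps are spaced `dilation` apart."""
    k = len(kernel)
    span = (k - 1) * dilation + 1  # receptive field of a single layer
    out = []
    for i in range(len(x) - span + 1):
        out.append(sum(kernel[j] * x[i + j * dilation] for j in range(k)))
    return out

x = [1, 2, 3, 4, 5, 6, 7, 8]
print(dilated_conv1d(x, [1, -1], dilation=1))  # differences of adjacent entries
print(dilated_conv1d(x, [1, -1], dilation=3))  # differences of entries 3 apart
```

With dilation 1 the two-tap kernel compares neighbours; with dilation 3 the same kernel compares positions three steps apart, illustrating how dilation widens the receptive field without adding parameters.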