CMStatistics 2018
Title: Efficient kernel-based learning by localization
Authors: Ingo Steinwart - University of Stuttgart (Germany) [presenting]
Abstract: Despite the recent successes of (deep) neural networks, kernel-based learning (KBL) methods remain among the most successful learning methods for unstructured small to medium sized classification and regression problems. When it comes to large-scale applications, however, their computational requirements, which grow super-linearly in the number of training samples, render their application infeasible. To address this issue, several approaches have been proposed in the literature that, e.g., train KBL separately on many small chunks of a given large data set. We consider such a decomposition strategy, called localized KBL, which is based upon a spatial partition of the input space. For this localized KBL, we derive a general oracle inequality describing its learning performance. We then apply this oracle inequality to least squares regression using Gaussian kernels and deduce local learning rates that are essentially minimax optimal under some standard smoothness assumptions on the regression function. We further introduce a data-dependent parameter selection method for our localized KBL approach and show that this method achieves the same learning rates as before. Finally, we present some larger-scale experiments for our localized KBL, showing that it achieves essentially the same test performance as a global KBL for a fraction of the computational requirements.
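The decomposition idea in the abstract can be sketched in a few lines: partition the input space into spatial cells, fit one small kernel model per cell, and answer each query with the model of the cell it falls into. The toy below is an assumption-laden illustration, not the authors' implementation: it uses a crude Voronoi partition from randomly chosen centers and plain Gaussian-kernel ridge regression per cell, and all names (`LocalizedKRR`, `gaussian_kernel`, the hyperparameters) are hypothetical.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class LocalizedKRR:
    """Illustrative localized kernel ridge regression (hypothetical sketch)."""

    def __init__(self, n_cells=4, gamma=1.0, lam=1e-4, seed=0):
        self.n_cells, self.gamma, self.lam = n_cells, gamma, lam
        self.rng = np.random.default_rng(seed)

    def _cells(self, X):
        # Assign each point to the nearest partition center (Voronoi cells).
        return np.argmin(((X[:, None] - self.centers[None]) ** 2).sum(-1), axis=1)

    def fit(self, X, y):
        # Crude spatial partition: centers drawn from the training points, so
        # every cell contains at least one sample. A real implementation would
        # use a principled partition of the input space.
        idx = self.rng.choice(len(X), self.n_cells, replace=False)
        self.centers = X[idx]
        cell = self._cells(X)
        self.models = []
        for j in range(self.n_cells):
            Xj, yj = X[cell == j], y[cell == j]
            # Solve the small per-cell ridge system (K + lam I) alpha = y.
            K = gaussian_kernel(Xj, Xj, self.gamma)
            alpha = np.linalg.solve(K + self.lam * np.eye(len(Xj)), yj)
            self.models.append((Xj, alpha))
        return self

    def predict(self, X):
        cell = self._cells(X)
        out = np.empty(len(X))
        for j, (Xj, alpha) in enumerate(self.models):
            m = cell == j
            if m.any():
                out[m] = gaussian_kernel(X[m], Xj, self.gamma) @ alpha
        return out
```

Each cell's linear system involves only that cell's samples, so the cubic cost of a kernel solve applies per chunk rather than to the full data set, which is the source of the computational savings the abstract reports.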