View Submission - EcoSta2019

A0289
**Title:** Distributed regression learning with coefficient and partial coefficients regularization
**Authors:** Hongwei Sun - University of Jinan (China) **[presenting]**

**Abstract:** Distributed regression learning with a coefficient regularization scheme in a reproducing kernel Hilbert space (RKHS) is studied. The algorithm randomly partitions the sample set $z_i$, $i=1,2,\ldots,N$, into $m$ disjoint subsets of equal size, applies the coefficient regularization scheme to each subset to produce an output function, and averages the individual output functions to obtain the final global estimator. We deduce an error bound in expectation in the $L^2$-metric and prove asymptotic convergence for this distributed coefficient regularization learning. Satisfactory learning rates are derived under a very mild regularity condition on the regression function, revealing an interesting phenomenon: when $m < N^s$ and $s$ is small enough, the distributed algorithm attains the same convergence rate as the algorithm that processes the whole data set on a single machine. To reduce computational complexity, we also study a new distributed scheme that applies partial coefficients regularization to each sample subset to produce an output function, and again averages the individual output functions to obtain the final global estimator. An error bound in the $L^2$-metric is deduced, and asymptotic convergence for this distributed learning with partial coefficients regularization is proved via the integral operator technique. Satisfactory learning rates are then derived under a standard regularity condition on the regression function.
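The partition-solve-average scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact method: the Gaussian kernel, the choice of regularization parameter, and the closed-form local solve $(K^2 + n\lambda I)\,\alpha = Ky$ for $\ell^2$ coefficient regularization are assumptions made for the sake of a runnable example.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix; an illustrative kernel choice.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def local_coefficient_reg(X, y, lam, sigma=1.0):
    # l2 coefficient regularization on one subset:
    # minimize (1/n) * ||K a - y||^2 + lam * ||a||^2,
    # whose first-order condition is (K^2 + n*lam*I) a = K y.
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K @ K + n * lam * np.eye(n), K @ y)
    return X, alpha

def distributed_fit(X, y, m, lam, sigma=1.0, seed=None):
    # Randomly partition the N samples into m disjoint subsets of
    # (nearly) equal size and solve the local problem on each.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    models = [local_coefficient_reg(X[part], y[part], lam, sigma)
              for part in np.array_split(idx, m)]

    def predict(Xt):
        # Average the m individual output functions (global estimator).
        preds = [gaussian_kernel(Xt, Xc, sigma) @ a for Xc, a in models]
        return np.mean(preds, axis=0)

    return predict
```

A small usage example: fit noisy samples of a smooth regression function and evaluate the averaged estimator on the training inputs.

```python
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)
predict = distributed_fit(X, y, m=4, lam=1e-3, seed=0)
y_hat = predict(X)
```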