Title: Robust boosting for regression problems
Authors: Matias Salibian-Barrera - The University of British Columbia (Canada) [presenting]
Xiaomeng Ju - The University of British Columbia (Canada)
Abstract: Gradient boosting algorithms construct a regression predictor using a linear combination of base learners. Boosting also offers a family of non-parametric regression estimators that scale to applications with many explanatory variables. The robust boosting algorithm is based on a two-stage approach, similar to what is done for robust linear regression: it first minimizes a robust residual scale estimator and then improves it by optimizing a bounded loss function. Unlike previous robust boosting proposals, our approach does not require computing an ad-hoc residual scale estimator in each boosting iteration. A robust variable importance measure can also be calculated via a permutation procedure. Simulation studies and several data analyses show that, when no atypical observations are present, the robust boosting approach works as well as standard gradient boosting with a squared loss. Furthermore, when the data contain outliers, the robust boosting estimator outperforms the alternatives in terms of prediction error and variable selection accuracy.
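The second-stage idea described above (boosting a bounded loss on scaled residuals) can be sketched roughly as follows. This is a hypothetical simplification, not the authors' implementation: it fixes a single MAD residual scale up front (rather than minimizing a robust scale estimator in a first boosting stage), uses Tukey's bisquare loss with regression stumps, and the names `robust_boost`, `tukey_neg_grad`, and the tuning constant `c = 4.685` are illustrative choices.

```python
# Minimal sketch (assumed, NOT the authors' implementation) of gradient
# boosting with a bounded Tukey-bisquare loss and regression stumps.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def tukey_neg_grad(u, c=4.685):
    # Negative gradient (psi function) of Tukey's bisquare loss at the
    # standardized residual u: bounded, and exactly zero for |u| > c,
    # so large outliers contribute nothing to the update.
    return np.where(np.abs(u) <= c, u * (1.0 - (u / c) ** 2) ** 2, 0.0)

def robust_boost(X, y, n_iter=300, lr=0.1, c=4.685):
    # Robust initialization: the sample median.
    f = np.full(len(y), np.median(y))
    # Fix one robust residual scale (normalized MAD) up front; this stands
    # in for the paper's first stage and avoids re-estimating an ad-hoc
    # scale at every boosting iteration.
    r0 = y - f
    s = 1.4826 * np.median(np.abs(r0 - np.median(r0)))
    learners = []
    for _ in range(n_iter):
        # Pseudo-residuals: psi of the scaled residuals.
        pseudo = tukey_neg_grad((y - f) / s, c)
        stump = DecisionTreeRegressor(max_depth=1).fit(X, pseudo)
        f += lr * stump.predict(X)
        learners.append(stump)
    return f, learners
```

Because the bisquare psi function redescends to zero, observations with very large scaled residuals are effectively ignored in every update, which is what gives the boosted fit its resistance to outliers.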