CMStatistics 2022
Title: Accelerated componentwise gradient boosting using efficient data representation and momentum-based optimization
Authors: Daniel Schalk - LMU Munich (Germany) [presenting]
Bernd Bischl - LMU Munich (Germany)
David Ruegamer - TU Dortmund (Germany)
Abstract: Model-based or componentwise boosting (CWB) is an interpretable gradient-boosting variant that uses additive models as base learners. Using statistical models as base learners yields an additive model fit, inherent feature selection, applicability in high-dimensional settings, and favorable scaling w.r.t. the number of features. One downside of CWB is its computational cost in terms of memory and runtime. We present two novel approaches that (1) reduce the memory load of CWB by applying a discretization technique to numerical features and (2) incorporate Nesterov momentum to speed up model fitting. Our adaptation of CWB not only drastically reduces memory consumption but also enables specialized matrix operations that further reduce runtime. Our incorporation of Nesterov momentum preserves the well-known advantages of CWB while notably accelerating the algorithm's convergence. We demonstrate the effectiveness of both approaches in simulation studies and a large real-world benchmark experiment.
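The two ideas in the abstract can be illustrated with a toy sketch: componentwise boosting with L2 loss, where each numeric feature is discretized into bins (so a base-learner fit reduces to per-bin residual means on a compact integer representation) and the functional update uses a Nesterov-style look-ahead momentum step. This is a minimal illustration under simplifying assumptions (equal-width binning, L2 loss, momentum applied to the prediction vector); all function names are hypothetical and do not reflect the authors' actual implementation.

```python
import numpy as np

def bin_feature(x, n_bins=20):
    """Discretize a numeric feature into equal-width bins.
    Hypothetical stand-in for the paper's discretization technique:
    the raw column is replaced by a small integer index vector."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    return idx.astype(np.int32)

def fit_binned_base_learner(idx, n_bins, r):
    """Least-squares fit of a step function on the bins: the optimal
    per-bin value is just the mean residual in each bin, computable
    with cheap bincount operations on the compressed representation."""
    counts = np.bincount(idx, minlength=n_bins)
    sums = np.bincount(idx, weights=r, minlength=n_bins)
    return np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)

def cwb_nesterov(X, y, n_iter=50, lr=0.1, momentum=0.9, n_bins=20):
    """Toy componentwise boosting with binned base learners and a
    Nesterov-style momentum update on the prediction vector (L2 loss)."""
    n, p = X.shape
    binned = [bin_feature(X[:, j], n_bins) for j in range(p)]
    f = np.zeros(n)   # current ensemble prediction
    v = np.zeros(n)   # momentum buffer
    selected = []     # indices of chosen features -> inherent feature selection
    for _ in range(n_iter):
        lookahead = f + momentum * v     # evaluate gradient at look-ahead point
        r = y - lookahead                # negative gradient of L2 loss
        best_j, best_pred, best_sse = None, None, np.inf
        for j in range(p):               # componentwise: try each base learner
            means = fit_binned_base_learner(binned[j], n_bins, r)
            pred = means[binned[j]]
            sse = np.sum((r - pred) ** 2)
            if sse < best_sse:
                best_j, best_pred, best_sse = j, pred, sse
        v = momentum * v + lr * best_pred   # momentum accumulation
        f = f + v                           # update ensemble prediction
        selected.append(best_j)
    return f, selected
```

On data where only one feature carries signal, the selection trace concentrates on that feature, which is the interpretability property the abstract refers to; the binning keeps each base-learner fit to two `bincount` passes instead of a dense least-squares solve.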