CFE 2020 (CMStatistics)
Title: Efficient hyperparameter selection for penalised regression using communicating gradient descent algorithm
Authors: Stephane Chretien - NPL (United Kingdom)
Alex Gibberd - Lancaster University (United Kingdom)
Sandipan Roy - University of Bath (United Kingdom) [presenting]
Abstract: In high-dimensional regression problems, statistical learning often relies on sparsity assumptions on the regression vector, which are typically enforced using L1-type penalties in the estimation stage. One of the main difficulties in solving such penalised problems is to calibrate the so-called relaxation parameter, or hyperparameter, accurately. Many different methods are available for the hyperparameter selection problem: Bayesian Optimisation, Cross-Validation, Multi-Armed Bandit algorithms, etc. We propose a new methodology based on running communicating pools of incremental gradient algorithms in parallel, each of them corresponding to a specific value of the hyperparameter on a grid. Each time a new data point is observed, the prediction performance of the different values of the hyperparameter can be compared using a Follow the Leader scheme. Theoretical guarantees are provided for our method, showing the power of communication for this problem. The results are illustrated with numerical experiments showing that our simple selection rule can achieve prediction results comparable to the state of the art, at a lower computational cost.
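The pooled scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each value of the hyperparameter on a grid runs its own incremental proximal-gradient (stochastic ISTA) iterate for the lasso, each new data point is scored prequentially before the update, and the leader is the grid value with the smallest cumulative prediction loss. The specific communication rule used here (periodically restarting the worst candidate from the current leader's iterate), the grid, step size, and synthetic data are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse regression problem (illustrative only).
n, p, s = 500, 50, 5
beta_true = np.zeros(p)
beta_true[:s] = rng.normal(size=s)
X = rng.normal(size=(n, p))
y = X @ beta_true + 0.1 * rng.normal(size=n)

def soft_threshold(v, t):
    """Proximal operator of the L1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lambdas = np.logspace(-3, 0, 10)      # hyperparameter grid (assumed)
betas = np.zeros((len(lambdas), p))   # one iterate per grid value
cum_loss = np.zeros(len(lambdas))     # prequential squared errors
step = 0.01                           # step size (assumed)

for i in range(n):
    x_i, y_i = X[i], y[i]
    # Score each candidate on the new point BEFORE updating,
    # so losses are honest out-of-sample errors (Follow the Leader).
    preds = betas @ x_i
    cum_loss += (preds - y_i) ** 2
    # One incremental proximal-gradient step per candidate, in parallel.
    grads = np.outer(preds - y_i, x_i)
    betas = soft_threshold(betas - step * grads, step * lambdas[:, None])
    # Hypothetical communication step: every 50 observations, restart the
    # worst-performing candidate from the current leader's iterate.
    if (i + 1) % 50 == 0:
        leader, worst = np.argmin(cum_loss), np.argmax(cum_loss)
        betas[worst] = betas[leader].copy()

leader = int(np.argmin(cum_loss))
print("selected lambda:", lambdas[leader])
```

The key point the sketch conveys is that hyperparameter selection costs only one extra loss evaluation per candidate per data point, rather than refitting the model from scratch for each grid value as in cross-validation.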