CMStatistics 2022
B1079
Title: Monotonicity and double descent in uncertainty estimation with Gaussian processes
Authors: Liam Hodgkinson - University of Melbourne (Australia) [presenting]
Chris van der Heide - University of Queensland (Australia)
Fred Roosta - University of Queensland (Australia)
Michael Mahoney - UC Berkeley (United States)
Abstract: Contrary to what classical learning theory suggests, the quality of many modern machine learning models is known to improve as the number of parameters increases. For predictive performance, this effect has recently been quantified by the double descent learning curve, which shows that larger models exhibit smaller test errors under appropriate regularization. We will present an analogous theory for models that estimate uncertainty, namely Gaussian processes (GPs). In particular, contrary to popular belief, we will prove under a few assumptions that the model quality of GPs, as measured by the marginal likelihood, improves monotonically in the number of covariates (even synthetic ones), provided an appropriate degree of regularization is imposed. To support the theory, we will present a variety of experiments in which this phenomenon holds beyond the considered assumptions and depends on several key factors, including kernel regularity and data conditioning.
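As a rough illustration of the kind of experiment the abstract describes, the following Python sketch tracks the per-datum log marginal likelihood of a zero-mean GP regressor as purely synthetic covariates are appended to the inputs. All specifics here (the RBF kernel, its lengthscale, the noise-variance regularization, and the 1/sqrt(d) input rescaling) are assumptions chosen for illustration, not the paper's setup; per the abstract, whether the improvement is monotone depends on the kernel regularity, the degree of regularization, and the conditioning of the data.

    import numpy as np

    def rbf_kernel(X, lengthscale=1.0):
        # Squared-exponential kernel: k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2)).
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-np.maximum(d2, 0.0) / (2.0 * lengthscale**2))

    def log_marginal_likelihood(X, y, noise_var=0.1, lengthscale=1.0):
        # log p(y | X) for a zero-mean GP with Gaussian observation noise:
        #   -0.5 * y^T (K + s^2 I)^{-1} y - 0.5 * log|K + s^2 I| - (n/2) * log(2*pi).
        # The noise variance s^2 plays the role of the regularization parameter.
        n = len(y)
        K = rbf_kernel(X, lengthscale) + noise_var * np.eye(n)
        L = np.linalg.cholesky(K)                              # K = L L^T
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # alpha = K^{-1} y
        return (-0.5 * y @ alpha
                - np.sum(np.log(np.diag(L)))                   # = -0.5 * log|K|
                - 0.5 * n * np.log(2.0 * np.pi))

    rng = np.random.default_rng(0)
    n, d_real = 100, 3
    X_real = rng.standard_normal((n, d_real))
    y = np.sin(X_real @ rng.standard_normal(d_real)) + 0.1 * rng.standard_normal(n)

    for d_extra in [0, 5, 20, 100]:
        # Append purely synthetic (uninformative) covariates, then rescale by
        # 1/sqrt(d) so typical squared distances stay O(1) as the dimension grows
        # (an illustrative choice standing in for lengthscale renormalization).
        X = np.hstack([X_real, rng.standard_normal((n, d_extra))])
        X = X / np.sqrt(X.shape[1])
        lml = log_marginal_likelihood(X, y, noise_var=0.1)
        print(f"d = {d_real + d_extra:4d}: per-datum LML = {lml / n:+.4f}")

Varying noise_var in the loop shows the role of regularization: the trend of the per-datum marginal likelihood across d changes with the amount of noise-variance regularization imposed, which is the dependence the abstract highlights.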