Title: Scalable distributional learning
Authors: Nikolaus Umlauf - University of Innsbruck (Austria) [presenting]
Abstract: Estimating distributional regression models with very large data sets is a difficult task. In particular, the use of non-standard distributions can easily lead to memory and efficiency problems, often making the models impossible to estimate at all, even on high-performance computers. We therefore propose a novel backfitting algorithm based on the ideas of stochastic gradient descent that can deal with virtually any amount of data on a conventional laptop. Moreover, the algorithm performs automatic variable and smoothing parameter selection, and its performance is in most cases superior, or at least equal, to other implementations for distributional regression. With this new algorithm, we demonstrate the estimation of complex distributional regression models in a challenging example using a new, very large dataset on child undernutrition in low- and middle-income countries.
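The core idea of combining distributional regression with stochastic gradient descent can be illustrated with a minimal sketch. The example below is not the authors' algorithm (which additionally handles backfitting of smooth terms, variable selection, and smoothing parameter selection); it only shows the memory argument: mini-batch SGD on the negative log-likelihood of a Gaussian location-scale model, where both the mean and the (log) standard deviation depend on covariates, so only one small batch of data is touched per update. The simulated data and all parameter names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data for a Gaussian location-scale model:
# mu_i = x_i' beta, log(sigma_i) = x_i' gamma.
n = 20000
x = rng.uniform(-1, 1, size=(n, 2))
beta_true = np.array([1.0, -0.5])   # location coefficients
gamma_true = np.array([0.3, 0.2])   # log-scale coefficients
y = rng.normal(x @ beta_true, np.exp(x @ gamma_true))

# Mini-batch SGD on the negative log-likelihood. Each update only reads
# one batch, so the full model matrix never has to be held in memory at
# once -- the point made in the abstract about very large data sets.
beta = np.zeros(2)
gamma = np.zeros(2)
lr, batch = 0.05, 256
for epoch in range(20):
    perm = rng.permutation(n)
    for start in range(0, n, batch):
        idx = perm[start:start + batch]
        xb, yb = x[idx], y[idx]
        s = np.exp(xb @ gamma)          # sigma per observation
        z = (yb - xb @ beta) / s        # standardized residuals
        # Gradients of the mean Gaussian NLL w.r.t. both predictors
        g_beta = -(xb * (z / s)[:, None]).mean(axis=0)
        g_gamma = (xb * (1.0 - z**2)[:, None]).mean(axis=0)
        beta -= lr * g_beta
        gamma -= lr * g_gamma

print("beta: ", np.round(beta, 2))
print("gamma:", np.round(gamma, 2))
```

Because each gradient step depends only on a single batch, the same loop would work if the batches were streamed from disk, which is what makes an SGD-based scheme attractive for data that does not fit in memory.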