B0694
Title: Ensembling deep transformation models
Authors: Lucas Kook - University of Copenhagen (Denmark) [presenting]
Andrea Goetschi - University of Zurich (Switzerland)
Philipp FM Baumann - ETH Zurich (Switzerland)
Torsten Hothorn - University of Zurich (Switzerland)
Beate Sick - University of Zurich and Zurich University of Applied Sciences (Switzerland)
Abstract: Aggregating predictions from several models is a well-known and popular approach to forecasting in many scientific domains, such as machine learning and meteorology. The models may be as simple as decision trees or as complex as deep neural networks. Combining probabilistic predictions from deep neural networks is referred to as deep ensembling and has been shown to yield better and more robust predictions and uncertainty quantification than the individual members. However, even if the individual ensemble members are partially interpretable, the ensemble itself is, in general, no longer interpretable. We present transformation ensembles, which guarantee improved prediction performance while preserving their members' interpretable model structure. The key idea of transformation ensembles is to specify a latent random variable with a simple distribution, and to estimate the model and aggregate its predictions on this latent scale. We demonstrate how to build and fit deep and partially interpretable transformation ensembles and use them to quantify both aleatoric and epistemic uncertainty.
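To make the latent-scale aggregation concrete, here is a minimal numerical sketch (not part of the submission; the member values, the ensemble size M = 3, and the choice of a standard-logistic latent distribution are illustrative assumptions). A conditional transformation model predicts the response CDF as F_Z(h(y | x)) for a simple latent CDF F_Z; a transformation ensemble averages the members' transformation functions h_m on the latent scale before applying F_Z, whereas a classical deep ensemble averages the members' predicted CDFs.

```python
import numpy as np
from scipy.special import expit  # CDF F_Z of the standard logistic distribution

# Hypothetical transformation values h_m(y | x) of M = 3 fitted ensemble
# members, evaluated at a single (y, x) pair; in practice these would come
# from fitted (deep) transformation models.
h = np.array([0.4, -0.1, 0.7])

# Classical deep ensemble: average the members' predicted CDFs (a mixture).
cdf_mixture = expit(h).mean()

# Transformation ensemble: average on the latent scale first, then map the
# averaged transformation through the simple latent distribution F_Z.
cdf_transformation = expit(h.mean())

print(f"mixture ensemble CDF:        {cdf_mixture:.3f}")
print(f"transformation ensemble CDF: {cdf_transformation:.3f}")
```

Because the latent-scale average of transformation functions is itself a transformation function, the transformation ensemble stays within the transformation-model class, which is what preserves the members' interpretable structure.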