CMStatistics 2018
B0397
Title: Posterior model selection consistency: Beyond asymptotic optimality
Authors: David Rossell - Universitat Pompeu Fabra (Spain) [presenting]
Abstract: An important property for Bayesian model selection is that the posterior probability of the data-generating model (or the model Kullback-Leibler closest to it) converges to 1, and that the corresponding convergence rate is fast. This guarantees frequentist model selection consistency and asymptotically valid uncertainty quantification. The aim is two-fold. First, we provide a general framework to study consistency for any given model, prior, sample size $n$ and dimension $p$, potentially under model misspecification. Second, we deploy it to canonical variable selection and show that there can be a large gap between lessons learnt from asymptotically optimal rates (the large $n$, even large $p$ paradigm) and practical situations with finite sample size (the small $n$, large $p$ paradigm). Specifically, the framework yields interesting insights into sparsity/sensitivity trade-offs, e.g. one may forsake asymptotic optimality to obtain significant gains in power. These gains are noticeable even in simple sparse scenarios, and become more relevant in truly non-sparse settings.
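For reference, the notion of posterior model selection consistency discussed above is the standard one; the notation below is illustrative and not taken from the submission. Writing $\gamma$ for a model, $y$ for the data and $\theta_\gamma$ for the parameters of model $\gamma$, the posterior model probability is
\[
p(\gamma \mid y) \;=\; \frac{p(y \mid \gamma)\, p(\gamma)}{\sum_{\gamma'} p(y \mid \gamma')\, p(\gamma')},
\qquad
p(y \mid \gamma) \;=\; \int p(y \mid \theta_\gamma, \gamma)\, p(\theta_\gamma \mid \gamma)\, d\theta_\gamma ,
\]
and consistency means that $p(\gamma^* \mid y) \to 1$ in probability as $n \to \infty$, where $\gamma^*$ denotes the data-generating model or, under misspecification, the model closest to it in Kullback-Leibler divergence.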