Title: Deviance information criterion for model selection: Justification and variation
Authors: Yong Li - Renmin University of China (China)
Jun Yu - Singapore Management University (Singapore)
Tao Zeng - Zhejiang University (China) [presenting]
Abstract: The deviance information criterion (DIC) has been extensively used for model selection based on MCMC output. Although it is commonly understood as a Bayesian version of AIC, a rigorous justification has not been provided in the literature. We show that when the plug-in predictive distribution is used, DIC has a rigorous decision-theoretic justification in a frequentist setup. Under a set of regularity conditions, we show that DIC asymptotically selects the model with the smallest expected Kullback-Leibler divergence between the data generating process (DGP) and the plug-in predictive distribution. An alternative expression for DIC, based on the Bayesian predictive distribution, is proposed. The new DIC has a smaller penalty term than the original DIC and is easy to compute from MCMC output. It is invariant to reparameterization and asymptotically yields a smaller expected loss than the original DIC.
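As background for the abstract, a minimal sketch of how the original (Spiegelhalter et al.) DIC is computed from MCMC output, using the standard formula DIC = D(theta_bar) + 2*p_D with p_D = D_bar - D(theta_bar). The normal-mean model and all variable names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative setup: normal data with known sigma = 1, flat prior on the mean,
# so the posterior of the mean is N(ybar, sigma^2/n) and we can draw from it
# directly instead of running a sampler.
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)                       # observed data
n = len(y)
draws = rng.normal(y.mean(), 1.0 / np.sqrt(n), size=5000)  # "MCMC" posterior draws

def deviance(theta, y, sigma=1.0):
    """D(theta) = -2 log p(y | theta) for the normal model."""
    loglik = (-0.5 * len(y) * np.log(2 * np.pi * sigma**2)
              - 0.5 * np.sum((y - theta) ** 2) / sigma**2)
    return -2.0 * loglik

D_bar = np.mean([deviance(t, y) for t in draws])  # posterior mean deviance
D_hat = deviance(np.mean(draws), y)               # deviance at the posterior mean
p_D = D_bar - D_hat                               # effective number of parameters
DIC = D_hat + 2 * p_D                             # equivalently 2*D_bar - D_hat
```

With one free parameter, p_D should come out close to 1; the model with the smallest DIC would be preferred. The paper's alternative DIC replaces the plug-in predictive distribution with the Bayesian predictive distribution, giving a smaller penalty term.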