Title: Bayesian sharp minimaxity via FDR penalization
Authors: Qifan Song - Purdue University (United States) [presenting]
Abstract: Bayesian inference is considered for high-dimensional regression problems with an unknown sparse coefficient vector. In the literature, various Bayesian approaches have been proposed and shown to be consistent for model selection. We first study the relationship between rate minimaxity of estimation and model selection consistency, and conjecture that selection-consistent estimators are suboptimal in terms of $L_2$ error, especially when the true coefficient vector is relatively dense and contains many weak signals. Inspired by the Benjamini-Hochberg (B-H) FDR procedure and its minimaxity under the normal means model, we propose a Bayesian model that corresponds to FDR penalization and show that the corresponding posterior contraction rate is rate-minimax and that the number of false discoveries selected by the posterior is bounded. More importantly, we find that under certain near-orthogonal designs, the posterior is asymptotically sharply minimax in terms of the multiplicative constant, and the ratio of the number of false discoveries to the true sparsity can be made arbitrarily small.
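For readers unfamiliar with the procedure referenced above, the following is a minimal sketch of the classical B-H step-up rule applied to two-sided p-values from the normal means model $y_i = \theta_i + \epsilon_i$, $\epsilon_i \sim N(0,1)$. This illustrates only the frequentist B-H procedure that motivates the work, not the authors' Bayesian FDR-penalized posterior; the function name and FDR level are illustrative choices.

```python
import math
import numpy as np

def benjamini_hochberg(pvals, q=0.1):
    """B-H step-up procedure at FDR level q.

    Sort the p-values, find the largest k with p_(k) <= k*q/m,
    and reject the hypotheses with the k smallest p-values.
    Returns a boolean mask of rejections in the original order.
    """
    pvals = np.asarray(pvals, dtype=float)
    m = len(pvals)
    order = np.argsort(pvals)
    sorted_p = pvals[order]
    thresholds = q * np.arange(1, m + 1) / m  # k*q/m for k = 1..m
    below = sorted_p <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])  # largest index meeting the criterion
        reject[order[: k + 1]] = True
    return reject

def normal_means_pvalues(y):
    """Two-sided p-values for H0: theta_i = 0 under y_i ~ N(theta_i, 1)."""
    return np.array([math.erfc(abs(v) / math.sqrt(2.0)) for v in y])
```

Thresholding the observations at the B-H cutoff (rather than a fixed Bonferroni-style cutoff) is what adapts to the unknown sparsity level and underlies the minimaxity result for the normal means model that the abstract invokes.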