Title: A scalable frequentist model averaging method
Authors: HaiYing Wang - University of Connecticut (United States) [presenting]
Abstract: Frequentist model averaging is an effective technique for handling model uncertainty. However, calculating the averaging weights is extremely difficult, if not impossible, even when the dimension of the predictor vector, $p$, is moderate, because there may be $2^p$ candidate models. The exponential size of the candidate model set may also introduce additional numerical error in calculating the weights. To overcome this problem, a scalable frequentist model averaging method is proposed that is statistically and computationally efficient; it uses the singular value decomposition to reduce the search so that the optimal weights are found by considering at most $p$ candidate models. We prove that the minimum loss of the scalable model averaging estimator is asymptotically equal to that of the traditional model averaging estimator, and that the scalable Mallows/jackknife model averaging estimators are asymptotically optimal. We further extend the method to the high-dimensional case (i.e., $p \gg n$). Numerical studies illustrate the superiority of the proposed method in terms of both statistical efficiency and computational cost.
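A minimal sketch of the idea described above, under my own assumptions (the abstract gives no implementation details): take the SVD of the design matrix, form $p$ nested candidate models from the leading left singular vectors, and choose simplex-constrained weights by minimizing a Mallows-type criterion. The simulated data, the plug-in variance estimate, and the use of `scipy.optimize.minimize` are illustrative choices, not the authors' method.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data (illustrative only)
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = 1.0 / np.arange(1, p + 1)
y = X @ beta + rng.normal(size=n)

# SVD of the design matrix: X = U S V^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Candidate model k projects y onto the first k left singular vectors,
# giving only p candidates instead of 2^p predictor subsets.
fits = np.empty((p, n))
for k in range(1, p + 1):
    Uk = U[:, :k]
    fits[k - 1] = Uk @ (Uk.T @ y)

# Plug-in error variance from the largest candidate model
sigma2 = np.sum((y - fits[-1]) ** 2) / (n - p)
ks = np.arange(1, p + 1)  # trace of each projection = model dimension

def mallows(w):
    # Mallows criterion: residual SS of the averaged fit
    # plus a penalty 2*sigma^2*(weighted effective dimension)
    mu = w @ fits
    return np.sum((y - mu) ** 2) + 2.0 * sigma2 * (w @ ks)

# Minimize over the weight simplex {w : w >= 0, sum(w) = 1}
cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * p
w0 = np.full(p, 1.0 / p)
res = minimize(mallows, w0, bounds=bounds, constraints=cons, method="SLSQP")
w = res.x
mu_hat = w @ fits  # model-averaged fitted values
```

Because the criterion is a quadratic function of $w$ over the simplex, the problem stays a small convex program of size $p$ regardless of how many predictor subsets exist; this is the computational gain the abstract points to.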