Title: A scalable frequentist model averaging method
Authors: HaiYing Wang - University of Connecticut (United States) [presenting]
Abstract: Frequentist model averaging is an effective technique for handling model uncertainty. However, computing the averaging weights is extremely difficult, if not impossible, even when the dimension of the predictor vector, $p$, is moderate, because there may be as many as $2^p$ candidate models. We propose a scalable frequentist model averaging method that overcomes this difficulty by using the singular value decomposition. The method requires finding the optimal weights for at most $p$ candidate models. We prove that the minimum loss of the scalable model averaging estimator is asymptotically equal to that of the traditional model averaging estimator, and that the scalable Mallows/Jackknife model averaging estimators are asymptotically optimal. We further extend the method to the high-dimensional case (i.e., $p\ge n$). Numerical studies illustrate the superiority of the proposed method in terms of both statistical efficiency and computational cost.
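The core idea can be illustrated with a minimal sketch (not the authors' code; all names and the simulated data are hypothetical): take the SVD of the design matrix, form at most $p$ nested candidate fits from the leading left singular vectors instead of the $2^p$ variable subsets, and choose averaging weights by minimizing a Mallows-type criterion over the weight simplex.

```python
import numpy as np

# Hypothetical illustration of SVD-based scalable model averaging.
# Simulated linear-regression data (assumed, not from the paper).
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.5, 1.0, 0.5, 0.0, 0.0, 0.0, 0.0])
sigma = 1.0
y = X @ beta + sigma * rng.standard_normal(n)

# SVD of the design matrix: X = U diag(s) V^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Candidate model k projects y onto the first k left singular vectors,
# giving only p candidates instead of 2^p variable subsets.
fits = np.column_stack([
    U[:, :k] @ (U[:, :k].T @ y) for k in range(1, p + 1)
])

# Mallows-type criterion: ||y - F w||^2 + 2 sigma^2 * sum_k w_k * k,
# minimized over the weight simplex by a crude projected-gradient loop
# (a stand-in for the quadratic program one would solve in practice).
w = np.full(p, 1.0 / p)
ranks = np.arange(1, p + 1)
for _ in range(5000):
    grad = -2 * fits.T @ (y - fits @ w) + 2 * sigma**2 * ranks
    w = np.clip(w - 1e-4 * grad / n, 0, None)
    w = w / w.sum()  # renormalize back onto the simplex

y_hat = fits @ w  # model-averaged fitted values
print(np.round(w, 3))
```

In practice the Mallows weights are obtained by constrained quadratic programming rather than this toy gradient loop, and the Jackknife variant replaces the fitted values with leave-one-out predictions; the sketch only shows why the candidate set collapses from $2^p$ to $p$.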