A0514
Title: Predictive ability tests with possibly overlapping models
Authors: Jack Fosten - King's College London (United Kingdom) [presenting]
Valentina Corradi - University of Surrey (United Kingdom)
Daniel Gutknecht - University of Mannheim (Germany)
Abstract: Novel tests are provided for comparing the out-of-sample predictive ability of two or more competing models that are possibly overlapping. The tests do not require pre-testing; they allow for dynamic misspecification and are valid under different estimation schemes and loss functions. In pairwise model comparisons, the test is constructed by adding a random perturbation to both the numerator and denominator of a standard Diebold-Mariano test statistic. This prevents degeneracy in the presence of overlapping models but becomes asymptotically negligible otherwise. The test has the correct size uniformly over all null data-generating processes. A similar idea is used to develop a superior predictive ability test for the comparison of multiple models against a benchmark. Monte Carlo simulations demonstrate that the tests exhibit very good size control in finite samples, reducing the incidence of both under- and oversizing relative to competing tests. Finally, an application to forecasting U.S. excess bond returns provides evidence in favour of models using macroeconomic factors.
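To illustrate the perturbation idea described in the abstract, the sketch below computes a standard Diebold-Mariano statistic and a perturbed variant in which a single random draw, scaled to vanish with the sample size, is added to both the numerator and the denominator. This keeps the ratio well defined when the models overlap (i.e. the loss differential is identically zero). The `c / n**0.25` scaling and the exact form of the perturbation are illustrative assumptions, not the construction from the paper.

```python
import numpy as np

def dm_stat(d):
    """Standard Diebold-Mariano statistic from a loss-differential series d.

    Degenerates (0/0) when the two models overlap, since d is then
    identically zero.
    """
    n = len(d)
    return np.sqrt(n) * d.mean() / d.std(ddof=1)

def perturbed_dm_stat(d, c=0.5, seed=0):
    """Perturbed DM statistic (illustrative sketch only).

    A random term, scaled to shrink as n grows, is added to both the
    numerator and the denominator, so the statistic remains well defined
    under overlap but the perturbation is asymptotically negligible
    otherwise. The c / n**0.25 rate is an assumed choice for illustration.
    """
    rng = np.random.default_rng(seed)
    n = len(d)
    pert = c * rng.standard_normal() / n**0.25
    num = np.sqrt(n) * d.mean() + pert
    den = np.sqrt(d.var(ddof=1) + pert**2)
    return num / den

# Usage: loss differential from squared-error losses of two forecasts.
rng = np.random.default_rng(42)
e1, e2 = rng.standard_normal(200), rng.standard_normal(200)
d = e1**2 - e2**2
print(perturbed_dm_stat(d))          # finite, close to dm_stat(d)
print(perturbed_dm_stat(np.zeros(200)))  # finite even under exact overlap
```

Note that the standard statistic would be undefined in the overlap case, while the perturbed version returns a finite value; the paper's actual construction is designed so that the resulting test is correctly sized uniformly over the null.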