Title: Testing nowcast monotonicity
Authors: Daniel Gutknecht - University of Mannheim (Germany) [presenting]
Jack Fosten - University of East Anglia (United Kingdom)
Abstract: Nowcasting has become an important tool for many public and private institutions seeking timely predictions of low-frequency variables such as Gross Domestic Product (GDP) using current information. Practitioners often judge a nowcasting method successful if its predictions improve monotonically as the publication date of the target variable approaches. We develop a novel testing approach to formally evaluate the monotonicity of a nowcasting method. To highlight the usefulness of this new test, we provide several analytical examples in which nowcast monotonicity fails or in which the test helps to assess the trade-off between the timeliness and accuracy of the predictor variables. Formally, we extend a methodology for testing many moment inequalities, which allows the number of inequalities to be large relative to the sample size, to the case of nowcast monotonicity testing. We show that rolling parameter estimation in the pseudo out-of-sample approach can be accommodated in this setting, and we illustrate the performance of our test with a detailed set of Monte Carlo simulations. We conclude with an empirical application nowcasting U.S. real GDP growth and five GDP sub-components.
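To make the idea concrete, the following is a minimal sketch (not the authors' procedure) of the setting the abstract describes: a rolling-window, pseudo out-of-sample exercise in which later nowcast dates observe the predictor with less noise, so squared losses should improve monotonically across horizons. The data-generating process, the noise scales, the window length, and the simple max-t statistic on the loss differentials are all illustrative assumptions; the paper's actual test builds on a many-moment-inequalities framework rather than this plain t-statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DGP (illustrative assumption): target y_t driven by a
# signal x_t that is observed with measurement noise which shrinks as
# the nowcast date approaches the publication date of y_t.
T = 400
x = rng.normal(size=T)
y = 0.8 * x + rng.normal(scale=0.5, size=T)

# Nowcast "horizons": earliest to latest vintage of the predictor.
noise_scales = [1.0, 0.6, 0.3, 0.1]
window = 100  # rolling estimation window (pseudo out-of-sample)

losses = []
for s in noise_scales:
    x_obs = x + rng.normal(scale=s, size=T)
    errs = []
    for t in range(window, T):
        # Rolling OLS of y on the noisy predictor, no intercept.
        xw, yw = x_obs[t - window:t], y[t - window:t]
        beta = np.dot(xw, yw) / np.dot(xw, xw)
        errs.append((y[t] - beta * x_obs[t]) ** 2)
    losses.append(np.array(errs))

# Monotonicity moments: d_j,t = loss_j,t - loss_{j+1},t should have a
# non-negative mean for every adjacent horizon pair j under the null of
# monotone improvement. A naive studentized statistic per inequality:
d = np.array([losses[j] - losses[j + 1] for j in range(len(losses) - 1)])
n = d.shape[1]
t_stats = d.mean(axis=1) / (d.std(axis=1, ddof=1) / np.sqrt(n))

print("mean loss per horizon:", [round(l.mean(), 3) for l in losses])
print("min t-stat (large negative values suggest a monotonicity failure):",
      round(t_stats.min(), 2))
```

In this toy setup the mean losses fall as the noise scale shrinks, so all loss-differential means are positive and no inequality is violated; replacing one of the later vintages with a noisier one would flip the sign of the corresponding differential and flag a monotonicity failure.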