Title: Forecasting evaluation in JDemetra+
Authors: David de Antonio Liedo - National Bank of Belgium (Belgium) [presenting]
Abstract: The new forecasting evaluation framework of the software JDemetra+ (JD+) is described, and its usefulness for decision making and diagnosis is demonstrated. Our algorithms are inspired by recent ideas on applying small-sample asymptotics to correct the well-known size distortions of popular tests such as the Diebold-Mariano test. Our example is motivated by the ESS Guidelines on Seasonal Adjustment, which suggest updating models, filters, outliers and regression parameters (henceforth, the specification) at regular time intervals. Because JDemetra+ currently offers no data-driven rules for deciding how, or how frequently, to update, users must set the precise updating policies themselves. We propose calculating out-of-sample forecasts over the last two years using recursive parameter estimation. If those forecasts improve on the ones that would have resulted from the new specification automatically proposed by JD+, we suggest discarding the update. Using thousands of series for the US and the EU, we provide an experimental overview of how often our decision rule would have advised updating the specification over the last five years, and of the resulting gains, measured as the decrease in the root mean squared forecast error (RMSE). We consider both the errors in forecasting unadjusted data and revision errors over a given time interval. The latter requires approximating the true adjusted data for the whole sample by averaging across all accepted methods.
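The decision rule described above can be sketched as follows. This is a minimal illustration, not the JD+ implementation: the function names are hypothetical, squared-error loss is assumed, and the small-sample correction shown is the Harvey-Leybourne-Newbold adjustment to the Diebold-Mariano statistic, one well-known way to mitigate the size distortions mentioned in the abstract.

```python
import numpy as np
from scipy import stats


def rmse(errors):
    """Root mean squared forecast error of a vector of forecast errors."""
    errors = np.asarray(errors, dtype=float)
    return np.sqrt(np.mean(errors ** 2))


def dm_test_hln(e_old, e_new, h=1):
    """Diebold-Mariano test of equal forecast accuracy under squared-error
    loss, with the Harvey-Leybourne-Newbold small-sample correction.

    e_old, e_new : forecast errors of the old and new specification
    h            : forecast horizon (autocovariances up to lag h-1 enter
                   the long-run variance)
    Returns (corrected statistic, two-sided p-value from a t(n-1) law).
    """
    e_old = np.asarray(e_old, dtype=float)
    e_new = np.asarray(e_new, dtype=float)
    d = e_old ** 2 - e_new ** 2                     # loss differential
    n = d.size
    d_bar = d.mean()
    # long-run variance estimate of the mean loss differential
    gamma = [np.mean((d[k:] - d_bar) * (d[: n - k] - d_bar)) for k in range(h)]
    lrv = gamma[0] + 2.0 * sum(gamma[1:])
    dm = d_bar / np.sqrt(lrv / n)
    # HLN correction factor and Student-t reference distribution
    correction = np.sqrt((n + 1 - 2 * h + h * (h - 1) / n) / n)
    stat = correction * dm
    p_value = 2.0 * stats.t.sf(abs(stat), df=n - 1)
    return stat, p_value


def keep_old_specification(e_old, e_new):
    """Decision rule from the abstract: discard the automatically proposed
    update when the old specification forecasts at least as well
    out of sample (no larger RMSE over the evaluation window)."""
    return rmse(e_old) <= rmse(e_new)
```

In practice the error vectors would come from recursive re-estimation over the last two years of data, one pair of one-step-ahead errors per origin; the DM p-value can then qualify the RMSE comparison before a specification update is accepted or discarded.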