B0367
Title: Forecaster's dilemma: Extreme events and forecast evaluation
Authors: Sebastian Lerch - Karlsruhe Institute of Technology (Germany) [presenting]
Thordis Thorarinsdottir - Norwegian Computing Center (Norway)
Francesco Ravazzolo - Free University of Bozen-Bolzano (Italy)
Tilmann Gneiting - Heidelberg Institute for Theoretical Studies (Germany)
Abstract: In public discussions of the quality of forecasts, attention typically focuses on the predictive performance in cases of extreme events. However, the restriction of conventional forecast evaluation methods to subsets of extreme observations has unexpected and undesired effects, and is bound to discredit skillful forecasts when the signal-to-noise ratio in the data-generating process is low. Conditioning on outcomes is incompatible with the theoretical assumptions of established forecast evaluation methods, thereby confronting forecasters with what we refer to as the forecaster's dilemma. For probabilistic forecasts, proper weighted scoring rules have been proposed as decision-theoretically justifiable alternatives for forecast evaluation with an emphasis on extreme events. Using theoretical arguments, simulation experiments and a real-data study on probabilistic forecasts of U.S. inflation and gross domestic product (GDP) growth, we illustrate and discuss the forecaster's dilemma along with potential remedies.
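As an illustration of the class of proper weighted scoring rules referred to above (a sketch, not necessarily the exact score used in the study), the threshold-weighted continuous ranked probability score of a predictive CDF $F$ for an outcome $y$ can be written as
$$
\mathrm{twCRPS}(F, y) = \int_{-\infty}^{\infty} \big(F(z) - \mathbf{1}\{y \le z\}\big)^2 \, w(z)\, \mathrm{d}z,
\qquad w(z) = \mathbf{1}\{z \ge r\},
$$
where the weight function $w$ emphasizes outcomes exceeding a threshold $r$ of interest; with $w \equiv 1$ the score reduces to the ordinary CRPS.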