Forecaster's Dilemma: Extreme Events and Forecast Evaluation
In public discussions of the quality of forecasts, attention typically focuses on the predictive performance in cases of extreme events. However, the restriction of conventional forecast evaluation methods to subsets of extreme observations has unexpected and undesired effects, and is bound to discredit skillful forecasts when the signal-to-noise ratio in the data generating process is low. Conditioning on outcomes is incompatible with the theoretical assumptions of established forecast evaluation methods, thereby confronting forecasters with what we refer to as the forecaster’s dilemma. For probabilistic forecasts, proper weighted scoring rules have been proposed as decision-theoretically justifiable alternatives for forecast evaluation with an emphasis on extreme events. Using theoretical arguments, simulation experiments and a real data study on probabilistic forecasts of U.S. inflation and gross domestic product (GDP) growth, we illustrate and discuss the forecaster’s dilemma along with potential remedies.
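One family of weighted scoring rules mentioned above is the threshold-weighted continuous ranked probability score (twCRPS), which emphasizes performance above a threshold of interest without conditioning on outcomes. The sketch below is a sample-based estimator, assuming the weight function w(z) = 1{z >= r} and using the transformation v(z) = max(z, r), which reduces the twCRPS to the ordinary CRPS of the transformed values; the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def tw_crps(samples, y, r=-np.inf):
    """Sample-based threshold-weighted CRPS with weight w(z) = 1{z >= r}.

    Applying the chaining function v(z) = max(z, r) to forecast samples
    and the observation reduces the twCRPS to the ordinary CRPS of the
    transformed values. With r = -inf this is the plain CRPS estimator.
    """
    x = np.maximum(np.asarray(samples, dtype=float), r)
    yv = max(float(y), r)
    # Energy-form estimator: E|X - y| - 0.5 * E|X - X'|
    term1 = np.mean(np.abs(x - yv))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# Illustrative usage: score a standard-normal ensemble forecast,
# putting all weight on the right tail above r = 2.
rng = np.random.default_rng(0)
ensemble = rng.normal(size=1000)
score = tw_crps(ensemble, y=2.5, r=2.0)
```

Because the weight enters through the scoring rule rather than through a restriction of the evaluation set, the resulting score remains proper, which is the decision-theoretic property the forecaster's dilemma undermines when one conditions on extreme outcomes instead.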
Related items (by title, author, creator and subject):
Kascha C; Ravazzolo F (2010). In this paper, we empirically evaluate competing approaches for combining inflation density forecasts in terms of Kullback-Leibler divergence. In particular, we apply a similar suite of models to four different datasets ...
Aastveit KA; Foroni C; Ravazzolo F (2016). In this paper we propose a parametric block wild bootstrap approach to compute density forecasts for various types of mixed-data sampling (MIDAS) regressions. First, Monte Carlo simulations show that predictive densities ...
Bédard, J; Coulombe, D; Courteau, L (2016). This paper provides empirical evidence of the impact of the voluntary disclosure of management earnings forecasts in IPO prospectuses and of the credibility of these forecasts, as perceived by investors at the time of the ...