Conditionally Optimal Weights and Forward-Looking Approaches to Combining Forecasts
Andrey Vasnev (University of Sydney)
In applied forecasting, there is a trade-off between in-sample fit and out-of-sample forecast accuracy: parsimonious model specifications typically outperform richer specifications out of sample. Consequently, forecast errors often contain predictable information that is difficult to exploit within a single model.
However, we show how this predictable information can be exploited in forecast combinations. In this case, optimal combination weights should minimize conditional mean squared error, or a conditional loss function, rather than the unconditional variance as in the commonly used framework of Bates and Granger (1969).
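To illustrate the distinction, the following sketch contrasts the unconditional Bates–Granger (1969) weight, computed from full-sample error variances, with time-varying weights computed from rolling-window (conditional) variance estimates. This is a minimal illustration on simulated data, not the paper's estimator; the window length and the simulated error processes are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
# Hypothetical forecast errors from two competing models.
e1 = rng.normal(0, 1.0, T)
e2 = rng.normal(0, 1.5, T)

def optimal_weight(e1, e2):
    """Weight on forecast 1 minimizing combined error variance
    (Bates and Granger, 1969): w = (v2 - c) / (v1 + v2 - 2c)."""
    v1, v2 = e1.var(), e2.var()
    c = np.cov(e1, e2)[0, 1]
    return (v2 - c) / (v1 + v2 - 2 * c)

# Unconditional weight: one number for the whole sample.
w_uncond = optimal_weight(e1, e2)

def conditional_weights(e1, e2, window=60):
    """Time-varying weights from rolling-window variance estimates,
    a simple stand-in for conditioning on expected performance."""
    w = np.full(len(e1), np.nan)
    for t in range(window, len(e1)):
        w[t] = optimal_weight(e1[t - window:t], e2[t - window:t])
    return w

w_cond = conditional_weights(e1, e2)
```

Here the rolling weights adapt when the relative accuracy of the two models shifts over time, which is the situation in which conditioning can improve on a single fixed weight.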
We prove that our conditionally optimal weights lead to better forecast performance. They also motivate other forward-looking approaches to combining forecasts, in which the combination weights depend on expected model performance. We show that forward-looking approaches can robustly outperform the random walk benchmark and many commonly used forecast combination strategies, including equal weights, in real-time out-of-sample inflation forecasting exercises.