Accounting for Observational Uncertainty in Forecast Verification: An Information-Theoretical View on Forecasts, Observations, and Truth

An information-theoretical decomposition of the Kullback–Leibler divergence into uncertainty, reliability, and resolution was recently introduced. In this article, that decomposition is generalized to the case where the observation is uncertain. Along with a modified decomposition of the divergence score, a second measure, the cross-entropy score, is presented; it measures the estimated information loss with respect to the truth rather than relative to the uncertain observations. The difference between the two scores equals the average observational uncertainty and vanishes when observations are assumed to be perfect. Failing to account for observational uncertainty can lead to either overestimation or underestimation of forecast skill, depending on the nature of the noise process.
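The relation between the two scores described above follows from the standard identity H(o, f) = H(o) + D_KL(o || f): the cross-entropy score exceeds the divergence score by exactly the entropy of the observational distribution, which is zero for a perfect (point-mass) observation. A minimal numerical sketch, using hypothetical forecast and observation distributions (not data from the paper):

```python
import numpy as np

# Hypothetical discrete distributions over three categories (illustrative only):
# f -- forecast probabilities; o -- distribution representing an uncertain observation.
f = np.array([0.6, 0.3, 0.1])
o = np.array([0.8, 0.15, 0.05])

def entropy(p):
    """Shannon entropy H(p) in bits, ignoring zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) in bits."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def cross_entropy(p, q):
    """Cross entropy H(p, q) = H(p) + D_KL(p || q) in bits."""
    mask = p > 0
    return -np.sum(p[mask] * np.log2(q[mask]))

# Score relative to the (uncertain) observation vs. estimated loss w.r.t. the truth.
divergence_score = kl_divergence(o, f)
cross_entropy_score = cross_entropy(o, f)

# Their difference is the average observational uncertainty H(o) ...
assert np.isclose(cross_entropy_score - divergence_score, entropy(o))

# ... and it vanishes when the observation is assumed perfect (a point mass).
o_perfect = np.array([1.0, 0.0, 0.0])
assert np.isclose(cross_entropy(o_perfect, f), kl_divergence(o_perfect, f))
```

Here the assertions verify the additive identity for one example; the paper's decomposition refines these quantities further into uncertainty, reliability, and resolution components.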


Published in:
Monthly Weather Review, 139, 7, 2156-2162
Year:
2011
ISSN:
1520-0493

