AUTHOR: Frank Lad
SCORING ALTERNATIVE FORECAST DISTRIBUTIONS: COMPLETING THE KULLBACK DISTANCE COMPLEX
We develop two surprising new results regarding the use of proper scoring rules for evaluating the predictive quality of two alternative sequential forecast distributions. Both proponents prefer to be awarded a score derived from the other's distribution rather than a score awarded on the basis of their own. A Pareto optimal exchange of their scoring outcomes provides the basis for a comparison of forecast quality that is preferred by both forecasters, and that also evades a feature of arbitrariness inherent in using the forecasters' own achieved scores. The well-known Kullback divergence, used as a measure of information, is evaluated via the entropies in the two forecast distributions a…
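As a small illustration of the directed Kullback divergences at issue, the sketch below computes D(f||g) as cross-entropy minus entropy for two hypothetical discrete forecast distributions (the numerical forecasts are invented for illustration, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy -sum p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """Kullback divergence D(p||q) = cross_entropy(p, q) - H(p)."""
    return cross_entropy(p, q) - entropy(p)

f = [0.6, 0.3, 0.1]  # hypothetical forecaster 1
g = [0.2, 0.5, 0.3]  # hypothetical forecaster 2

# The two directed divergences are nonnegative and generally unequal.
print(kl(f, g), kl(g, f))
```

The asymmetry of the two directed divergences is one reason a "complete" complex of such measures, rather than a single number, is needed when comparing alternative forecasters.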
The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as "extropy"…
Predictive distributions that mimic frequencies over a restricted subdomain
A predictive distribution over a sequence of $$N+1$$ events is said to be “frequency mimicking” whenever the probability for the final event conditioned on the outcome of the first N events equals the relative frequency of successes among them. Exchangeable distributions that exhibit this feature universally are known to have several annoying concomitant properties. We motivate frequency mimicking assertions over a limited subdomain in practical problems of finite inference, and we identify their computable coherent implications. We provide some examples using reference distributions, and we introduce computational software to generate any complete specification desired. Theorems on reducti…
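One familiar route to frequency mimicking, sketched here purely for illustration (it is not necessarily the construction used in the paper), is the Haldane-type limit of a Beta-mixed Bernoulli sequence: under a Beta(a, b) mixture the standard conjugate predictive is (k + a)/(n + a + b), which approaches the relative frequency k/n as a, b → 0:

```python
def beta_mixture_predictive(k, n, a, b):
    """P(success on trial n+1 | k successes in n trials) under a
    Beta(a, b) mixture of Bernoulli trials (standard conjugate result)."""
    return (k + a) / (n + a + b)

n, k = 10, 7
for eps in (1.0, 0.1, 0.001):
    # As eps -> 0 the predictive approaches k/n = 0.7: frequency mimicking.
    print(eps, beta_mixture_predictive(k, n, eps, eps))
```

A proper exchangeable distribution can only approximate this limit, which is one source of the awkward concomitant properties of universal frequency mimicking noted above.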
Reassessing Accuracy Rates of Median Decisions
We show how Bruno de Finetti's fundamental theorem of prevision (FTP) has computable applications in statistical problems that involve only partial information. Specifically, we assess accuracy rates for median decision procedures used in the radiological diagnosis of asbestosis. Conditional exchangeability of individual radiologists' diagnoses is recognized as more appropriate than the independence that is commonly presumed. The FTP yields coherent bounds on probabilities of interest when available information is insufficient to determine a complete distribution. Further assertions that are natural to the problem motivate a partial ordering of conditional probabilities, extending the computation …
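The flavour of such coherent bounds can be sketched with a toy example (invented for illustration; the paper's radiological application is far richer): given only P(A) and P(B), the FTP constrains P(A and B) to an interval, recoverable here by scanning the one free atom probability:

```python
def ftp_bounds(pA, pB, step=1e-4):
    """Coherent bounds on P(A and B) given only P(A) and P(B), in the
    spirit of de Finetti's fundamental theorem of prevision: scan the
    free parameter t = P(A and B) and keep the values for which all
    four atom probabilities are nonnegative."""
    feasible = []
    for i in range(int(1 / step) + 1):
        t = i * step
        atoms = (t,                 # P(A and B)
                 pA - t,            # P(A and not B)
                 pB - t,            # P(not A and B)
                 1 - pA - pB + t)   # P(neither)
        if all(a >= -1e-12 for a in atoms):
            feasible.append(t)
    return min(feasible), max(feasible)

lo, hi = ftp_bounds(0.6, 0.7)
print(round(lo, 3), round(hi, 3))  # the classical Frechet bounds
```

In realistic problems the same idea is posed as a linear program over many more atoms, with each coherent assertion adding a linear constraint that narrows the bounds.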
Completing the logarithmic scoring rule for assessing probability distributions
We propose and motivate an expanded version of the logarithmic score for forecasting distributions, termed the Total Log score. It incorporates the usual logarithmic score, which is recognised as incomplete and has been mistakenly associated with the likelihood principle. The expectation of the Total Log score equals the Negentropy plus the Negextropy of the distribution. We examine both discrete and continuous forms of the scoring rule, and we discuss issues of scaling for scoring assessments. The analysis suggests the dual tracking of the quadratic score along with the usual log score when assessing the qualities of probability distributions. An application to the sequential scoring of f…
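The stated expectation identity can be checked numerically. The sketch below uses the discrete form of the Total Log score, log p_i plus the sum of log(1 - p_j) over the outcomes that did not obtain (the forecast distribution is invented for illustration):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the dual measure."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def total_log_score(p, i):
    """Total log score when outcome i obtains: the usual log score
    log p[i], completed by log(1 - p[j]) for each outcome that did not."""
    return math.log(p[i]) + sum(math.log(1 - pj)
                                for j, pj in enumerate(p) if j != i)

p = [0.5, 0.3, 0.2]  # illustrative forecast distribution
expected = sum(pi * total_log_score(p, i) for i, pi in enumerate(p))
# The expectation equals Negentropy plus Negextropy, -(H + J):
print(expected, -(entropy(p) + extropy(p)))
```

The identity follows by regrouping the expectation: the usual log-score terms contribute -H(p), and the completed terms contribute sum_j (1 - p_j) log(1 - p_j) = -J(p).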
Sequentially Forecasting Economic Indices Using Mixture Linear Combinations of EP Distributions
This article displays an application of the statistical method motivated by Bruno de Finetti's operational subjective theory of probability. We use exchangeable forecasting distributions based on mixtures of linear combinations of exponential power (EP) distributions to forecast the sequence of daily rates of return from the Dow-Jones index of stock prices over a 20 year period. The operational subjective statistical method for comparing distributions is quite different from that commonly used in data analysis, because it rejects the basic tenets underlying the practice of hypothesis testing. In its place, proper scoring rules for forecast distributions are used to assess the values o…
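For readers unfamiliar with the EP family, one common parameterisation (the generalized normal density) is sketched below; the paper's exact parameterisation and mixing scheme may differ:

```python
import math

def ep_density(x, mu=0.0, alpha=1.0, beta=2.0):
    """Exponential power (generalized normal) density in one common
    parameterisation: beta = 2 recovers a normal density, beta = 1 the
    Laplace; smaller beta gives the heavier tails useful for returns."""
    c = beta / (2 * alpha * math.gamma(1 / beta))
    return c * math.exp(-(abs(x - mu) / alpha) ** beta)

# At beta = 2, alpha = 1 the mode height is 1/sqrt(pi), as for a
# normal density with variance 1/2.
print(ep_density(0.0), 1 / math.sqrt(math.pi))
```

The shape parameter beta is what lets mixtures of EP components capture the peakedness and heavy tails typical of daily return sequences.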
Extropy: Complementary Dual of Entropy
This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
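The claimed properties are easy to verify numerically with the definitions H(p) = -sum p_i log p_i and J(p) = -sum (1 - p_i) log(1 - p_i) (the example distributions are invented for illustration):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the dual measure."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

binary = [0.3, 0.7]
# For a binary distribution the two measures are identical.
print(entropy(binary), extropy(binary))

ternary = [0.5, 0.3, 0.2]
# With more than two outcomes the measures bifurcate.
print(entropy(ternary), extropy(ternary))
```

Both measures depend on the mass function only through its multiset of values, so both are permutation invariant, and both are maximised by the uniform distribution.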