Search results for "extropy"
Showing 4 of 4 documents
Scoring Alternative Forecast Distributions: Completing the Kullback Distance Complex
2018
We develop two surprising new results regarding the use of proper scoring rules for evaluating the predictive quality of two alternative sequential forecast distributions. Both of the proponents prefer to be awarded a score derived from the other's distribution rather than a score awarded on the basis of their own. A Pareto optimal exchange of their scoring outcomes provides the basis for a comparison of forecast quality that is preferred by both forecasters, and also evades a feature of arbitrariness inherent in using the forecasters' own achieved scores. The well-known Kullback divergence, used as a measure of information, is evaluated via the entropies in the two forecast distributions a…
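As a rough numerical illustration of the quantities mentioned in this abstract (not the paper's own procedure), the following Python sketch computes the logarithmic scores two forecasters would receive under an exchange of distributions, together with the directed Kullback divergences between them; the distributions f and g are invented for the example.

import numpy as np

# Two hypothetical forecast distributions over the same finite outcome space.
f = np.array([0.5, 0.3, 0.2])
g = np.array([0.2, 0.5, 0.3])

def log_score(p, observed):
    # Logarithmic score awarded to distribution p when outcome `observed` occurs.
    return np.log(p[observed])

def kullback_divergence(p, q):
    # Directed Kullback divergence D(p || q) = sum_i p_i log(p_i / q_i).
    return np.sum(p * np.log(p / q))

# Scores each forecaster would receive from the other's distribution if outcome 0 occurs.
print(log_score(g, 0), log_score(f, 0))

# The two directed divergences; their sum is the symmetrised Kullback distance.
print(kullback_divergence(f, g), kullback_divergence(g, f))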
The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
2018
The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as noteworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”…
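For reference, the dual pair discussed here is standardly written (in the notation of the 2015 companion article listed below) as H(p) = −Σ_i p_i log p_i for the entropy and J(p) = −Σ_i (1 − p_i) log(1 − p_i) for its complementary dual, the extropy, for a probability mass function p over n outcomes.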
Completing the logarithmic scoring rule for assessing probability distributions
2012
We propose and motivate an expanded version of the logarithmic score for forecasting distributions, termed the Total Log score. It incorporates the usual logarithmic score, which is recognised as incomplete and has been mistakenly associated with the likelihood principle. The expectation of the Total Log score equals the Negentropy plus the Negextropy of the distribution. We examine both discrete and continuous forms of the scoring rule, and we discuss issues of scaling for scoring assessments. The analysis suggests the dual tracking of the quadratic score along with the usual log score when assessing the qualities of probability distributions. An application to the sequential scoring of f…
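A minimal numerical check of the stated expectation identity, assuming the Total Log score for an observed outcome j takes the form log p_j + Σ_{i≠j} log(1 − p_i); this form is consistent with the abstract but the paper itself should be consulted for the authors' exact definition. The forecast distribution p is invented for the example.

import numpy as np

p = np.array([0.5, 0.3, 0.2])  # hypothetical forecast distribution

def total_log_score(p, j):
    # Usual log score for the realised outcome j, plus log(1 - p_i) for the other outcomes.
    others = np.delete(p, j)
    return np.log(p[j]) + np.sum(np.log(1.0 - others))

# Expectation of the Total Log score under p.
expected_score = sum(p[j] * total_log_score(p, j) for j in range(len(p)))

negentropy = np.sum(p * np.log(p))            # -H(p)
negextropy = np.sum((1 - p) * np.log(1 - p))  # -J(p)

# The expectation equals the negentropy plus the negextropy of the distribution.
print(np.isclose(expected_score, negentropy + negextropy))  # True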
Extropy: Complementary Dual of Entropy
2015
This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
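A short Python sketch verifying two of the claims in this abstract under the standard definitions given above: entropy and extropy coincide for a binary distribution, and for more than two outcomes the uniform distribution maximises extropy. The example distributions are invented.

import numpy as np

def entropy(p):
    return -np.sum(p * np.log(p))

def extropy(p):
    return -np.sum((1 - p) * np.log(1 - p))

# Binary case: entropy and extropy are identical.
q = np.array([0.3, 0.7])
print(np.isclose(entropy(q), extropy(q)))  # True

# With three outcomes the two measures differ,
# but the uniform distribution still gives the maximum extropy.
r = np.array([0.6, 0.3, 0.1])
u = np.full(3, 1/3)
print(entropy(r), extropy(r))      # distinct values
print(extropy(r) < extropy(u))     # True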