Search results for "extropy"

Showing 4 of 4 documents

Scoring Alternative Forecast Distributions: Completing the Kullback Distance Complex

2018

We develop two surprising new results regarding the use of proper scoring rules for evaluating the predictive quality of two alternative sequential forecast distributions. Both of the proponents prefer to be awarded a score derived from the other's distribution rather than a score awarded on the basis of their own. A Pareto optimal exchange of their scoring outcomes provides the basis for a comparison of forecast quality that is preferred by both forecasters, and also evades a feature of arbitrariness inherent in using the forecasters' own achieved scores. The well-known Kullback divergence, used as a measure of information, is evaluated via the entropies in the two forecast distributions a…

Settore MAT/06 - Probabilità e Statistica Matematica; Probability (math.PR); Mathematics - Statistics Theory; Statistics Theory (math.ST); Pareto optimal exchange; total logarithmic scoring rule; Kullback symmetric divergence; prevision; entropy/extropy; Settore SECS-S/06 - Metodi Mat. dell'Economia e d. Scienze Attuariali e Finanz.; FOS: Mathematics; Mathematics - Probability; cross entropy; Bregman divergence
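
As a pointer to the quantities this abstract refers to, here is a minimal numeric sketch (not code from the paper) of the standard identity that the directed Kullback divergence equals cross entropy minus entropy, together with the symmetric Kullback divergence obtained by summing the two directions; the forecast distributions f and g are made-up illustrations.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
        p = np.asarray(p, dtype=float)
        return -np.sum(p * np.log(p))

    def cross_entropy(p, q):
        """Cross entropy H(p, q) = -sum p_i log q_i."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return -np.sum(p * np.log(q))

    def kl_divergence(p, q):
        """Directed Kullback divergence D(p || q) = H(p, q) - H(p)."""
        return cross_entropy(p, q) - entropy(p)

    # Two hypothetical forecast distributions over the same three outcomes.
    f = [0.6, 0.3, 0.1]
    g = [0.2, 0.5, 0.3]

    print(kl_divergence(f, g))                        # D(f || g)
    print(kl_divergence(g, f))                        # D(g || f)
    print(kl_divergence(f, g) + kl_divergence(g, f))  # symmetric Kullback divergence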

The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex

2018

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”…

Kullback–Leibler divergence; Settore MAT/06 - Probabilità e Statistica Matematica; logarithm; General Physics and Astronomy; astrophysics; engineering and technology; Bregman divergence; mathematical proof; information theory; natural sciences; statistics & probability; Fermi–Dirac entropy; Kullback symmetric divergence; electrical engineering, electronic engineering, information engineering; entropy (information theory); mathematics; relative entropy/extropy; axiom; networking & telecommunications; total logarithmic scoring rule; probability distribution; duality; Pareto optimal exchange; prevision; extropy; Settore SECS-S/01 - Statistica; entropy; mathematical economics; physics
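
A brief sketch of the entropy/extropy pair discussed in this abstract, using the definitions given in the companion 2015 article listed below (H(p) = -Σ p_i log p_i and J(p) = -Σ (1 - p_i) log(1 - p_i)); the example distributions are hypothetical, and the binary case shows the two measures coinciding.

    import numpy as np

    def entropy(p):
        """H(p) = -sum p_i log p_i."""
        p = np.asarray(p, dtype=float)
        return -np.sum(p * np.log(p))

    def extropy(p):
        """J(p) = -sum (1 - p_i) log(1 - p_i), the complementary dual of entropy."""
        p = np.asarray(p, dtype=float)
        return -np.sum((1 - p) * np.log(1 - p))

    binary = [0.3, 0.7]        # an event indicator: entropy and extropy agree
    ternary = [0.2, 0.3, 0.5]  # more than two outcomes: the two measures split

    print(entropy(binary), extropy(binary))
    print(entropy(ternary), extropy(ternary))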

Completing the logarithmic scoring rule for assessing probability distributions

2012

We propose and motivate an expanded version of the logarithmic score for forecasting distributions, termed the Total Log score. It incorporates the usual logarithmic score, which is recognised as incomplete and has been mistakenly associated with the likelihood principle. The expectation of the Total Log score equals the Negentropy plus the Negextropy of the distribution. We examine both discrete and continuous forms of the scoring rule, and we discuss issues of scaling for scoring assessments. The analysis suggests the dual tracking of the quadratic score along with the usual log score when assessing the qualities of probability distributions. An application to the sequential scoring of f…

Settore MAT/06 - Probabilità e Statistica Matematica; logarithm; scoring rule; Dow-Jones stock index; score; likelihood principle; total log score; logarithmic score; probability theory; statistics; proper scoring rule; econometrics; entropy (information theory); probability distribution; negentropy; extropy; entropy; Settore SECS-S/01 - Statistica; mathematics; AIP Conference Proceedings
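
A small sketch of the expectation identity stated in the abstract, assuming the Total Log score awards log p_j for the realised outcome j plus log(1 - p_i) for each unrealised outcome i (an assumed form, consistent with the stated result that its expectation equals the Negentropy plus the Negextropy); the forecast distribution is illustrative.

    import numpy as np

    def total_log_score(p, j):
        # Assumed Total Log score: the usual log score for the realised outcome j
        # plus log(1 - p_i) for every unrealised outcome i.
        p = np.asarray(p, dtype=float)
        others = np.delete(p, j)
        return np.log(p[j]) + np.sum(np.log(1.0 - others))

    def negentropy(p):
        p = np.asarray(p, dtype=float)
        return np.sum(p * np.log(p))

    def negextropy(p):
        p = np.asarray(p, dtype=float)
        return np.sum((1 - p) * np.log(1 - p))

    p = np.array([0.5, 0.3, 0.2])  # an illustrative forecast distribution
    expected_score = sum(p[j] * total_log_score(p, j) for j in range(len(p)))
    print(expected_score, negentropy(p) + negextropy(p))  # the two values agree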

Extropy: Complementary Dual of Entropy

2015

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…

Bregman divergence; FOS: Computer and information sciences; Statistics and Probability; Settore MAT/06 - Probabilità e Statistica Matematica; Kullback–Leibler divergence; Computer Science - Information Theory; General Mathematics; FOS: Physical sciences; binary number; Mathematics - Statistics Theory; Statistics Theory (math.ST); proper scoring rules; Gini index of heterogeneity; differential entropy; binary entropy function; FOS: Mathematics; entropy (information theory); statistical physics; dual function; axiom; mathematics; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; differential and relative entropy/extropy; Information Theory (cs.IT); Probability (math.PR); repeat rate; Physics - Data Analysis, Statistics and Probability; duality; Statistics, Probability and Uncertainty; Settore SECS-S/01 - Statistica; Mathematics - Probability; Data Analysis, Statistics and Probability (physics.data-an); Statistical Science
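
A quick numeric check (not from the article) of two properties claimed in the abstract, using the extropy definition J(p) = -Σ (1 - p_i) log(1 - p_i): the uniform distribution maximises extropy, and the measure is invariant under permutations of the mass function; the comparison distribution is made up.

    import numpy as np

    def extropy(p):
        # J(p) = -sum (1 - p_i) log(1 - p_i), the complementary dual of entropy.
        p = np.asarray(p, dtype=float)
        return -np.sum((1 - p) * np.log(1 - p))

    uniform = np.array([1/3, 1/3, 1/3])
    skewed = np.array([0.6, 0.3, 0.1])

    print(extropy(uniform) > extropy(skewed))                   # True: uniform maximises extropy
    print(np.isclose(extropy(skewed), extropy(skewed[::-1])))   # True: permutation invariance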