The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
Giuseppe Sanfilippo, Gianna Agrò, Frank Lad

Subject: Kullback–Leibler divergence; Settore MAT/06 - Probabilita' E Statistica Matematica; Logarithm; General Physics and Astronomy; lcsh:Astrophysics; 02 engineering and technology; Bregman divergence; Mathematical proof; Information theory; 01 natural sciences; Article; 010104 statistics & probability; Fermi–Dirac entropy; Kullback symmetric divergence; lcsh:QB460-466; 0202 electrical engineering, electronic engineering, information engineering; Entropy (information theory); 0101 mathematics; lcsh:Science; relative entropy/extropy; Axiom; Mathematics; 020206 networking & telecommunications; lcsh:QC1-999; total logarithmic scoring rule; Probability distribution; duality; Pareto optimal exchange; lcsh:Q; prevision; extropy; Settore SECS-S/01 - Statistica; entropy; Mathematical economics; lcsh:Physics

Description:
The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as "extropy".
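The duality named in the description can be illustrated numerically. As a hedged sketch (assuming the extropy definition from the authors' earlier work, J(p) = -Σ (1 - p_i) ln(1 - p_i), which complements Shannon entropy H(p) = -Σ p_i ln p_i):

```python
import math

def entropy(p):
    # Shannon entropy H(p) = -sum p_i * ln(p_i), skipping zero-mass points.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    # Extropy J(p) = -sum (1 - p_i) * ln(1 - p_i), the dual (complementary)
    # measure; terms with p_i = 1 contribute zero and are skipped.
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

p = [0.2, 0.3, 0.5]
print(f"H(p) = {entropy(p):.4f}, J(p) = {extropy(p):.4f}")
```

For a two-point distribution the two measures coincide (each 1 - p_i is just the other probability), which is one way to see their complementary character; for larger supports they diverge.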
year | journal | country | edition | language
---|---|---|---|---
2018-08-01 | Entropy | | |