Information potential for some probability density functions
2021
Abstract This paper is related to the information theoretic learning methodology, whose goal is to quantify global scalar descriptors (e.g., entropy) of a given probability density function (PDF). In this context, the core concept is the information potential (IP) $S_{[s]}(x) := \int_{\mathbb{R}} p^{s}(t, x)\, dt$, $s > 0$, of a PDF $p(t, x)$ depending on a parameter $x$; it is naturally related to the Rényi and Tsallis entropies. We present several such PDFs, viewed also as kernels of integral operators, for which a precise relation exists between $S_{[2]}(x)$ and the variance $\mathrm{Var}[p(t, x)]$. For these PDFs we determine explicitly the IP and the Shannon entropy. As an application to Information Theoretic Learning w…
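For concreteness, the following is a minimal numerical sketch of the order-2 information potential for a Gaussian density (a standard illustrative example, not one of the specific kernels treated in the paper). It checks the closed form $S_{[2]} = 1/(2\sqrt{\pi\,\mathrm{Var}})$ for a Gaussian and recovers the order-2 Rényi entropy as $H_2 = -\log S_{[2]}$:

```python
# Sketch only: compute S_[2] = \int p(t)^2 dt numerically for a Gaussian PDF
# and compare with the Gaussian closed form 1 / (2 * sigma * sqrt(pi)),
# which illustrates the kind of IP-variance relation the abstract refers to.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 1.5                                  # standard deviation; Var = sigma**2
pdf = lambda t: norm.pdf(t, loc=0.0, scale=sigma)

# Order-2 information potential: integral of the squared density over the real line.
S2_numeric, _ = quad(lambda t: pdf(t) ** 2, -np.inf, np.inf)

# Closed form for a Gaussian: S_[2] = 1 / (2 * sigma * sqrt(pi)).
S2_exact = 1.0 / (2.0 * sigma * np.sqrt(np.pi))

# Rényi entropy of order 2 is minus the logarithm of the information potential.
H2 = -np.log(S2_numeric)

print(f"S_[2] numeric = {S2_numeric:.6f}, exact = {S2_exact:.6f}, H_2 = {H2:.4f}")
```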