What qualifies as a satisfactory extension of Shannon entropy to continuous variables is somewhat subjective. Which extension is desirable depends largely on what one wishes to measure.
Shannon defines a continuous entropy [1] but clarifies
“In the discrete case the entropy measures in an absolute way the randomness of the chance variable. In the continuous case the measurement is relative to the coordinate system.”
The continuous entropy defined by Shannon can be negative. The type of extension considered in this document is one that measures entropy “in an absolute way”.
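For example, the continuous (differential) entropy of a uniform distribution on $[0, a]$ is
$$h(X) = -\int_0^a \frac{1}{a} \log\frac{1}{a}\, dx = \log a,$$
which is negative whenever $a < 1$.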
Statistical Variance
Information is also a measure of uncertainty. A common, if not the most common, measure of uncertainty for continuous variables is statistical variance. Like entropy, variance has the property that the measure of a combination of two independent sources is the sum of the measures of the respective sources. More formally, for any two independent random variables $X$ and $Y$, $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$. A natural preference for an entropy extension, then, is that it match variance for continuous variables. This document considers the class of entropy extensions with that property.
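As a quick numerical check of this additivity, here is a sketch (the particular distributions and NumPy usage are illustrative, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent sources with different distributions.
x = rng.normal(loc=0.0, scale=2.0, size=n)   # Var(X) = 4
y = rng.exponential(scale=3.0, size=n)       # Var(Y) = 9

# For independent X and Y, Var(X + Y) = Var(X) + Var(Y).
print(x.var(), y.var(), (x + y).var())       # ~4, ~9, ~13
```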
Mutual Information
A measurement derived from entropy is $H(X) - H(X \mid Y)$, which Shannon refers to as the actual rate of information transmission [1] (where $X$ and $Y$ are the start and end of a noisy communication channel). More recent authors refer to this quantity as mutual information, $I(X; Y)$ [2]. One of many interpretations is that $I(X; Y)$ measures the amount of information about $X$ provided by $Y$.
A notable property of mutual information (for two variables) is that it is zero if and only if the two random variables are independent.
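A small sketch of this property for finite random objects (the joint probability tables below are illustrative assumptions):

```python
import numpy as np

def mutual_information(joint):
    """I(X; Y) in bits for a joint probability table joint[x, y]."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (px * py)[mask])).sum())

# Independent X and Y: the joint factors into the product of its marginals.
independent = np.outer([0.5, 0.5], [0.25, 0.75])
print(mutual_information(independent))  # 0.0

# Perfectly correlated X and Y: I(X; Y) equals the entropy of X (1 bit).
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])
print(mutual_information(correlated))   # 1.0
```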
Variance Explained
Variance explained is another measurement which also captures a sense of how much information one variable provides about another. Formally, given random variables $X$ and $Y$, the variance of $X$ explained by $Y$ is $\operatorname{Var}(\mathbb{E}[X \mid Y])$, using the definition of conditional expectation [3] under which $\mathbb{E}[X \mid Y]$ is itself a random variable.
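This quantity is one term in the law of total variance, which splits the variance of $X$ into the part explained by $Y$ and the remaining unexplained part:
$$\operatorname{Var}(X) = \operatorname{Var}(\mathbb{E}[X \mid Y]) + \mathbb{E}[\operatorname{Var}(X \mid Y)].$$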
The variance explained by an independent variable is zero. However, unlike mutual information, zero variance explained does not imply independence. Consider a random variable $X$ that takes the values $-1$, $0$, $1$ with equal probability, and let $Y = X^2$. The random variable $Y$ explains zero variance, since $\mathbb{E}[X \mid Y]$ is identically zero, but $X$ and $Y$ are not independent. Intuitively, $Y$ does provide some information about $X$: it informs whether $X$ is zero or not. In this sense, something analogous to mutual information is a more appropriate measure of how much information $Y$ provides about $X$.
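A short sketch of this counterexample (assuming the values $-1$, $0$, $1$ and $Y = X^2$ as above; the computation is a plain enumeration):

```python
from math import log2

# X takes the values -1, 0, 1 with equal probability; Y = X ** 2.
xs = [-1, 0, 1]
px = 1 / 3

# Conditional expectation E[X | Y]: average the x values sharing each y value.
groups = {}
for x in xs:
    groups.setdefault(x ** 2, []).append(x)
cond_exp = {y: sum(g) / len(g) for y, g in groups.items()}
print(cond_exp)  # {1: 0.0, 0: 0.0} -> E[X | Y] is identically zero

# Variance explained, Var(E[X | Y]), is therefore zero.
values = [cond_exp[x ** 2] for x in xs]
mean = sum(values) / len(values)
print(sum((v - mean) ** 2 for v in values) / len(values))  # 0.0

# Yet I(X; Y) > 0, so X and Y are not independent: Y tells us whether X is 0.
py = {y: len(g) / 3 for y, g in groups.items()}
print(sum(px * log2(px / (px * py[x ** 2])) for x in xs))  # ~0.918 bits
```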
Random Objects vs Variables
A random object is a function whose domain is a probability space [3]. When the function's values are real numbers, it is a random variable (or real random object). A finite random object is a random object that takes on finitely many values; in other words, its range is a set of finite size.
Both entropy and variance are functions of random objects. In the case of Shannon entropy, the random object is finite (with values often referred to as symbols). In the case of variance, the random object is a random variable (possibly a vector of real numbers in $\mathbb{R}^n$). The distances between values of a finite random variable affect variance, but not Shannon entropy.
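A small sketch of this last point (the value sets and probabilities are illustrative): two finite random variables with the same probabilities but differently spaced values have identical Shannon entropy yet different variances.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits; depends only on the probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

def variance(values, probs):
    """Statistical variance; depends on where the values sit on the real line."""
    mean = sum(v * p for v, p in zip(values, probs))
    return sum(p * (v - mean) ** 2 for v, p in zip(values, probs))

probs = [0.5, 0.5]
close = [0.0, 1.0]    # values 1 apart
spread = [0.0, 10.0]  # same probabilities, values 10 apart

print(shannon_entropy(probs))  # 1.0 bit in both cases
print(variance(close, probs))  # 0.25
print(variance(spread, probs)) # 25.0
```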
Desirable Extension Properties
Some random variables are also finite random objects, but their variance and Shannon entropy are not necessarily equal. Any extension must therefore take some extra input, beyond just a random object, to determine whether the output is statistical variance or Shannon entropy. Let $\hat{I}$ denote a desirable extension, with $\hat{I}_H$ and $\hat{I}_V$ denoting the cases where this extra input selects Shannon entropy and statistical variance, respectively.
Extending mutual information to continuous variables is a desirable property. Since entropy is equal to the mutual information between a variable and itself, $H(X) = I(X; X)$, describing an extension of mutual information will also describe an extension of entropy; accordingly, $\hat{I}$ is written below as a function of two random objects.
For finite random objects, the Shannon entropy case should satisfy
$$\hat{I}_H(X; X) = H(X).$$
Similarly, for real random objects (random variables), the variance case should satisfy
$$\hat{I}_V(X; X) = \operatorname{Var}(X).$$
Lastly, the notable property to satisfy is mutual information extended to continuous variables:
$$\hat{I}(X; Y) = 0 \quad \text{if and only if} \quad X \text{ and } Y \text{ are independent.}$$
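As a quick check of the identity $H(X) = I(X; X)$ used above (a plain-Python sketch; the probabilities are illustrative assumptions):

```python
from math import log2

# A finite random object X with illustrative probabilities.
px = [0.5, 0.25, 0.25]

# Shannon entropy H(X).
h = -sum(p * log2(p) for p in px)

# Mutual information of X with itself: the joint p(x, x') is diagonal,
# so I(X; X) = sum_x p(x) * log2(p(x) / (p(x) * p(x))) = H(X).
i_self = sum(p * log2(p / (p * p)) for p in px)

print(h, i_self)  # both 1.5 bits
```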