June 03, 2017
An example of a random variable with finite Fisher information but infinite entropy
The following notes concern some calculations I have made relating to Zanella et al (2017). The considerations laid out below are entirely trivial, but they help to clarify whether or not certain conditions are logically independent of each other.
Consider a random variable X with probability density f(x) = e^φ(x).
- The entropy of X is given by H = - ∫ log(f(x)) f(x) dx = - ∫ φ(x) f(x) dx = - E[ φ(X) ];
- One is also interested in something loosely related to Fisher information, namely Ι = ∫ φ'(x)^2 f(x) dx = E[ φ'(X)^2 ] (a small numerical sketch of both quantities is given below).
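As a concrete illustration of these two functionals, here is a minimal numerical sketch of my own (not part of the original calculation): it approximates H = - E[φ(X)] and Ι = E[φ'(X)^2] by quadrature, using the standard normal density as a sanity check, since its entropy is 0.5 log(2πe) ≈ 1.4189 and its value of E[φ'(X)^2] = E[X^2] is 1.

```python
# A minimal numerical sketch (my addition, not from the original note): given a
# density f(x) = e^phi(x), approximate H = -E[phi(X)] and I = E[phi'(X)^2] by
# quadrature.  The standard normal is used as a sanity check: its entropy is
# 0.5*log(2*pi*e) ~ 1.4189 and E[phi'(X)^2] = E[X^2] = 1.
import numpy as np
from scipy.integrate import quad

def entropy_and_info(phi, dphi, lo=-np.inf, hi=np.inf):
    """Return (H, I) for the density f = exp(phi) by numerical integration."""
    f = lambda x: np.exp(phi(x))
    H = -quad(lambda x: phi(x) * f(x), lo, hi)[0]
    I = quad(lambda x: dphi(x) ** 2 * f(x), lo, hi)[0]
    return H, I

# Standard normal: phi(x) = -x^2/2 - 0.5*log(2*pi), so phi'(x) = -x.
phi = lambda x: -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)
dphi = lambda x: -x
print(entropy_and_info(phi, dphi))   # approximately (1.4189, 1.0)
```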
Question:
Is it possible for Ι to be finite while H is infinite?
Answer:
Yes.
Consider a density f(x) which is proportional (for large positive x) to 1/(x log(x)^2). Consequently φ(x) = constant - (log x + 2 log log x) for large x.
1: Using a change of variable (e^u = x, so that dx = e^u du and log x = u) it can be shown that H = - ∫ log(f(x)) f(x) dx is infinite. The contribution to the entropy H for large x is given by - ∫^∞ log(f(x)) f(x) dx, controlled by
∫^∞ (log x + 2 log log x) dx /(x log(x)^2) = ∫^∞ (u + 2 log u) e^u du /(u^2 e^u) ≥ ∫^∞ du / u = ∞.
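This divergence can also be seen numerically; the following sketch (my addition, assuming the tail form above) integrates (u + 2 log u)/u^2 in the variable u = log x over longer and longer ranges and shows the running total growing without bound, roughly like the logarithm of the cutoff.

```python
# A numerical look at the divergence (my addition): in the variable u = log(x)
# the entropy tail reduces to the integral of (u + 2 log u)/u^2, taken here from
# u = 1 upwards.  The running total keeps growing roughly like log(cutoff),
# consistent with H being infinite.
import math
from scipy.integrate import quad

integrand = lambda u: (u + 2 * math.log(u)) / u ** 2

total, upper = 0.0, 1.0
for _ in range(12):                         # integrate decade by decade up to 1e12
    piece, _ = quad(integrand, upper, 10 * upper)
    total += piece
    upper *= 10
    print(f"up to u = 1e{round(math.log10(upper))}: running integral ~ {total:.2f}")
```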
2: On the other hand, elementary bounds show that Ι = ∫ φ'(x)^2 f(x) dx can be finite. The contribution to the "Fisher information" Ι for large x is given by ∫^∞ φ'(x)^2 f(x) dx, which (since φ'(x) = -(1/x + 2/(x log x)) for large x) is related to
∫^∞ (1/x + 2/(x log x))^2 dx /(x log(x)^2) = ∫^∞ (1 + 2/log x)^2 dx /(x^3 log(x)^2) < constant × ∫^∞ dx / x^3 < ∞.
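The finiteness of this tail integral can likewise be checked by quadrature; the following fragment (again my own sketch, using the integrand displayed above) integrates from e to infinity and returns a finite value.

```python
# A corresponding check for the "Fisher information" tail (my addition): the
# integrand (1 + 2/log x)^2 / (x^3 (log x)^2) is integrated from e to infinity
# and yields a finite value, as the bound above predicts.
import math
from scipy.integrate import quad

integrand = lambda x: (1 + 2 / math.log(x)) ** 2 / (x ** 3 * math.log(x) ** 2)

value, err = quad(integrand, math.e, math.inf)
print(f"tail integral ~ {value:.4f} (estimated quadrature error {err:.1e})")
```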
An example of such a density (behaving in the required manner at ±∞) is
f(x) = log(2) |x| / ((2+x^2) (log(2+x^2))^2).
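As a quick check (my own addition, not in the original note) that this f really is a probability density: substituting t = 2 + x^2 gives total mass log(2) ∫_2^∞ dt/(t (log t)^2) = log(2)/log(2) = 1, and the sketch below confirms this numerically by combining quadrature over |x| ≤ 10^6 with the exact mass log(2)/log(2 + 10^12) lying beyond that range.

```python
# Sanity check of the normalisation (my addition): substituting t = 2 + x^2
# shows the total mass is log(2) * int_2^inf dt/(t (log t)^2) = 1.  Here the
# mass over |x| <= 1e6 is computed by quadrature, decade by decade to keep the
# adaptive rule happy, and combined with the exact tail mass
# log(2)/log(2 + 1e12) beyond that range.
import math
from scipy.integrate import quad

f = lambda x: math.log(2) * abs(x) / ((2 + x ** 2) * math.log(2 + x ** 2) ** 2)

pieces = [quad(f, 0.0, 1.0)[0]]
for k in range(6):                          # cover [1, 1e6] in decades
    pieces.append(quad(f, 10.0 ** k, 10.0 ** (k + 1))[0])
mass_core = 2 * sum(pieces)                 # density is symmetric about 0
mass_tail = math.log(2) / math.log(2 + 1e12)
print(f"{mass_core:.4f} + {mass_tail:.4f} = {mass_core + mass_tail:.4f}")   # ~ 1.0000
```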
Reference
Zanella, G., Bédard, M., & Kendall, W. S. (2017). A Dirichlet form approach to MCMC optimal scaling. Stochastic Processes and their Applications, to appear, 22pp. http://doi.org/10.1016/j.spa.2017.03.021