Quiz 2: Information: Theory and Geometry

by Frank NIELSEN
This is the second installment of the quiz series.
Check out the first quiz before attempting this one.
(As usual, comments and suggestions are welcome! :-) )


1. There are various notions of information, such as Fisher, Kolmogorov (complexity), and Shannon information. Historically, which came first?
a) Kolmogorov information
b) Shannon information
c) Fisher information
2. When and by whom was the term entropy first coined?
a) Rudolf Clausius in 1850
b) Claude Shannon in 1948
c) Ludwig Boltzmann in 1877
d) Josiah Gibbs in 1882

3. What is another name for the cross-entropy? (A hint follows the options.)
a) inaccuracy
b) affinity
c) joint entropy
d) conditional entropy
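
Hint: recall that the cross-entropy of q relative to p is H(p, q) = -sum_i p_i log q_i = H(p) + KL(p || q). A minimal Python sketch (the distributions p and q below are arbitrary illustrative values, not from the quiz):

    import numpy as np

    def cross_entropy(p, q):
        # H(p, q) = -sum_i p_i log q_i (in nats); equals H(p) + KL(p || q)
        return -np.sum(np.asarray(p) * np.log(np.asarray(q)))

    p = [0.5, 0.3, 0.2]   # illustrative distributions
    q = [0.4, 0.4, 0.2]
    print(cross_entropy(p, q))   # >= H(p), with equality iff p == q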

4. Is mutual information (MI) a metric distance? (A numerical hint follows the options.)
a) Yes, sure!
b) No, no way!
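
Hint: a metric distance must satisfy d(x, x) = 0 for all x. A minimal Python check of I(X; X) for a uniform binary X (an illustrative choice):

    import numpy as np

    def mutual_information(pxy):
        # I(X;Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ), in nats
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        mask = pxy > 0
        return np.sum(pxy[mask] * np.log((pxy / (px * py))[mask]))

    # X = Y uniform on {0, 1}: the joint distribution is diagonal
    pxy = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
    print(mutual_information(pxy))  # = H(X) = log 2, not 0

In contrast, the variation of information H(X|Y) + H(Y|X) is a genuine metric.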
5. Among the parametric families of distances, Bregman divergences and Csiszar f-divergences (also called Ali-Silvey distances) are of prime interest. The Jensen-Shannon divergence (the entropy of the average distribution minus the average of the distribution entropies) belongs to (a sketch follows the options):
a) Bregman divergences
b) Csiszar f-divergences
c) Not a Bregman/Csiszar divergence; it is a Jensen difference for the Shannon entropy (a Burbea-Rao divergence).
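
For concreteness, a minimal Python sketch of this Jensen difference (the two distributions are arbitrary illustrative values):

    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def jsd(p, q):
        # JSD(p, q) = H((p+q)/2) - (H(p) + H(q))/2, a Jensen gap of the Shannon entropy
        m = 0.5 * (p + q)
        return entropy(m) - 0.5 * (entropy(p) + entropy(q))

    p = np.array([0.5, 0.3, 0.2])   # illustrative distributions
    q = np.array([0.1, 0.6, 0.3])
    print(jsd(p, q))                 # symmetric, nonnegative, bounded by log 2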
6. C.R. Rao pioneered the geometrization of statistics in 1945 by taking the Fisher information matrix as the Riemannian metric of a statistical manifold. In the same paper, Rao proved what is now called the Cramer-Rao lower bound. This seminal lower bound was later improved using a sequence of matrices that bear the name (a numerical aside on the bound follows the options):
a) Amari matrices
b) Basu matrices
c) Mahalanobis matrices
d) Bhattacharyya matrices
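
As an aside on the bound itself, here is a minimal Python sketch checking the Cramer-Rao lower bound numerically for a Bernoulli parameter (the model, parameter value, and sample size are illustrative choices, not from the quiz):

    import numpy as np

    rng = np.random.default_rng(0)
    theta, n = 0.3, 100              # illustrative Bernoulli parameter and sample size

    # Fisher information of one Bernoulli(theta) draw: I(theta) = 1 / (theta (1 - theta))
    fisher = 1.0 / (theta * (1.0 - theta))
    crlb = 1.0 / (n * fisher)        # Cramer-Rao lower bound for unbiased estimators

    # Empirical variance of the sample mean, an unbiased and efficient estimator here
    estimates = rng.binomial(n, theta, size=100_000) / n
    print(crlb, estimates.var())     # the two values agree: the bound is attained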

7. Means like the arithmetic, geometric, and harmonic means can be generalized by axiomatizing a few expected properties. This was done independently by Kolmogorov and Nagumo: choose a strictly increasing continuous function f and define the mean of a and b as f^{-1}((f(a) + f(b))/2). What is the other name of these f-means (illustrated after the options)?
a) information means
b) quasi-arithmetic means
c) Pythagorean means
d) entropic means
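
A minimal Python sketch of this construction, recovering the three classical means from different choices of f (the inputs a and b are arbitrary illustrative values):

    import numpy as np

    def f_mean(a, b, f, finv):
        # Kolmogorov-Nagumo mean: f^{-1}( (f(a) + f(b)) / 2 )
        return finv(0.5 * (f(a) + f(b)))

    a, b = 4.0, 9.0                                        # illustrative inputs
    print(f_mean(a, b, lambda x: x, lambda y: y))          # arithmetic mean: 6.5
    print(f_mean(a, b, np.log, np.exp))                    # geometric mean:  6.0
    print(f_mean(a, b, lambda x: 1/x, lambda y: 1/y))      # harmonic mean:  ~5.54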

8. Riemannian statistical manifolds were first investigated in the 1940s by C.R. Rao and Jeffreys. The seminal paper of C.R. Rao was published in the Bulletin of the Calcutta Mathematical Society. When?
a) In 1944
b) In 1945
c) In 1946 (the same year as the Jeffreys prior)
d) In 1948 (the same year as Shannon's paper)

9. Statistical manifolds can be studied by equipping them with any arbitrary Riemannian metric. Who proved, using category-theoretic arguments, that the Fisher information matrix is, up to a constant factor, the only Riemannian metric invariant under Markov morphisms, and when?
a) Amari in 1982
b) Chentsov in 1972
c) Hellinger in 1908
d) Chernoff in 1962

10. In the popular class of f-divergences, what is the only metric f-divergence? (A hint follows the options.)
a) Euclidean distance
b) chi-squared distance
c) total variation distance
d) Jensen-Shannon distance
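
Hint: one of these candidates inherits the triangle inequality from the L1 norm. A minimal Python check on three arbitrary illustrative distributions:

    import numpy as np

    def total_variation(p, q):
        # TV(p, q) = (1/2) sum_i |p_i - q_i|: the f-divergence with f(t) = |t - 1| / 2
        return 0.5 * np.sum(np.abs(p - q))

    p = np.array([0.5, 0.3, 0.2])   # illustrative distributions
    q = np.array([0.1, 0.6, 0.3])
    r = np.array([0.3, 0.3, 0.4])
    # The triangle inequality holds, inherited from the L1 norm:
    print(total_variation(p, q) <= total_variation(p, r) + total_variation(r, q))  # True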


© April 2010 Frank Nielsen, All rights reserved.