Quiz 1: Spelling information-theoretic distances correctly

It is very common to misspell distance names in information theory. Let us check!
(For the sake of simplicity, I removed all accents :-))


1. How do you name the distance that provides both lower and upper bounds on the Bayes classification error?
a) Bhatacharya
b) Bhattacharya
c) Bhattacharrya
d) Bhattacharyya
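As a side note, the distance in question is computed from a simple overlap coefficient between two distributions. Here is a minimal sketch for discrete distributions with matching support (the function name is mine, for illustration only):

```python
import math

def bhattacharyya_distance(p, q):
    """Distance -ln(BC), where BC = sum_i sqrt(p_i * q_i) is the overlap coefficient."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -math.log(bc)

# Identical distributions have coefficient 1, hence distance 0.
d = bhattacharyya_distance([0.5, 0.5], [0.9, 0.1])
```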

2. How do you name the relative entropy distance?
a) Kullback-Liebler divergence
b) Kullbak-Leibler divergence
c) Kullback-Leibler divergence
d) Kullback-Leibleir divergence

3. The Hellinger metric distance is related to the work of Chernoff/Kakutani and ...
a) Matsushita
b) Matusita
c) Matsushiita
d) Matsushitta

4. What is the other name of the Earth mover's distance?
a) Monge-Kanterovich-Wasserstein
b) Monge-Kantorovich-Wasserstein
c) Monge-Kantorovitch-Wasserstein
d) Monge-Kanteroveach-Wasserstein

5. Tsallis entropy is also known as ...
a) Havra-Charvit
b) Havrdat-Charva
c) Havrda-Charvat
d) Havdra-Charvit

6. The symmetrized relative entropy is also known as the J-divergence. J stands for ...
a) Jeffreys
b) Jeyffreys
c) Jeyffrey
d) Jeffrey
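The symmetrization mentioned in question 6 is simply the sum of the relative entropy taken in both directions. A minimal sketch for discrete distributions (function names are mine, for illustration):

```python
import math

def kl(p, q):
    """Relative entropy KL(p || q) in nats, skipping zero-mass terms of p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Symmetrized relative entropy: J(p, q) = KL(p || q) + KL(q || p)."""
    return kl(p, q) + kl(q, p)

# Unlike KL itself, J is symmetric in its two arguments.
d = j_divergence([0.5, 0.5], [0.9, 0.1])
```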


© March 2010 Frank Nielsen, All rights reserved.