10h-12h : Gabriel Peyré (CNRS, CEREMADE, Université Paris-Dauphine), Wasserstein Methods in Imaging. (abstract) 12h-14h : Lunch break. 14h-16h : Guillaume Carlier (Professor, CEREMADE, Université Paris-Dauphine), Barycenters in the Wasserstein Space. (abstract)
10h-12h : Michel Broniatowski (UPMC), Weighted sampling, maximum likelihood and divergences. (abstract) 12h-14h : Lunch break. 14h-16h : Michaël Aupetit (CEA), A generative model for automatic learning of the topology of a point cloud. (abstract)
10h-12h : Professor Alfred Hero, Extracting correlations from random matrices: phase transitions and Poisson limits. (abstract) 12h-14h : Lunch break. 14h-16h : Professor Christophe Vignat, Geometric characterizations of maximum generalized-entropy distributions. (abstract)
10h-12h : Professor Asuka Takatsu, Wasserstein geometry of the space of Gaussian measures. (abstract; paper: Wasserstein geometry of Gaussian measures) 12h-14h : Lunch break. 14h-16h : Professor Wilfrid Kendall, Riemannian barycentres: from harmonic maps and statistical shape to the classical central limit theorem. (abstract)
10h-12h : Xavier Pennec (part I), Current Issues in Statistical Analysis on Manifolds for Computational Anatomy. 12h-14h : Lunch break. 14h-16h : Xavier Pennec (part II), Current Issues in Statistical Analysis on Manifolds for Computational Anatomy. 16h-16h30 : Coffee break. 16h30-17h30 : Frank Nielsen, The Burbea-Rao and Bhattacharyya centroids. (video)
We begin by recalling a source coding theorem by Campbell, which relates a generalized measure of codeword length to the Rényi-Tsallis entropy. We show that the associated optimal codes can easily be obtained from considerations on the so-called escort distributions, and that this provides an easy implementation procedure. We also show that these generalized lengths are bounded below by the Rényi entropy.
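For reference, Campbell's theorem can be stated as follows (a standard formulation in our own notation, not taken from the talk): for codeword lengths \ell_i on a D-ary alphabet satisfying Kraft's inequality, the exponentiated mean length
\[
L_\beta = \frac{1}{\beta}\,\log_D\!\Big(\sum_i p_i\, D^{\beta \ell_i}\Big), \qquad \beta > 0,
\]
satisfies
\[
L_\beta \;\ge\; H_\alpha(p) = \frac{1}{1-\alpha}\,\log_D\!\Big(\sum_i p_i^{\alpha}\Big), \qquad \alpha = \frac{1}{1+\beta},
\]
with the bound approached by choosing \ell_i \approx -\log_D P_\alpha(i), where P_\alpha(i) = p_i^{\alpha}/\sum_j p_j^{\alpha} is the escort distribution of order \alpha.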
We then discuss the maximum entropy problems associated with the Rényi Q-entropy, subject to two kinds of constraints on expected values: a constraint on the standard expectation, and a constraint on the generalized expectation encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit power-law behaviour, are derived and characterized. The Rényi entropy of the optimum distribution can be viewed as a function of the constraint; this defines two families of entropy functionals in the space of possible expected values. General properties of these functionals, including nonnegativity, the existence of a minimum, and convexity, are documented. Their relationships, as well as numerical aspects, are also discussed. Finally, we examine some specific cases of the reference measure Q(x) and recover some well-known entropies in a limit case.
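As a concrete illustration (our notation; the talk works with a general reference measure Q(x), which we omit here): the Rényi entropy of order q,
\[
H_q(p) = \frac{1}{1-q}\,\log \int p(x)^q \, dx,
\]
maximized subject to a variance-type constraint, is attained by q-Gaussian densities of the form
\[
p(x) \;\propto\; \big(1 - (1-q)\,\beta x^2\big)_{+}^{\frac{1}{1-q}},
\]
which recover the Gaussian as q tends to 1 and exhibit power-law tails for q > 1. Taking the constraint on the standard rather than on the generalized (escort) expectation changes the effective entropic index but yields the same power-law family.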
Finally, we describe a simple probabilistic model of a transition between two states, which leads to a curve, parametrized by an entropic index q, in the form of a generalized escort distribution. In this setting, we show that the Rényi-Tsallis entropy emerges naturally as a characterization of the transition. Along this escort path, the Fisher information, computed with respect to the escort distribution, becomes an "escort-Fisher information". We show that this Fisher information involves a generalized score function and appears in the entropy differential metric associated with the Rényi entropy. We also show that the length of the escort path is related to Jeffreys' divergence. When q varies, we show that the paths with minimum Fisher information (respectively escort-Fisher information) and a given q-variance (respectively standard variance) are the paths described by q-Gaussian distributions. From these results, we obtain a generalized Cramér-Rao inequality. If time permits, we will show that these results can be extended to higher-order q-moments.
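For context (our reading of the abstract, not necessarily the speaker's exact construction): the escort distribution of order q attached to a density p is
\[
P_q(x) = \frac{p(x)^q}{\int p(u)^q\, du},
\]
and a transition curve between two states p_0 and p_1 can be realized as the generalized escort (geometric) path
\[
\pi_q(x) = \frac{p_0(x)^{1-q}\, p_1(x)^{q}}{\int p_0(u)^{1-q}\, p_1(u)^{q}\, du}, \qquad q \in [0,1],
\]
which interpolates between p_0 and p_1. Integrating the Fisher information I(q) of this one-parameter family then yields Jeffreys' divergence:
\[
\int_0^1 I(q)\, dq = J(p_0, p_1) = D(p_0 \,\|\, p_1) + D(p_1 \,\|\, p_0).
\]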
14h-16h : Geometry of Independent Component Analysis. Independent Component Analysis (ICA) consists in determining the linear transformation of a set of N signals (or N images, or any N-variate data) that yields components that are "as independent as possible". A natural measure of independence is mutual information (extended to N variables). The study of this problem, which has numerous applications, brings out, in addition to mutual information, other quantities defined in the language of information theory: entropy and non-Gaussianity in the case of i.i.d. models, spectral diversity in the case of stationary Gaussian models, and so on.
Following a suggestion by Amari, I have brought to light the geometric structure of ICA. My talk will take as its starting point the study of the likelihood of ICA statistical models, which will lead us very directly to an elucidation of the ICA criteria and of their interrelations in terms of information geometry.
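To fix ideas (classical ICA material in our notation, summarizing the mutual-information criterion mentioned above rather than the speaker's derivation): for Y = BX with B invertible, mutual information decomposes as
\[
I(Y) = \sum_{i=1}^{N} H(Y_i) - H(Y), \qquad H(Y) = H(X) + \log\lvert \det B \rvert,
\]
so minimizing I(Y) over B amounts to minimizing \sum_i H(Y_i) - \log\lvert \det B \rvert, the term H(X) being constant in B; the marginal entropies H(Y_i) are what connect the criterion to non-Gaussianity.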
Léon Nicolas Brillouin (1889-1969) published in 1958 the French translation of his English-language book "Science and Information Theory" (1956):
"La science et la théorie de l'information" (traduction par M. Parodi)
(Google Books)