
Computational Information Geometry

Computational information geometry is an emerging research topic born of two mainstream fields: computational geometry and information geometry. Computational geometry has mainly focused on building efficient data structures and algorithms for geometric objects in Euclidean space equipped with the Euclidean distance. Recently, seminal structures such as the Voronoi diagram have been generalized to various other distances (e.g., Lp norms, convex distance functions, etc.). Nowadays, computational geometry is focusing on high dimensions, fast approximation techniques for optimization (core-sets, metric embeddings), and certified computing (i.e., moving from computational geometry to geometric computing by solving the robustness issue).
 
Information geometry relies on differential geometry to study families of distributions through their underlying manifold structure [AN2000]. For example, consider the family of univariate Normal distributions. Each 1D Normal distribution is defined by two parameters (mu, sigma>0), so the space of 1D Normal distributions is described by the upper half-plane. Clearly, the Euclidean distance between the parameters of two Normal distributions no longer makes sense, since we expect two distributions to become more similar as their standard deviations increase. We rather use entropic functions such as the relative entropy (also called the Kullback-Leibler divergence, or I-divergence), which is not symmetric. In fact, the Kullback-Leibler divergence is not a metric, since neither the triangle inequality nor symmetry is satisfied. Information geometry provides a clean framework for studying these statistical distributions, and its methodologies have further been extended to linear systems, time series, and quantum systems.
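
As a small numerical illustration (a sketch added here, not taken from the cited references), the Kullback-Leibler divergence between two univariate Normal distributions has a well-known closed form; the snippet below shows both its asymmetry and the fact that, for a fixed gap between the means, the divergence shrinks as the standard deviations grow, unlike the Euclidean distance between the (mu, sigma) parameter pairs. The function name kl_normal is just a placeholder for this sketch.

    import math

    def kl_normal(mu1, sigma1, mu2, sigma2):
        # Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ).
        return (math.log(sigma2 / sigma1)
                + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
                - 0.5)

    # Same Euclidean gap |mu1 - mu2| = 1 in parameter space, very different divergences:
    print(kl_normal(0.0, 0.1, 1.0, 0.1))    # 50.0   (narrow Normals barely overlap)
    print(kl_normal(0.0, 10.0, 1.0, 10.0))  # 0.005  (wide Normals nearly coincide)

    # And the divergence is not symmetric:
    print(kl_normal(0.0, 1.0, 0.0, 2.0))    # ~0.318
    print(kl_normal(0.0, 2.0, 0.0, 1.0))    # ~0.807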
 
Computational information geometry considers discrete algorithms on these information spaces equipped with divergences. For example, we may want to compute the 1-center of a set of distributions, a typical statistical inference problem (see the sketch below). Or we may want to triangulate the information space using the dual of the information Voronoi diagram in order to perform interpolation tasks. We may also want to cluster Normal distributions in order to simplify a Gaussian mixture model using abstract k-means or EM algorithms [B+2005].
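
To make the flavor of such discrete algorithms concrete, here is a minimal sketch of a farthest-point (Badoiu-Clarkson-style) heuristic for approximating the 1-center of a set of univariate Normals. As a simplifying assumption, the candidate center is moved in the (mu, sigma) parameter space, and the Kullback-Leibler divergence is only used to select the farthest distribution; the dedicated algorithms in the literature (e.g., for smallest enclosing Bregman balls) refine this update, so treat this as an illustration rather than a faithful implementation.

    import numpy as np

    def kl_normal(p, q):
        # Closed-form KL( N(p) || N(q) ) for p = (mu1, sigma1), q = (mu2, sigma2).
        (m1, s1), (m2, s2) = p, q
        return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

    def approx_one_center(normals, iterations=200):
        # Farthest-point heuristic: start at one distribution and repeatedly move
        # a 1/(t+1) fraction of the way (in parameter space) toward the distribution
        # currently farthest, in KL divergence, from the candidate center.
        center = np.array(normals[0], dtype=float)
        for t in range(1, iterations + 1):
            farthest = max(normals, key=lambda n: kl_normal(n, tuple(center)))
            center += (np.asarray(farthest) - center) / (t + 1)
        return tuple(center)

    # Toy input: a handful of univariate Normals given as (mu, sigma) pairs.
    print(approx_one_center([(0.0, 1.0), (3.0, 0.5), (1.0, 2.0), (-2.0, 1.5)]))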
 
Recently, the concept of Bregman divergences has emerged from several papers as a key tool for generalizing traditional metric-based algorithms. The beauty of Bregman divergences is that they allow Euclidean-like distance measures and entropic-like statistical measures to be abstracted into a single framework, providing a generic notion that discrete algorithms can use. (For example, in CGAL, one may define a Bregman traits class to unify the computation of ordinary Voronoi diagrams with statistical ones.) In statistics, many well-known distributions, such as the multivariate Normal, multinomial, Bernoulli, or Poisson distributions, can be handled within the framework of exponential families. Exponential families offer a simple canonical expression (in disguise) of the probability density functions of these distributions. Moreover, and interestingly, exponential families are in one-to-one correspondence with Bregman divergences [B+2005].
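
To see how a single formula captures both worlds, recall that a Bregman divergence is defined from a strictly convex generator F by B_F(x, y) = F(x) - F(y) - <x - y, grad F(y)>. The short sketch below (an illustration of this standard definition, not code from the cited papers) instantiates it with two generators: the squared norm, which recovers the squared Euclidean distance, and the negative Shannon entropy, which recovers the (generalized) Kullback-Leibler divergence.

    import numpy as np

    def bregman(F, gradF, x, y):
        # B_F(x, y) = F(x) - F(y) - <x - y, grad F(y)>.
        x, y = np.asarray(x, float), np.asarray(y, float)
        return F(x) - F(y) - np.dot(x - y, gradF(y))

    # Generator F(x) = ||x||^2 recovers the squared Euclidean distance.
    sq_norm, sq_norm_grad = lambda x: np.dot(x, x), lambda x: 2 * x

    # Generator F(p) = sum_i p_i log p_i (negative Shannon entropy) recovers the
    # generalized Kullback-Leibler divergence between nonnegative vectors.
    neg_ent, neg_ent_grad = lambda p: np.sum(p * np.log(p)), lambda p: np.log(p) + 1

    x, y = np.array([0.2, 0.8]), np.array([0.5, 0.5])
    print(bregman(sq_norm, sq_norm_grad, x, y))   # 0.18  (squared Euclidean distance)
    print(bregman(neg_ent, neg_ent_grad, x, y))   # ~0.19 (KL(x || y))

A pleasant consequence exploited in [B+2005] is that the centroid minimizing the sum of divergences B_F(x_i, c) over c is always the arithmetic mean of the x_i, so Lloyd's k-means iteration carries over verbatim to any Bregman divergence.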
 
Computational information geometry offers many prospects for handling noise in a principled way by treating inputs as distributions. We expect further work on surface reconstruction, point matching, and related problems that incorporate some level of variability in their input. After all, most input data are observed data, corrupted by noise, that can be modelled by a statistical process (e.g., 'Gaussian' noise).
 
Thus, computational information geometry pushes computational geometry one step further. Yet, it is not the ultimate "computational XXX geometry". Indeed, we advocate the notion of computational abstract geometry, which incorporates non-Euclidean geometries such as hyperbolic, spherical/elliptical, and Riemannian geometries, among others. There are plenty of opportunities to generalize and axiomatize geometric algorithms (as D. Knuth did for the convex hull [K1992]). In computational abstract geometry, we would likely axiomatize notions of bisectors, edges linking sites, and fundamental relationships (e.g., orthogonality).
 
Below, you will find a few seminal works in that direction:

Discrete Algorithms

References