The Kullback–Leibler Divergence Between Lattice Gaussian Distributions
Author
Nielsen, Frank
Abstract
A lattice Gaussian distribution with given mean and covariance matrix is the discrete distribution supported on a lattice that maximizes Shannon’s entropy under these mean and covariance constraints. Lattice Gaussian distributions find applications in cryptography and in machine learning. The set of Gaussian distributions on a given lattice can be handled as a discrete exponential family whose partition function is related to the Riemann theta function. In this paper, we first report a formula for the Kullback–Leibler divergence between two lattice Gaussian distributions and then show how to efficiently approximate it numerically, either via Rényi’s α-divergences or via the projective γ-divergences. We illustrate how to use the Kullback–Leibler divergence to calculate the Chernoff information on the dually flat structure of the manifold of lattice Gaussian distributions.
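To make the objects in the abstract concrete, here is a minimal numerical sketch (not the paper's method): it restricts the integer lattice Z^2 to a finite box of half-width `radius` (an assumption; the paper's exact normalizer is a Riemann theta function, an infinite lattice sum), normalizes truncated Gaussian weights, and evaluates the Kullback–Leibler divergence by direct summation. The function names `lattice_gaussian` and `kl_lattice` and the truncation scheme are illustrative choices, not from the paper.

```python
import itertools
import numpy as np

def lattice_gaussian(mu, cov, radius=10):
    """Normalized Gaussian weights on the integer lattice Z^2,
    truncated to the box [-radius, radius]^2 (an approximation:
    the exact normalizer is a theta-function-type lattice sum)."""
    prec = np.linalg.inv(cov)
    pts = np.array(list(itertools.product(range(-radius, radius + 1), repeat=2)))
    d = pts - mu
    # Quadratic form (x - mu)^T prec (x - mu) for every lattice point at once.
    w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, prec, d))
    return pts, w / w.sum()  # w.sum() plays the role of the partition function

def kl_lattice(mu1, cov1, mu2, cov2, radius=10):
    """KL(p || q) by direct summation over the truncated lattice support."""
    _, p = lattice_gaussian(mu1, cov1, radius)
    _, q = lattice_gaussian(mu2, cov2, radius)
    return float(np.sum(p * np.log(p / q)))
```

For well-concentrated distributions the truncation error is negligible, and the standard sanity checks hold: the divergence vanishes for identical parameters and is strictly positive otherwise.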