ARTICLE

Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures

IEEE Signal Process. Lett., vol. 23, pp. 1543-1546, 2016

Authors

Nielsen, Frank and Sun, Ke

Abstract

The Kullback-Leibler (KL) divergence between two mixture models is a fundamental primitive in many signal processing tasks. Since the KL divergence between mixtures does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration or approximated. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the KL divergence of univariate mixtures. We illustrate this versatile method by reporting our experiments on approximating the KL divergence between Gaussian mixture models.
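As context for the abstract, the sketch below illustrates the costly Monte Carlo baseline that the paper's closed-form bounds are designed to avoid: estimating KL(p || q) between two univariate Gaussian mixtures by sampling from p and averaging log-density ratios. It is a minimal illustration with hypothetical mixture parameters, not the paper's method or its experimental setup.

```python
# Monte Carlo estimate of KL(p || q) for univariate Gaussian mixtures.
# Hypothetical example parameters; not the mixtures used in the paper.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def log_gmm_pdf(x, weights, means, stds):
    """Log-density of a univariate Gaussian mixture at points x."""
    x = np.asarray(x)[:, None]                      # shape (n, 1)
    comp = (-0.5 * ((x - means) / stds) ** 2
            - np.log(stds) - 0.5 * np.log(2 * np.pi))
    return logsumexp(comp + np.log(weights), axis=1)

def sample_gmm(n, weights, means, stds):
    """Draw n samples from a univariate Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

# Two hypothetical univariate mixtures p and q.
wp, mp, sp = np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
wq, mq, sq = np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 0.8])

# KL(p || q) ~= (1/n) * sum_i [log p(x_i) - log q(x_i)],  with x_i drawn from p.
n = 100_000
xs = sample_gmm(n, wp, mp, sp)
kl_mc = np.mean(log_gmm_pdf(xs, wp, mp, sp) - log_gmm_pdf(xs, wq, mq, sq))
print(f"Monte Carlo estimate of KL(p || q): {kl_mc:.4f}")
```

The variance of this estimator shrinks only as 1/n, so tight estimates require many samples per divergence evaluation, which is the cost the paper's deterministic lower and upper bounds sidestep.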
