Frank Nielsen and Ke Sun
Tokyo Research
The Kullback-Leibler (KL) divergence between two mixture models is a fundamental primitive in many signal processing tasks. Since the KL divergence of mixtures does not admit a closed-form formula, it is in practice either estimated using costly Monte-Carlo stochastic integration or approximated. We present a fast, generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the KL divergence of univariate mixtures. We illustrate the versatility of the method by reporting experiments on approximating the KL divergence between Gaussian mixture models.
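For concreteness, the following is a minimal sketch (not the paper's bounding method) of the costly Monte-Carlo baseline mentioned above: estimating KL(p || q) = E_p[log p(X) - log q(X)] between two univariate Gaussian mixture models by stochastic integration. The mixture parameters are hypothetical and chosen only for illustration.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    def gmm_logpdf(x, weights, means, stds):
        """Log-density of a univariate Gaussian mixture at points x."""
        # Component log-densities weighted by the mixing proportions,
        # combined with a log-sum-exp for numerical stability.
        comp = np.log(weights)[:, None] + norm.logpdf(x[None, :], means[:, None], stds[:, None])
        m = comp.max(axis=0)
        return m + np.log(np.exp(comp - m).sum(axis=0))

    def gmm_sample(n, weights, means, stds):
        """Draw n samples from a univariate Gaussian mixture."""
        idx = rng.choice(len(weights), size=n, p=weights)
        return rng.normal(means[idx], stds[idx])

    def kl_monte_carlo(n, p, q):
        """Monte-Carlo estimate of KL(p || q) using n samples drawn from p."""
        x = gmm_sample(n, *p)
        return np.mean(gmm_logpdf(x, *p) - gmm_logpdf(x, *q))

    # Two hypothetical 2-component mixtures (weights, means, standard deviations).
    p = (np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
    q = (np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 0.5]))
    print(kl_monte_carlo(100_000, p, q))  # KL(p || q) in nats

The estimate converges only as O(1/sqrt(n)) in the number of samples, which is the cost that closed-form lower and upper bounds are meant to avoid.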