Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–Leibler divergence, broadly used in the information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed form, various techniques with advantages and disadvantages have been...
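The paper's mixture-conversion technique is not reproduced in this record. As context for the abstract, the quantity being approximated, J(p, q) = KL(p||q) + KL(q||p), can be estimated for univariate Gaussian mixtures with a plain Monte Carlo baseline. The sketch below is illustrative only (not the paper's method); the mixture parameterization as (weights, means, stds) tuples is an assumption for this example.

```python
import math
import random

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture at point x."""
    return sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for w, m, s in zip(weights, means, stds)
    )

def sample_gmm(weights, means, stds, rng):
    """Draw one sample: pick a component by weight, then sample it."""
    i = rng.choices(range(len(weights)), weights=weights)[0]
    return rng.gauss(means[i], stds[i])

def mc_kl(p, q, n, rng):
    """Monte Carlo estimate of KL(p||q) from n samples drawn from p."""
    total = 0.0
    for _ in range(n):
        x = sample_gmm(*p, rng)
        total += math.log(gmm_pdf(x, *p) / gmm_pdf(x, *q))
    return total / n

def mc_jeffreys(p, q, n=20000, seed=0):
    """Jeffreys divergence J(p,q) = KL(p||q) + KL(q||p), estimated by Monte Carlo."""
    rng = random.Random(seed)
    return mc_kl(p, q, n, rng) + mc_kl(q, p, n, rng)

# Two toy mixtures, each given as (weights, means, stds).
p = ([0.5, 0.5], [-1.0, 2.0], [0.5, 1.0])
q = ([0.3, 0.7], [0.0, 1.5], [0.8, 0.6])
print(mc_jeffreys(p, q))
```

Monte Carlo estimation is the standard slow-but-consistent baseline that fast closed-form approximations such as the one in this article are compared against: its cost grows with the sample count n, whereas the estimate's variance shrinks only as 1/n.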
Saved in:
Main author: Frank Nielsen
Format: article
Language: EN
Published: MDPI AG, 2021
Online access: https://doaj.org/article/f8ed504ac684481fb7f860145ec5bc48
Similar items

- Convolutional Autoencoding and Gaussian Mixture Clustering for Unsupervised Beat-to-Beat Heart Rate Estimation of Electrocardiograms from Wearable Sensors
  by: Jun Zhong, et al.
  Published: (2021)
- One Generalized Mixture Pareto Distribution and Estimation of the Parameters by the EM Algorithm for Complete and Right-Censored Data
  by: Mohamed Kayid
  Published: (2021)
- Nonparametric Multivariate Density Estimation: Case Study of Cauchy Mixture Model
  by: Tomas Ruzgas, et al.
  Published: (2021)
- How to fit models of recognition memory data using maximum likelihood.
  by: John C. Dunn
  Published: (2010)
- Principal components analysis for mixtures with varying concentrations
  by: Olena Sugakova, et al.
  Published: (2021)