Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application

Among statistical models, Gaussian Mixture Models (GMMs) have been used in numerous applications to model data that a mixture of Gaussian curves fits. Several methods have been introduced to estimate the optimal parameters of a GMM fitted to the data. The accuracy of such estimation methods is crucial for interpreting the data. In this paper, we propose a new approach that estimates the parameters of a GMM using critical points of the Tsallis entropy to adjust each parameter's accuracy. To evaluate the proposed method, seven GMMs of simulated random (noisy) samples generated in MATLAB were used; each simulated model was repeated 1000 times to generate 1000 random values obeying the GMM. In addition, five GMM-shaped samples extracted from magnetic resonance brain images were used, aiming at image segmentation. For comparison, Expectation-Maximization, K-means, and Shannon's estimator were employed on the same datasets. The four estimation methods were evaluated using accuracy, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the Mean Squared Error (MSE). The mean accuracies of the Tsallis estimator for the simulated data, i.e., for the mean values, variances, and proportions, were 99.9(±0.1)%, 99.8(±0.2)%, and 99.7(±0.3)%, respectively. For both datasets, the Tsallis estimator's accuracies were significantly higher than those of EM, K-means, and Shannon.
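The Expectation-Maximization baseline named in the abstract can be sketched for a one-dimensional GMM as follows. This is an illustrative sketch only, not the paper's Tsallis-based estimator; the function name, the quantile-based initialization, and the iteration count are assumptions made for the example:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=200):
    """Minimal EM for a 1-D Gaussian mixture.

    Returns component means, variances, and mixing weights.
    Illustrative sketch of the EM baseline, not the paper's
    Tsallis-entropy estimator.
    """
    x = np.asarray(x, dtype=float)
    # Spread the initial means across the data range via quantiles
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        d = (x[:, None] - mu[None, :]) ** 2
        logp = -0.5 * (d / var + np.log(2 * np.pi * var)) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
        w = n / len(x)
    return mu, var, w

# Two well-separated synthetic components, mimicking a bimodal histogram
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
mu, var, w = em_gmm_1d(x, k=2)
```

On data this well separated, the recovered means land near 0 and 5 and the weights near 0.5 each; the paper's point is that such estimates degrade on noisier, overlapping mixtures, which is where the accuracy comparison is made.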

Saved in:
Bibliographic Details
Main Authors: Mehran Azimbagirad, Luiz Otavio Murta Junior
Format: article
Language: EN
Published: Elsevier 2021
Subjects:
Online Access: https://doaj.org/article/30216875707c468684c84f1087a1c533
id oai:doaj.org-article:30216875707c468684c84f1087a1c533
record_format dspace
spelling oai:doaj.org-article:30216875707c468684c84f1087a1c533
2021-12-03T04:01:37Z
Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
ISSN: 2772-5286
DOI: 10.1016/j.neuri.2021.100002
https://doaj.org/article/30216875707c468684c84f1087a1c533
2021-09-01T00:00:00Z
http://www.sciencedirect.com/science/article/pii/S2772528621000029
https://doaj.org/toc/2772-5286
Mehran Azimbagirad; Luiz Otavio Murta Junior
Elsevier
article
Tsallis entropy; Shannon entropy; Expectation-Maximization; K-means; Gaussian Mixture Model; Neurosciences. Biological psychiatry. Neuropsychiatry; RC321-571
EN
Neuroscience Informatics, Vol 1, Iss 1, Pp 100002- (2021)
institution DOAJ
collection DOAJ
language EN
topic Tsallis entropy
Shannon entropy
Expectation-Maximization
K-means
Gaussian Mixture Model
Neurosciences. Biological psychiatry. Neuropsychiatry
RC321-571
spellingShingle Tsallis entropy
Shannon entropy
Expectation-Maximization
K-means
Gaussian Mixture Model
Neurosciences. Biological psychiatry. Neuropsychiatry
RC321-571
Mehran Azimbagirad
Luiz Otavio Murta Junior
Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
description Among statistical models, Gaussian Mixture Models (GMMs) have been used in numerous applications to model data that a mixture of Gaussian curves fits. Several methods have been introduced to estimate the optimal parameters of a GMM fitted to the data. The accuracy of such estimation methods is crucial for interpreting the data. In this paper, we propose a new approach that estimates the parameters of a GMM using critical points of the Tsallis entropy to adjust each parameter's accuracy. To evaluate the proposed method, seven GMMs of simulated random (noisy) samples generated in MATLAB were used; each simulated model was repeated 1000 times to generate 1000 random values obeying the GMM. In addition, five GMM-shaped samples extracted from magnetic resonance brain images were used, aiming at image segmentation. For comparison, Expectation-Maximization, K-means, and Shannon's estimator were employed on the same datasets. The four estimation methods were evaluated using accuracy, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the Mean Squared Error (MSE). The mean accuracies of the Tsallis estimator for the simulated data, i.e., for the mean values, variances, and proportions, were 99.9(±0.1)%, 99.8(±0.2)%, and 99.7(±0.3)%, respectively. For both datasets, the Tsallis estimator's accuracies were significantly higher than those of EM, K-means, and Shannon. By increasing the accuracy of the estimated parameters, the Tsallis estimator can be used in statistical approaches and machine learning.
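The Tsallis entropy underlying the estimator is the standard one-parameter generalization of Shannon entropy, S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon entropy −Σᵢ pᵢ ln pᵢ in the limit q → 1. A minimal sketch of the quantity itself (the function name and the chosen q values are illustrative; the paper's critical-point machinery for parameter estimation is not reproduced here):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q of a discrete distribution p.

    S_q = (1 - sum_i p_i^q) / (q - 1); as q -> 1 this recovers
    the Shannon entropy -sum_i p_i ln p_i (natural log).
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability bins
    if abs(q - 1.0) < 1e-12:          # Shannon limit
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Uniform distribution over 4 outcomes
p = [0.25, 0.25, 0.25, 0.25]
print(tsallis_entropy(p, 2.0))   # (1 - 4/16) / 1 = 0.75
print(tsallis_entropy(p, 1.0))   # ≈ ln 4
```

The parameter q tunes how strongly the entropy weights rare versus common probabilities, which is what gives the Tsallis form extra flexibility over the Shannon estimator it is compared against.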
format article
author Mehran Azimbagirad
Luiz Otavio Murta Junior
author_facet Mehran Azimbagirad
Luiz Otavio Murta Junior
author_sort Mehran Azimbagirad
title Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
title_short Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
title_full Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
title_fullStr Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
title_full_unstemmed Tsallis generalized entropy for Gaussian mixture model parameter estimation on brain segmentation application
title_sort tsallis generalized entropy for gaussian mixture model parameter estimation on brain segmentation application
publisher Elsevier
publishDate 2021
url https://doaj.org/article/30216875707c468684c84f1087a1c533
work_keys_str_mv AT mehranazimbagirad tsallisgeneralizedentropyforgaussianmixturemodelparameterestimationonbrainsegmentationapplication
AT luizotaviomurtajunior tsallisgeneralizedentropyforgaussianmixturemodelparameterestimationonbrainsegmentationapplication
_version_ 1718373922846539776